Abstract

Audiovisual integration is integral to motor performance and communication. The present study used a two-choice reaction time (RT) task to investigate the integration of auditory and visual information and how perceived social communication relates to motor performance. Twenty-four adults (20–27 years old; 14 female, 10 male) sat in front of a touchscreen monitor and an EyeLink 1000 Plus eye tracker (500 Hz). On each trial, participants were shown two visual images (2 degrees of visual angle) together with an auditory cue that matched one image. Four response conditions were blocked and counterbalanced: key-press, key-release, dominant-hand reach-to-point, and bilateral-hand reach-to-point. Participants also completed the Broad Autism Phenotype Questionnaire as a measure of perceived social communication. Behavioral measures were analyzed with a four-condition repeated-measures ANOVA, and relationships with perceived social communication were evaluated using Cohen's d. Saccade RT did not vary with condition. Saccade movement time (MT) in the key-release condition was significantly shorter than in the key-press, dominant-hand, and bilateral-hand reach-to-point conditions, suggesting that the eye requires more information as task complexity increases. A significant main effect of condition was found for hand RT but not MT: RT in the dominant-hand reach-to-point condition was significantly shorter than in the key-release and bilateral reach-to-point conditions, consistent with knowledge of the responding limb improving movement planning. Aloofness and Behavioral Rigidity were not related to RT or MT of the eye or hand. The relationship between saccade MT and Pragmatic Language showed small effect sizes, suggesting that individuals with higher Pragmatic Language scores spent more time performing saccades to targets identified by an auditory stimulus.
Acknowledgments: Funding for this research was provided by the Natural Sciences and Engineering Research Council of Canada (NSERC) and Research Manitoba.