The impact of response complexity and cue modality when performing a choice eye-hand coordination task

Abstract

Actions aid our ability to communicate. Actions that support communication range in complexity, and the underlying motor plan may incorporate information from multiple sensory stimuli. The present study sought to understand how individuals plan and execute movements related to communication. Potential targets, each occupying 2 degrees of visual angle, were presented on a touchscreen monitor (CNE, Gainesville, FL). On each trial, an auditory cue (an animal sound) or a visual cue (an animal picture) was presented along with two pictures of animals located 16.5 degrees of visual angle to the left and right of a central fixation point. Participants (n = 24; mean age = 25.1 ± 4.42 years) sat with their forehead resting against an EyeLink 1000 Plus eye tracker in the tower mount configuration (SR Research Ltd., Ottawa, ON; 500 Hz sampling rate). Participants were asked to look at, press a key for, or point to the correct image on the touchscreen monitor as quickly and accurately as possible. Saccade and hand reaction times (RTs) were analyzed. In all conditions, the visual cue produced shorter RTs than the auditory cue, consistent with vision being the preferred modality for a target localization task. Saccade RT was longest and most variable in the eyes-only condition, suggesting that responding with the eyes alone was the more challenging and novel task. Hand RT was longer and more variable in the key-press condition, which is thought to indicate better integration of peripheral vision during pointing movements. Compared with previous studies, participants were better able to prepare a motor plan for a more complex eye-hand coordination task because the target was located in peripheral vision.
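As a worked note on the stimulus geometry (the viewing distance is not reported in this abstract, so the value of d below is hypothetical), the visual angle θ subtended by a stimulus of physical width s viewed at distance d follows the standard relation

θ = 2 arctan( s / (2d) ).

At a hypothetical viewing distance of d = 57 cm, where 1 cm on the screen subtends roughly 1 degree, a 2-degree target would span about 2 cm and the 16.5-degree eccentricity would place each picture about 17 cm from fixation.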

Acknowledgments: Funding for this research was provided by Research Manitoba, the Natural Sciences and Engineering Research Council of Canada (NSERC), and the Canada Foundation for Innovation.