Sensory context-dependent remapping of proprioceptive targets into a gaze-centred reference frame requires additional processing of visual information during movement planning
Abstract: Movements programmed to proprioceptive targets can be coded in both gaze-centred and intrinsic sensory reference frames (Pouget et al., 2002; Bernier et al., 2007). It is unclear, however, what factors contribute to the preferential use of one reference frame over another. The present study examined whether the sensory information used to identify the spatial location of proprioceptive targets affects how they are mapped. Ten participants performed reaching movements to proprioceptive and visual targets identified by either an auditory or a vibrotactile cue. Fixation positions were systematically altered to determine the extent to which movements were mapped in gaze-centred coordinates. Results revealed that auditory cues facilitated the use of a visual reference frame: gaze position biased reaching endpoint deviations for both types of auditory-cued targets (visual and proprioceptive) to a greater extent than for vibrotactile-cued proprioceptive targets. To determine whether the noted gaze-centred remapping of auditory-cued proprioceptive targets influenced the cortical processing of visual information, a second experiment was conducted. The cortical response to a visual flash was recorded by electroencephalography as 10 new participants planned reaches to both auditory-cued proprioceptive and auditory-cued visual targets. Results indicated that visually evoked potentials were greater for movements planned to proprioceptive targets than for those planned to visual targets. This suggests that the remapping of proprioceptive targets into a visual reference frame requires greater processing of visual information during movement planning. Overall, these experiments provide evidence for the importance of sensory context in the mapping of proprioceptive targets prior to goal-directed action.
Acknowledgments: NSERC, CAMPUS FRANCE
Psychomotor Learning Abstracts