No evidence supporting perceptual averaging for auditory pro- and antisaccades

  • Jennifer Campbell School of Kinesiology, Western University
  • Caitlin Gillen School of Kinesiology, Western University
  • Matthew Heath Graduate Program in Neuroscience, Western University


The visual antisaccade task requires the top-down, two-component process of inhibiting a stimulus-driven prosaccade and inverting a target's location to mirror-symmetrical space. Notably, recent work by our group (Gillen and Heath 2014: Vis Res; Heath et al. 2015: J Vis) has shown that visual vector inversion is perception-based and governed via a statistical summary representation (SSR); in particular, that work showed that antisaccade – but not prosaccade – amplitudes are biased in the direction of the most frequently presented visual target within a stimulus-set. The present investigation was designed to examine whether an SSR influences auditory-based pro- and antisaccades, and thus determine whether the SSR reflects a modality-independent or modality-dependent characteristic of antisaccades. To that end, participants completed pro- and antisaccades to acoustic targets (i.e., 50 ms burst, 70 dBA) in which eccentricity was defined by distinct frequency spectra (i.e., 10.5° target = pink noise; 15.5° target = white noise; 20.5° target = blue noise). Moreover, responses were completed in separate blocks wherein each target eccentricity was presented the same number of times (i.e., control condition), and in blocks wherein the 10.5° (i.e., proximal-weighting condition) or 20.5° (i.e., distal-weighting condition) target eccentricity was presented five times as often as the others. As expected, prosaccades yielded shorter reaction times and more accurate endpoints than antisaccades; however, the general difference between pro- and antisaccades was not modulated by the different weighting conditions. Thus, auditory antisaccades do not elicit an SSR comparable to their visual counterparts, and we propose that this result indicates that the SSR is a modality-dependent phenomenon of oculomotor control.

Acknowledgments: Supported by a Natural Sciences and Engineering Research Council of Canada Discovery Grant.