Perceptual-motor integration in a prediction motion task

  • Ran Zheng Faculty of Physical Education and Recreation, University of Alberta
  • Brian Maraj Faculty of Physical Education and Recreation, University of Alberta

Abstract

Many activities in our daily lives require us to interact with moving objects that may become occluded during movement, forcing us to make spatial and temporal estimations. Such estimations are components of Prediction Motion Tasks (PMTs). Previously (Marchak et al., 2013), using a custom-designed ball movement and occlusion setup on a computer screen, we demonstrated differences between mouse click and mouse move conditions. In the present study, we further examined performance in PMTs by collecting eye and hand movement data. Five participants (M = 26 years, SD = 5.6) predicted the arrival of a ball at a target on a computer touchscreen either by clicking the mouse (mouse click) or by using their index finger to track the ball from a start position to the target and touching the screen at the estimated arrival time following occlusion (hand tracking). The ball moved at three speeds, creating three different viewing and occlusion periods (0.5, 0.75, and 1 s). Hand movements were recorded by a 3D motion analysis system (Optotrak 3020) at 240 Hz, and eye movements were monitored by an eye tracker (ASL 6000) at 240 Hz. Reaction time, movement time, and spatial error data were analyzed using a 2 (movement condition) by 3 (ball speed) repeated-measures ANOVA. Results revealed that participants were more accurate when the ball speed was slower. In the hand tracking condition, reaction time for the eyes was faster than for the hands, resulting in faster movement times. Results will be discussed as they relate to cognitive and clocking strategies in prediction motion task performance.
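The 2 (movement condition) × 3 (ball speed) repeated-measures design described above can be sketched in Python using statsmodels. This is a minimal illustration on synthetic data, not the study's actual dataset: the subject count and factor levels follow the abstract, but the spatial-error values, variable names, and the assumption that error grows with ball speed are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical balanced data: 5 subjects x 2 movement conditions x 3 ball speeds.
# Spatial-error values are synthetic and purely illustrative.
rng = np.random.default_rng(1)
rows = []
for subj in range(1, 6):
    for cond in ("mouse_click", "hand_tracking"):
        for speed in (0.5, 0.75, 1.0):
            # Assumed for illustration: larger error at faster speeds
            # (shorter viewing period).
            error = rng.normal(loc=10 + 5 * speed, scale=2)
            rows.append({"subject": subj, "condition": cond,
                         "speed": speed, "spatial_error": error})
df = pd.DataFrame(rows)

# 2 (condition) x 3 (speed) repeated-measures ANOVA on spatial error.
res = AnovaRM(df, depvar="spatial_error", subject="subject",
              within=["condition", "speed"]).fit()
print(res.anova_table)
```

The same call would be repeated with reaction time and movement time as the dependent variable; the design requires one observation per subject per condition-by-speed cell (or an `aggregate_func` to average repeated trials).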