Abstract
Based on findings that augmented sensory inputs can enhance motor performance, this study investigated how brief auditory and vibrotactile stimuli influence the planning and control of one- and two-target movements. Specifically, augmented sensory cues and feedback were manipulated at movement initiation (as the go signal) and at acquisition of the first target (as feedback). Eleven young adults (20-32 years old) used a custom stylus to perform goal-directed reaching movements to one (OT) or two targets (TT) displayed on a touchscreen. Participants completed 20 trials per condition in a counterbalanced order. Each block consisted of OT or TT movements paired with one sensory condition (Auditory cue-Auditory feedback, AA; Auditory-Vibrotactile, AV; Vibrotactile-Auditory, VA; Vibrotactile-Vibrotactile, VV). Stylus position was recorded using optical motion capture at 400 Hz. Data were analyzed using a 2 (Task) × 2 (Cue Modality) × 2 (Feedback Modality) repeated-measures ANOVA. A significant main effect of Task, F(1,9) = 5.98, p < 0.03, ηp² = .40, revealed shorter reaction times (RTs) in the TT task than in the OT task. A main effect of Cue, F(1,9) = 6.33, p < 0.03, ηp² = .41, revealed that auditory cues led to shorter RTs than vibrotactile cues. A main effect of Feedback, F(1,9) = 6.14, p < 0.03, ηp² = .40, revealed that vibrotactile feedback led to shorter RTs than auditory feedback. No statistically significant differences were found for movement time. Thus, auditory stimuli decreased RTs when presented as cues, whereas vibrotactile stimuli reduced RTs when presented as feedback. These findings underscore the importance of optimizing cue-feedback pairings to support efficient performance, with applications in human-computer interaction, including assistive technologies and warning systems in semi-autonomous vehicles.
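For readers wanting to reproduce this style of analysis, the following is a minimal sketch (not the authors' analysis code) of a 2 × 2 × 2 repeated-measures ANOVA on mean RT, using statsmodels. The file name and column names ('subject', 'task', 'cue', 'feedback', 'rt') are hypothetical, and it assumes one mean RT value per participant per Task × Cue × Feedback cell in long format.

```python
# Sketch of a 2 (Task) x 2 (Cue) x 2 (Feedback) repeated-measures ANOVA on RT.
# Assumes long-format data with hypothetical columns:
#   subject, task (OT/TT), cue (auditory/vibrotactile),
#   feedback (auditory/vibrotactile), rt (mean RT per condition, ms).
import pandas as pd
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("rt_by_condition.csv")  # hypothetical file name

# AnovaRM requires exactly one observation per subject per within-cell,
# so trial-level data should first be averaged within each condition.
aov = AnovaRM(
    data=df,
    depvar="rt",
    subject="subject",
    within=["task", "cue", "feedback"],
).fit()

# Prints F values, numerator/denominator df, and p-values for the three
# main effects and their interactions.
print(aov.anova_table)
```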