This paper presents the results of a series of tests that examine the ability of subjects to adapt to non-individual Head Related Transfer Functions (HRTFs) in the context of auditory localization. Binaural rendering using HRTFs is currently limited in quality by inter-individual differences in the HRTF, which stem from morphological variations between subjects. In contrast to studies involving adaptation methods for generating individualized HRTFs, this study concerns the ability of subjects to adapt to non-individual HRTFs through a short interactive training experience. Two groups were evaluated in the study: one using non-individual HRTF spectral cues combined with individual Interaural Time Differences (ITDs), and a second, control group using individual HRTFs. Subjects were blindfolded, and the binaural rendering was performed using IRCAM's Spat software. Each subject completed two localization tests separated by a 10-minute training session. The training session was presented in the context of a search game in which subjects displaced a virtual source (linked to the position of the subject's hand via a tracking system) to locate various other virtual sound targets, all rendered binaurally. Results show that localization performance improved for the group using non-individual HRTFs, leading to errors comparable to those of the control group, while little improvement was found for the control group. This indicates that the improvement is not linked to the training session in general, but to a perceptual adaptation to non-individual HRTFs. Rapid adaptation to non-individual spectral cues has thus been shown to be possible using multi-modal interactions, in this case proprioceptive and auditory associations.
Contribution to the colloquium or congress: CFA-DAGA