Overview
Authors: Hyunsung Cho, Alexander Wang, Divya Kartik, Emily Liying Xie, Yukang Yan, David Lindlbauer
Publication Date: 1 January 2024
Link:
Keywords: Spatial audio, extended reality (XR), localization, ventriloquist effect, auditory perception, sound cues, Auptimize, angular discrimination error, front-back confusion, XR interfaces, context-aware computing
Type: Peer-Reviewed Journals/White Papers
Summary
Spatial audio in Extended Reality (XR) gives users better awareness of where virtual elements are placed and efficiently guides them to events such as notifications, system alerts from different windows, or approaching avatars. Humans, however, are inaccurate at localizing sound cues, especially with multiple sources, due to limitations of human auditory perception such as angular discrimination error and front-back confusion. This reduces the efficiency of XR interfaces because users misidentify which XR element a sound is coming from. To address this, the authors propose Auptimize, a novel computational approach for placing XR sound sources that mitigates such localization errors by exploiting the ventriloquist effect.
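The core idea can be illustrated with a minimal sketch. The ventriloquist effect means that when a sound is rendered within a small angular offset of a visual target, listeners perceive it as coming from the visual target. A placement algorithm can therefore render audio from a nearby, easier-to-localize position while the visual anchor "captures" the percept. The threshold value and the candidate-selection logic below are illustrative assumptions, not the paper's actual optimization.

```python
# Illustrative sketch of ventriloquist-effect-based sound placement.
# The 15-degree capture threshold is an assumed value, not from the paper.
VENTRILOQUISM_THRESHOLD_DEG = 15.0


def angular_distance(az1: float, az2: float) -> float:
    """Smallest absolute difference between two azimuth angles, in degrees."""
    diff = abs(az1 - az2) % 360.0
    return min(diff, 360.0 - diff)


def choose_render_azimuth(target_az: float, candidates: list[float]) -> float:
    """Pick a candidate render position close enough to the visual target
    that the ventriloquist effect still binds the sound to the visual;
    fall back to rendering at the target itself if none qualifies."""
    eligible = [
        az for az in candidates
        if angular_distance(az, target_az) <= VENTRILOQUISM_THRESHOLD_DEG
    ]
    if not eligible:
        return target_az
    # Prefer the eligible position closest to the visual target.
    return min(eligible, key=lambda az: angular_distance(az, target_az))


# Example: a visual notification sits at 100 degrees azimuth; candidate
# render positions exist at 30, 90, and 170 degrees. Only 90 degrees is
# within the assumed capture threshold.
print(choose_render_azimuth(100.0, [30.0, 90.0, 170.0]))  # → 90.0
```

This captures only the perceptual constraint; Auptimize itself additionally reasons about ambiguity between multiple concurrent sources when choosing shared render positions.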