Auditorily-Induced Vection in VR

We often rely on audition when evaluating object motion, such as the velocity of an approaching car. Can acoustic cues also provide us with information about our own motion, real or illusory? While visually-induced illusory self-motion (termed vection) has been studied extensively for more than a century, auditory vection cues have received comparatively little attention. This project aims to re-create, based on the existing literature, sound examples and experimental setups for auditorily-induced vection, i.e., the perception of self-motion driven by sound. The project benefits from an understanding of auditory perception, VR, and sound design for immersive reproduction formats such as binaural and multichannel setups.
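As an illustration of the kind of stimulus such a setup might use, the sketch below renders a tone that circles the listener, a common stimulus for inducing circular vection, using simple equal-power stereo panning. This is only a minimal, hypothetical example (the filename, tone frequency, and rotation rate are arbitrary choices, not taken from the project); a genuine binaural setup would spatialize the source with HRTF filtering rather than amplitude panning.

```python
import math
import struct
import wave

SR = 44100        # sample rate in Hz
DUR = 8.0         # stimulus duration in seconds
ROT_HZ = 0.25     # source revolutions per second around the listener
FREQ = 440.0      # carrier tone frequency in Hz

def pan_gains(azimuth):
    """Equal-power stereo gains for a source at `azimuth` (radians).

    0 = straight ahead, +pi/2 = full right, -pi/2 = full left.
    left**2 + right**2 == 1 for every azimuth, so overall loudness
    stays constant while the source appears to orbit the listener.
    """
    pan = math.sin(azimuth)                # map azimuth to pan in [-1, 1]
    theta = (pan + 1.0) * math.pi / 4.0    # 0 .. pi/2
    return math.cos(theta), math.sin(theta)

def render(path="rotating_source.wav"):
    """Render the orbiting tone and write it as a 16-bit stereo WAV file."""
    frames = bytearray()
    for i in range(int(SR * DUR)):
        t = i / SR
        az = 2.0 * math.pi * ROT_HZ * t            # current source azimuth
        tone = math.sin(2.0 * math.pi * FREQ * t)  # carrier signal
        gl, gr = pan_gains(az)
        frames += struct.pack("<hh",
                              int(tone * gl * 32000),
                              int(tone * gr * 32000))
    with wave.open(path, "wb") as w:
        w.setnchannels(2)
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(SR)
        w.writeframes(bytes(frames))
```

Equal-power panning keeps the perceived loudness steady as the source moves, so any vection reported by a listener is less likely to be an artifact of level changes.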

Professor of Sound in VR

My research interests include virtual acoustics and psychoacoustics, physical modeling, and the design of virtual worlds.