Real-time auditory-visual distance rendering for a virtual reaching task
Mion, Luca; Avanzini, Federico
2007
Abstract
This paper reports on a study on the perception and rendering of distance in multimodal virtual environments. A model for binaural sound synthesis is discussed, and its integration into a real-time system with motion tracking and visual rendering is presented. Results from a validation experiment show that the model effectively simulates relevant auditory cues for distance perception in dynamic conditions. The model is then used in a subsequent experiment on the perception of egocentric distance. The design and preliminary results from this experiment are discussed.

Files in this item:

| File | Description | Type | License | Size | Format |
|---|---|---|---|---|---|
| mion_vrst07.pdf | Paper | Published (Publisher's Version of Record) | Private access - not public | 273.11 kB | Adobe PDF |
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.