Multi-Sensory Virtual Reality

Sungchul Jung | Rob Lindeman

2018 – current

Supporting perceptual-cognitive tasks is an important part of our daily lives. We use rich, multi-sensory feedback through sight, sound, touch, smell, and taste to support the perceptual-cognitive activities we perform, such as playing sports, cooking, and finding our way to a location. Similarly, physical face-to-face collaboration with someone gives a higher-quality experience than mediated communication options, such as phone- or video-based chat. As in real life, demanding perceptual-cognitive scenarios also arise in serious VR simulations, such as surgical or safety training systems, or negotiation between users in VR. However, in contrast to real life, VR simulations are typically limited to visual and auditory cues, sometimes supplemented by simple tactile feedback. The impact of richer, multi-sensory feedback on decision-making tasks in VR is therefore a critical area of research.

In this project, we are building a multi-sensory feedback VR platform for both single-user and multi-user scenarios. Our system produces multi-sensory stimuli, namely visuals, audio, two types of tactile feedback (floor vibration and wind), and smell. We are investigating the impact of this multi-sensory feedback on the perceptual and cognitive responses of human subjects in virtual space.
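The project description does not specify how the platform coordinates its output devices, so the following is only a minimal, hypothetical sketch of one way such a system might dispatch a virtual event across several sensory channels. All names, the latency values, and the scheduling scheme are illustrative assumptions, not the project's actual implementation; the idea shown is simply that slower actuators (fans, scent emitters) can be triggered earlier than faster ones (visuals, audio) so the stimuli are perceived as roughly simultaneous.

```python
import time
from dataclasses import dataclass
from typing import Callable, Dict

def _log(channel: str) -> Callable[[float], None]:
    # Stand-in actuator: in a real system this would drive the HMD renderer,
    # audio engine, floor transducers, fans, or a scent-release device.
    def fire(intensity: float) -> None:
        print(f"[{time.monotonic():.3f}s] {channel}: intensity={intensity:.2f}")
    return fire

@dataclass
class SensoryEvent:
    """A single virtual event expressed as per-channel intensities (0..1)."""
    intensities: Dict[str, float]

class MultiSensoryDispatcher:
    """Fires all sensory channels for an event, compensating for each
    channel's (assumed) actuation latency so the cues arrive together."""

    def __init__(self) -> None:
        # Assumed actuation latencies in seconds (illustrative values only).
        self.latency = {"visual": 0.00, "audio": 0.00,
                        "floor_vibration": 0.02, "wind": 0.40, "smell": 1.00}
        self.actuators: Dict[str, Callable[[float], None]] = {
            ch: _log(ch) for ch in self.latency}

    def trigger(self, event: SensoryEvent) -> None:
        # Fire slower channels first; the fastest channel fires last, so all
        # stimuli should reach the user at roughly the same moment.
        max_lat = max(self.latency[ch] for ch in event.intensities)
        schedule = sorted(event.intensities,
                          key=lambda ch: self.latency[ch], reverse=True)
        start = time.monotonic()
        for ch in schedule:
            delay = max_lat - self.latency[ch]
            while time.monotonic() - start < delay:
                time.sleep(0.001)
            self.actuators[ch](event.intensities[ch])

if __name__ == "__main__":
    dispatcher = MultiSensoryDispatcher()
    # Example: a nearby virtual explosion with strong visuals and audio,
    # floor rumble, a gust of wind, and a faint burning smell.
    dispatcher.trigger(SensoryEvent({"visual": 1.0, "audio": 0.9,
                                     "floor_vibration": 0.8,
                                     "wind": 0.6, "smell": 0.3}))
```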