2025
Aayush Shrestha; Joseph Malloch
Virtual Worlds Beyond Sight: Designing and Evaluating an Audio-Haptic System for Non-Visual VR Exploration (Proceedings Article)
In: Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2025), pp. 1–19, ACM, 2025.
@inproceedings{Shrestha2025,
title = {Virtual Worlds Beyond Sight: Designing and Evaluating an Audio-Haptic System for Non-Visual VR Exploration},
author = {Aayush Shrestha and Joseph Malloch},
url = {https://dl.acm.org/doi/10.1145/3706598.3713400},
doi = {10.1145/3706598.3713400},
year = {2025},
date = {2025-04-26},
urldate = {2025-04-26},
booktitle = {Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2025)},
number = {812},
pages = {1--19},
publisher = {ACM},
abstract = {Contemporary research in Virtual Reality for users who are visually impaired often employs navigation and interaction modalities that are either non-conventional, constrained by physical spaces, or both. We designed and examined a hapto-acoustic VR system that mitigates this by enabling non-visual exploration of large virtual environments using white cane simulation and walk-in-place locomotion. The system features a complex urban cityscape incorporating a physical cane prototype coupled with a virtual cane for rendering surface textures, and an omnidirectional slide mill for navigation. In addition, spatialized audio is rendered based on the propagation of sound through the geometry around the user. A study involving twenty sighted participants evaluated the system through three formative tasks performed while blindfolded to simulate total blindness. Participants were highly successful in completing all the tasks while navigating effectively through the environment. Our work highlights the potential for accessible, non-visual VR experiences, achievable even with minimal training and little prior exposure to VR.},
keywords = {assistive technology, haptics, navigation, spatial audio, virtual environment, VR},
pubstate = {published},
tppubtype = {inproceedings}
}
2024
Aayush Shrestha
Virtual Worlds Beyond Sight: Designing and Evaluating an Audio-Haptic System for Non-Visual VR Exploration (Masters Thesis)
Dalhousie University, 2024.
@mastersthesis{Shrestha2024,
title = {Virtual Worlds Beyond Sight: Designing and Evaluating an Audio-Haptic System for Non-Visual VR Exploration},
author = {Aayush Shrestha},
year = {2024},
date = {2024-08-09},
school = {Dalhousie University},
abstract = {Virtual Reality (VR), predominantly focused on visuospatial rendering in its contemporary approach, has created a conservative narrative that makes VR solely analogous to a mediated visual experience. While accessibility is included in the development phase of commercial VR applications, it is often treated as an add-on, resulting in sub-par virtual experiences that often exclude visually impaired users. This research addresses these limitations by designing a hapto-acoustic VR system that leverages spatial audio and haptic feedback as a sensory substitute for the visual dominance of VR. A large-scale urban virtual environment (VE) was created using the Unity game engine, incorporating a physical cane prototype coupled with a virtual cane for interaction and an omnidirectional slide mill for navigation. A user study with 20 normally sighted participants evaluated and compared the system's effectiveness in texture differentiation and navigation tasks under two conditions: with visual cues and exclusively through audio-haptic feedback. The results indicated that even with minimal training and limited prior VR experience, participants could navigate the environment effectively in non-visual conditions, though at the cost of increased cognitive load and error rates compared to visual conditions. The evaluation highlights the need for improved feedback mechanisms and suggests further validation with visually impaired users. Overall, this research contributes to the development of accessible VR systems through a novel white cane prototype, realistic spatial audio effects, and a comprehensive evaluation demonstrating the system's potential to aid non-visual navigation in a complex, large-scale VE while also engendering empathetic literacy among sighted users.},
keywords = {assistive technology, haptics, navigation, spatial audio, VR},
pubstate = {published},
tppubtype = {mastersthesis}
}