Haptic Vision

Enhancing Navigation for the Visually Impaired through Virtual Reality and Haptic Feedback

Haptic Vision is an assistive technology designed to improve navigation and spatial awareness for individuals with visual impairments. By integrating virtual reality (VR) and haptic feedback, Haptic Vision aims to enhance independence and quality of life for visually impaired users.

Table of Contents

  • Overview
  • Features
  • Technology Stack
  • Installation
  • Usage
  • Development Process
  • Challenges and Solutions
  • Future Work
  • Team
  • Demo and Pitch
  • License
  • Acknowledgements

Visual impairment affects approximately 253 million people worldwide (WHO, 2024), presenting significant challenges in daily activities such as navigation and environmental interaction. Haptic Vision addresses these challenges by leveraging VR and haptic technologies to create a more intuitive and informative navigational experience.

Features

  • Environmental mapping and 3D modeling of surroundings
  • AI-powered object detection for common items (e.g., chairs, tables)
  • Distance approximation using ray casting
  • Haptic feedback with variable frequency (15-30 Hz) based on object proximity
  • Complementary audio cues for enhanced awareness
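The proximity-to-frequency mapping above can be sketched as a linear interpolation between the stated 15 Hz and 30 Hz bounds. This is an illustrative Python sketch, not code from the project; the function name and the 2 m detection range are assumptions.

```python
def vibration_frequency_hz(distance_m: float,
                           max_range_m: float = 2.0,
                           min_hz: float = 15.0,
                           max_hz: float = 30.0) -> float:
    """Map object distance to haptic frequency: closer objects vibrate faster.

    An object at zero distance produces the maximum frequency (30 Hz);
    objects at or beyond max_range_m produce the minimum (15 Hz).
    max_range_m is an assumed detection range, not a project-documented value.
    """
    # Clamp the distance into [0, max_range_m], normalize to [0, 1],
    # then invert so that near objects yield high frequencies.
    t = max(0.0, min(distance_m, max_range_m)) / max_range_m
    return max_hz - t * (max_hz - min_hz)
```

For example, an object half a meter away would drive the gloves at roughly 26 Hz under these assumed parameters, while anything beyond two meters stays at the 15 Hz floor.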

Technology Stack

  • Hardware:

    • VR Headset: HTC Vive Pro Eye
    • Haptic Feedback: HaptX gloves
    • Spatial Tracking: Lighthouse system
  • Software:

    • Development Platform: Unity 2019.4.31f1
    • SDKs:
      • HaptX 2.0.0 beta 8
      • SRWorks
  • Operating System: Windows 11

Installation

  1. Ensure you have Unity 2019.4.31f1 installed on your Windows 11 system.
  2. Clone this repository to your local machine.
  3. Open the project in Unity.
  4. Install the required SDKs:
    • HaptX 2.0.0 beta 8
    • SRWorks
  5. Connect the HTC Vive Pro Eye headset and HaptX gloves to your system.
  6. Set up the Lighthouse tracking system according to the manufacturer's instructions.
  7. Build the project in Unity for your VR platform.

Usage

  1. Put on the HTC Vive Pro Eye headset and HaptX gloves.
  2. Launch the Haptic Vision application.
  3. The system will automatically start mapping your environment.
  4. Move your hands to detect objects in your surroundings:
    • As your hands approach objects, you'll feel vibrations through the HaptX gloves.
    • The intensity of the vibrations increases as you get closer to objects.
    • Audio cues will provide additional information about detected objects.
  5. Use the haptic and audio feedback to navigate your environment safely.
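The feedback loop in steps 3–5 might be approximated as follows: per detected object, distance drives both a vibration intensity and an optional audio announcement. All names, thresholds, and ranges here are illustrative assumptions, not values from the project.

```python
def feedback_for_object(label: str, distance_m: float,
                        audio_threshold_m: float = 1.0):
    """Return (vibration_intensity, audio_cue) for one detected object.

    Intensity rises linearly as the object gets closer: 1.0 at contact,
    0.0 at or beyond an assumed 2 m range. An audio cue naming the object
    fires only once it is within audio_threshold_m. All constants are
    assumptions for illustration.
    """
    max_range_m = 2.0
    # Closeness in [0, 1]: 0 when out of range, 1 at contact.
    closeness = 1.0 - min(distance_m, max_range_m) / max_range_m
    # Announce the object class only when it is near enough to matter.
    audio_cue = f"{label} ahead" if distance_m <= audio_threshold_m else None
    return closeness, audio_cue
```

Under these assumptions, a chair half a meter away yields a strong vibration plus a "chair ahead" cue, while a table at 1.5 m yields only a faint vibration.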

Development Process

  1. Problem Definition: Identified specific challenges faced by visually impaired individuals in navigation.
  2. Solution Ideation: Collaborated on potential approaches, focusing on haptic feedback as the primary modality.
  3. Prototype Development: Integrated VR and haptic technologies using Unity, implementing object detection and distance approximation algorithms.
  4. Testing and Refinement: Conducted user testing with visually impaired individuals to gather feedback and improve the system.

Challenges and Solutions

| Challenge | Solution |
| --- | --- |
| SRWorks compatibility | Used Unity v19.1 as recommended by VIVE mentors |
| Hand tracking with SRWorks | Opted for a fixed orientation and adjusted hand position for mesh interaction |
| Distance approximation accuracy | Cast hands a few units in front of their actual position for smoother interaction |
| Limited testing devices | Focused on frequency adjustment with constant amplitude for haptic feedback |
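The hand-offset workaround for distance approximation can be illustrated with a small vector sketch: the ray-cast origin is pushed a fixed number of units along the hand's forward direction before the distance query. The `Vec3` helper, function name, and 0.5-unit offset are assumptions for illustration, not values from the project.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def scaled(self, s: float) -> "Vec3":
        return Vec3(self.x * s, self.y * s, self.z * s)

    def added(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

def offset_ray_origin(hand_pos: Vec3, forward: Vec3,
                      offset_units: float = 0.5) -> Vec3:
    """Shift the ray-cast origin a few units ahead of the tracked hand.

    Casting from slightly in front of the hand smooths interaction with the
    reconstructed environment mesh; offset_units = 0.5 is an illustrative
    value, not one documented by the project.
    """
    return hand_pos.added(forward.scaled(offset_units))
```

With a hand at the origin facing along +z, the ray would originate half a unit ahead of the physical hand position, avoiding self-intersection with the hand's own mesh.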

Future Work

  • Enabling audio feedback
  • Implementing active scanning with an on/off feature
  • Developing a smaller, more discreet headset
  • Improving vision through XR glasses recalibration
  • Exploring waypoint guidance functionality

Team

This project was made possible by the dedicated efforts of the Haptic Vision team: Christine, Leon, Winny, Malcolm, and Kyle.

Demo and Pitch

License

This project is licensed under the MIT License. See the LICENSE file for details.

Acknowledgements

We would like to thank:

  • The VIVE team for their support and guidance
  • Our visually impaired tester, Chris MacNally, for their valuable feedback
  • The Unity and HaptX communities for their resources and support