
MIT Reality Hackathon - Semi-Finalist
First group in history to make a haptic XR experience with the HaptX Haptic Gloves
Haptic Vision is an assistive technology designed to enhance wayfinding and navigation of spatial environments for people who are visually impaired. The technology uses haptic feedback and virtual reality (VR) to create more opportunities for independence and a greater quality of life for people with varying levels of visual impairment.
Institution: Massachusetts Institute of Technology
Date: January 2023 (2.5-day hackathon)
Collaborators: Leon Kipkoech (Developer), Winny Wang (UX Designer), Kyle Diaz-Castro (UX Designer), Malcolm (Developer)
Contribution: Ideation and conceptual development of the potential application of the HaptX Gloves for individuals with visual impairment; brainstorming; user and literature review; Unity haptic glove prototyping; user testing and feedback documentation; and visual presentation
Inspired by a past project, Bridging the Lighthouse, and the study of architectural haptic design methodology for individuals with visual impairment.
Haptic Vision is an inclusive technology that allows individuals to navigate their physical surroundings with greater ease and understanding. By using sound and haptic feedback, Haptic Vision acts as an extension of a traditional cane, enabling users to sense the presence of nearby objects and furniture. The user wears a VR headset and HaptX Gloves, and as their hands approach objects such as furniture, they feel a vibration as if they were physically touching the object. The closer they get to the object, the stronger and more detailed the sensation becomes, providing a clear understanding of its location and size.
Individuals with visual impairment face significant barriers when navigating and interacting with the physical world, which can lead to a lack of independence and a reduced quality of life. According to the WHO, approximately 253 million people worldwide are visually impaired. Many everyday tasks, such as grocery shopping, traveling, and using public transportation, can be difficult or even impossible for visually impaired individuals to accomplish without assistance. Currently, visually impaired individuals rely on tools such as support canes and guide dogs, but these tools have limitations.
Physical environment 3D mapping
AI-powered object detection
Ray casting for distance detection
Haptic feedback with variable frequency (15-30 Hz) based on object proximity
Tools Utilized:
The project was developed using Unity and the HTC Vive Pro Eye with Lighthouse base stations, since HaptX Glove hand tracking is compatible with any Windows-based VR HMD that uses Lighthouse tracking.
The following list summarizes the key features and technologies used:
Technologies: HaptX Haptic Gloves, HTC Vive Pro Eye, Unity 2019.4.31f1
SDKs: HaptX Gloves SDK 2.0.0 beta 8, SRworks
Physical Environment and Mixed Reality: Using SRworks, we created a 3D model of the environment.
Hand Tracking: Implemented using the HaptX Haptic Gloves SDK, which is compatible with the same version of Unity used for SRworks.
Object Detection: SRworks was used for object detection; the AI model provided by SRworks can identify common objects such as chairs and tables.
Distance Approximation: Ray casting, built into Unity, determined the distance from the hand to an object.
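As an illustrative sketch (not the project's exact code), this distance check can be done with Unity's built-in Physics.Raycast; the hand transform, ray direction, and search range below are assumed placeholders.

```csharp
using UnityEngine;

// Minimal sketch: cast a ray from the tracked hand to estimate the
// distance to the nearest collider. "handTransform" and "maxRange"
// are illustrative placeholders, not the project's actual fields.
public class HandDistanceProbe : MonoBehaviour
{
    public Transform handTransform;   // tracked hand (e.g., a glove anchor)
    public float maxRange = 2.0f;     // metres to search ahead of the hand

    // Returns the distance to the first object hit, or maxRange if nothing is hit.
    public float ProbeDistance()
    {
        if (Physics.Raycast(handTransform.position, handTransform.forward,
                            out RaycastHit hit, maxRange))
        {
            return hit.distance;
        }
        return maxRange;
    }
}
```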
Haptic Sensation: Based on the distance, we set the frequency and amplitude of the gloves, with a maximum frequency of 30 Hz and a minimum of 15 Hz.
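Below is a hedged sketch of that distance-to-feedback mapping (closer hand means higher frequency and stronger amplitude); SendToGlove is a hypothetical stand-in, since the actual HaptX SDK calls are not reproduced here.

```csharp
using UnityEngine;

// Sketch of the proximity-to-haptics mapping described above:
// closer hand => higher frequency (up to 30 Hz) and stronger amplitude.
// SendToGlove() is a hypothetical placeholder for the HaptX SDK call.
public class ProximityHaptics : MonoBehaviour
{
    const float MinFreqHz = 15f;
    const float MaxFreqHz = 30f;
    public float maxRange = 2.0f;     // distance at which feedback fades out

    public void ApplyFeedback(float distance)
    {
        // 1 when touching the object, 0 at or beyond maxRange.
        float proximity = Mathf.Clamp01(1f - distance / maxRange);

        float frequency = Mathf.Lerp(MinFreqHz, MaxFreqHz, proximity);
        float amplitude = proximity;   // normalized 0..1 strength

        SendToGlove(frequency, amplitude);
    }

    void SendToGlove(float frequency, float amplitude)
    {
        // Placeholder: in the project this would go through the
        // HaptX Gloves SDK; the real API is not reproduced here.
    }
}
```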






This project would not have been possible without each person on the team, our mentor for the hackathon, Rui P. G. Pereira, and lastly Chris McNally, a judge and user tester with visual impairment.
Team:
Leon Kipkoech (Developer), Winny Wang (UX Designer), Kyle Diaz-Castro (UX Designer), Malcolm (Developer)