TAS Lab Presents Wearable Navigation Concept for Visually Impaired Individuals at Nanchang Health Expo
NANCHANG – June 6, 2025 – At the Nanchang Health Expo today, the TAS Lab presented an innovative concept for a wearable navigation system designed to enhance independence for visually impaired individuals. The system aims to give users a deeper, more interactive understanding of their surroundings, with the goal of significantly improving mobility and safety.

The proposed design is built on a sophisticated technology stack. It fuses data from an IMU, GNSS, and depth cameras to achieve precise 3D environmental mapping and pathfinding. An onboard AI vision engine would identify both dynamic and static obstacles and let users locate specific objects through simple voice commands. Interaction is envisioned through a Large Language Model, enabling voice-based Q&A and detailed environmental descriptions.
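To make the fusion idea concrete, the following is a minimal sketch of how IMU dead reckoning might be blended with GNSS fixes. It uses a 1-D complementary filter for illustration; the class name, blending gain, and update rates are assumptions for this sketch, not details of TAS Lab's actual design, which would involve full 3D state estimation and depth-camera data.

```python
# Minimal sketch: IMU acceleration is dead-reckoned at high rate,
# and each (slower, noisy but drift-free) GNSS fix pulls the
# estimate back via a complementary filter. All names and the
# blending gain here are illustrative assumptions.

class ComplementaryPositionFilter:
    def __init__(self, blend: float = 0.02):
        self.blend = blend      # weight given to each GNSS correction
        self.position = 0.0     # metres along the path
        self.velocity = 0.0     # metres per second

    def predict(self, accel: float, dt: float) -> None:
        """Dead-reckon from IMU acceleration (high rate, drifts over time)."""
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def correct(self, gnss_position: float) -> None:
        """Blend in an absolute GNSS fix to cancel accumulated drift."""
        self.position += self.blend * (gnss_position - self.position)


if __name__ == "__main__":
    f = ComplementaryPositionFilter()
    for step in range(100):
        f.predict(accel=0.1, dt=0.05)       # 20 Hz IMU samples
        if step % 20 == 0:                  # 1 Hz GNSS fixes
            f.correct(gnss_position=step * 0.01)
    print(f"fused position estimate: {f.position:.2f} m")
```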

A core innovation of the concept is its dual-feedback system, which leverages the complementary strengths of haptic and auditory cues.
- Haptic Feedback: Navigation cues would be sent as intuitive vibrations via a low-latency haptic wearable device. This method excels at delivering fast, ‘what-to-do’ commands, such as an urgent alert to dodge an obstacle, ensuring immediate physical safety with minimal cognitive load.
- Auditory Feedback: In contrast, auditory feedback provides the crucial ‘what-is-there’ and ‘why’ context. An AI-powered voice chat would describe the environment in detail, explaining the nature and location of obstacles and other complex information (a simplified routing sketch follows this list).
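One plausible way to realize this split is to route each obstacle event by urgency: time-critical events trigger an immediate haptic cue, while descriptive context is spoken. The sketch below illustrates that idea; the event fields, the time-to-contact threshold, and the output functions are hypothetical stand-ins, not TAS Lab's published interfaces.

```python
# Illustrative sketch of the dual-feedback split described above:
# urgent events get a 'what-to-do' haptic pulse, and all events
# get 'what-is-there' spoken context. Names and thresholds are
# assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class ObstacleEvent:
    label: str                 # e.g. "cyclist", "kerb"
    distance_m: float          # range to the obstacle
    time_to_contact_s: float   # estimated time until collision

URGENT_TTC_S = 1.5             # assumed threshold for a reflexive alert

def send_haptic_pulse(direction: str) -> None:
    print(f"[haptic] vibrate: dodge {direction}")   # stand-in for the wearable's API

def speak(text: str) -> None:
    print(f"[voice] {text}")                        # stand-in for TTS output

def dispatch(event: ObstacleEvent, direction: str) -> None:
    if event.time_to_contact_s < URGENT_TTC_S:
        # 'what-to-do': immediate, low-cognitive-load physical cue
        send_haptic_pulse(direction)
    # 'what-is-there': richer context delivered as speech
    speak(f"{event.label} {event.distance_m:.0f} metres ahead")

if __name__ == "__main__":
    dispatch(ObstacleEvent("cyclist", 3.0, 1.0), direction="left")
    dispatch(ObstacleEvent("bench", 8.0, 6.0), direction="right")
```

In this arrangement the haptic channel never waits on speech generation, which is what keeps the reflexive safety path low-latency.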

This synergy allows users to perform reflexive safety actions through touch while gaining a deeper environmental understanding through sound, creating a much safer and more comprehensive experience than either modality could provide alone.