TAS Lab (Trustworthy AI and Autonomous Systems Laboratory)

TAS Lab Presents Wearable Navigation Concept for Visually Impaired Individuals at Nanchang Health Expo

NANCHANG – June 6, 2025 – At the Nanchang Health Expo today, the TAS Lab presented an innovative concept for a wearable navigation system designed to enhance independence for visually impaired individuals. The goal is to create a system that offers a deeper, more interactive understanding of the user’s surroundings to significantly improve mobility and safety.

PolyU Stage at Nanchang Health Expo

The proposed design is built on a sophisticated technology stack. It fuses data from an IMU, GNSS, and depth cameras to achieve precise 3D environmental mapping and pathfinding. An onboard AI vision engine would identify both static and dynamic obstacles and let users locate specific objects with simple voice commands. Interaction is envisioned through a Large Language Model, enabling voice-based Q&A and detailed environmental descriptions.
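To illustrate the multi-sensor fusion idea, the sketch below shows a minimal complementary filter that blends high-rate IMU dead reckoning (smooth but drifting) with occasional GNSS fixes (noisy but drift-free) for a single position axis. All names, values, and the blending weight are illustrative assumptions for this article, not TAS Lab code; a real system would use a full Kalman or factor-graph estimator over 3D state.

```python
# Hypothetical sketch of IMU + GNSS fusion; parameters are illustrative only.

def fuse_position(imu_estimate: float, gnss_fix: float, alpha: float = 0.98) -> float:
    """Blend the IMU-propagated position with a GNSS fix.

    alpha close to 1 trusts the smooth IMU estimate between fixes;
    the (1 - alpha) share of the GNSS fix corrects long-term drift.
    """
    return alpha * imu_estimate + (1.0 - alpha) * gnss_fix

# Propagate with IMU velocity samples, then correct when a GNSS fix arrives.
position = 0.0
imu_velocities = [1.0, 1.0, 1.0, 1.0]   # m/s, one sample per dt
gnss_fixes = [None, None, None, 3.9]    # a fix arrives on the last sample
dt = 1.0

for v, fix in zip(imu_velocities, gnss_fixes):
    position += v * dt                  # IMU dead reckoning
    if fix is not None:
        position = fuse_position(position, fix)

print(round(position, 3))               # prints 3.998
```

The same weighted-correction pattern generalizes to the depth-camera channel: vision-derived landmarks would supply additional corrections to the drifting inertial estimate.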

Prof. Wen and Xiangru at Nanchang Health Expo

A core innovation of the concept is its dual-feedback system, which leverages the complementary strengths of haptic and auditory cues.

Xiangru with our project

This synergy allows users to perform reflexive safety actions through touch while gaining a deeper environmental understanding through sound, creating a much safer and more comprehensive experience than either modality could provide alone.
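The split described above can be sketched as a simple routing rule: imminent hazards go to the haptic channel for a reflexive response, while non-urgent context goes to the audio channel for richer description. The event types, distance threshold, and channel names here are hypothetical illustrations, not the lab's actual design.

```python
# Hypothetical sketch of dual-feedback routing; threshold and names are assumptions.

def route_feedback(event_type: str, distance_m: float) -> str:
    """Route imminent obstacles to touch, everything else to speech."""
    if event_type == "obstacle" and distance_m < 1.5:
        return "haptic"   # short vibration: react now
    return "audio"        # spoken description: build understanding

print(route_feedback("obstacle", 0.8))   # prints haptic
print(route_feedback("landmark", 10.0))  # prints audio
```

Keeping the safety-critical path on the low-latency haptic channel while reserving audio for description is what lets the two modalities complement rather than compete with each other.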
