TAS LAB: Trustworthy AI and Autonomous Systems Laboratory

🔒 Safety-certifiable Multi-Sensor Fusion for Robotic Navigation in Urban Scenes

Visual and LiDAR SLAM methods struggle in complex urban scenarios, especially when safety certification is required for autonomous systems. This project studies how dynamic scenes degrade visual/LiDAR SLAM, and develops safety-certifiable navigation algorithms that can quantify and guarantee the reliability of localization results. We aim to answer three questions: how dynamic objects affect the state estimation of visual/LiDAR SLAM methods, how to improve their robustness, and how to provide safety-quantifiable localization for robots in complex urban environments.
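As a minimal illustration of how measurements corrupted by dynamic objects can be detected, the sketch below applies a chi-square test on measurement residuals. This is one standard integrity-monitoring technique, not necessarily the method used in this project; the threshold and noise values are hypothetical.

```python
import numpy as np

def chi_square_gate(residual, S, threshold):
    """Accept a measurement if its Mahalanobis distance passes the gate.

    residual  : innovation z - h(x), shape (m,)
    S         : innovation covariance, shape (m, m)
    threshold : chi-square critical value for m degrees of freedom
    """
    d2 = residual @ np.linalg.solve(S, residual)  # squared Mahalanobis distance
    return d2 <= threshold

# Example: 2-D position residuals, 95% gate (chi2 critical value for
# 2 degrees of freedom is about 5.991). A landmark on a moving object
# produces a large, inconsistent residual and is rejected.
S = np.diag([0.25, 0.25])                # assumed 0.5 m std per axis
static_res = np.array([0.3, -0.2])       # consistent with a static landmark
dynamic_res = np.array([2.5, 1.8])       # offset caused by a moving object

print(chi_square_gate(static_res, S, 5.991))   # accepted
print(chi_square_gate(dynamic_res, S, 5.991))  # rejected
```

Rejected residuals can be excluded from the state update, and the fraction of rejections used as one input to a reliability (integrity) measure for the localization result.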

Multi-Sensor Fusion

GNSS/LiDAR/Visual/INS Integration for Robotics Navigation

Safety-certifiable Visual Localization

Safety-certifiable Visual Localization with 3D Prior Map


Recent News

Video Demonstration

Safety-quantifiable Line Feature-based Monocular Visual Localization with 3D Prior Map

Multi-sensor Integration Navigation System for Autonomous Driving

Demonstration: Low-cost Solid-state LiDAR/Inertial Based Localization with Prior Map

Presentation in ION GNSS+ 2021: A Coarse-to-Fine LiDAR-Based SLAM with Dynamic Object Removal

Presentation in ION GNSS+ 2021: Continuous GNSS-RTK Aided by LiDAR/Inertial Odometry


Acknowledgement and Collaborators

This research is supported by academic, government, and industry partners, including The Hong Kong Polytechnic University, the Guangdong Basic and Applied Basic Research Foundation, and Huawei Technologies. We collaborate with leading research groups in multi-sensor fusion and safety-certifiable navigation.

Funding and Collaborators

Projects (3)

Safety-certified Multi-source Fusion Positioning for Autonomous Vehicles in Complex Scenarios

Innovation and Technology Commission

AI assisted inertial navigation system

This project aims to develop a deep learning-based inertial navigation algorithm that uses accelerometer, gyroscope, and magnetometer data from smart wearables and smartphones to infer the user's position and movement trajectory, together with corresponding confidence levels.
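For context, the sketch below shows the classical 2-D strapdown dead reckoning that such a learned model would replace or correct: gyroscope yaw rate and body-frame acceleration are integrated into a trajectory. This is a generic textbook integration, not the project's algorithm, and the sample values are hypothetical; a learned model would additionally attach a confidence (e.g. a predicted covariance) to each estimate.

```python
import numpy as np

def dead_reckon_2d(gyro_z, accel_body, dt, yaw0=0.0):
    """Integrate yaw rate (rad/s) and body-frame acceleration (m/s^2).

    gyro_z     : (N,) yaw-rate samples
    accel_body : (N, 2) body-frame ax, ay samples
    Returns (N+1, 2) positions starting at the origin.
    """
    yaw, vel = yaw0, np.zeros(2)
    pos = [np.zeros(2)]
    for w, a_b in zip(gyro_z, accel_body):
        yaw += w * dt                                # integrate heading
        c, s = np.cos(yaw), np.sin(yaw)
        a_world = np.array([c * a_b[0] - s * a_b[1],  # rotate acceleration
                            s * a_b[0] + c * a_b[1]])  # into the world frame
        vel = vel + a_world * dt                     # integrate velocity
        pos.append(pos[-1] + vel * dt)               # integrate position
    return np.array(pos)

# Constant forward acceleration, no turning: motion stays along +x.
path = dead_reckon_2d(np.zeros(100), np.tile([1.0, 0.0], (100, 1)), dt=0.01)
print(path[-1])  # ~ [0.505, 0.0] after 1 s of 1 m/s^2 forward acceleration
```

Small sensor biases accumulate quadratically under this double integration, which is why learned corrections and explicit confidence estimates are valuable for consumer-grade IMUs.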

Development of an Assisted Navigation and Collision Avoidance System using AI and Location-based Service

Abstract