Research Topics
Our research aims to build the algorithmic foundations for embodied AI that enable trustworthy perception, navigation, and control of autonomous systems. We develop practical embodied AI-driven autonomous systems, including drones, intelligent vehicles, and legged/humanoid robots, with end-to-end learning and safety-certification capabilities, enabling them to perceive, reason, and interact with the physical world safely and reliably. Our work spans large AI models for autonomous systems, foundation models and vision-language-action models for robotic perception and control, AI-enabled multi-sensor fusion, and software-hardware co-design for efficient embodied AI systems.
Research Directions:
1) 3D LiDAR Aided GNSS Positioning for Robotics Navigation — AI-driven GNSS positioning (RTK, PPP, PPP-RTK), 3D LiDAR aided NLOS/multipath mitigation, multi-sensor fusion for robust urban navigation (see the skymask sketch after this list);
2) Safety-Certifiable Multi-Sensor Fusion for Robotic Navigation in Urban Scenes — safety-certifiable AI for autonomous navigation, AI-enabled multi-sensor fusion (LiDAR/Camera/IMU/GNSS), integrity monitoring and navigation-control joint optimization (see the fusion sketch after this list);
3) End-to-End and Safety-Certifiable Autonomous Vehicles for Logistics Applications — end-to-end learning for self-driving, safety certification for logistics applications, V2X-assisted connected autonomous driving;
4) Embodied AI for Humanoid/Legged Robotics — large AI models and vision-language-action models for robotic perception and control, bio-inspired embodied intelligence, multimodal learning for humanoid/legged robots;
5) Embodied Drones for City Maintenance and Manipulation — intelligent drones and UAV swarm systems, aerial manipulation for urban infrastructure, software-hardware co-design for efficient embodied AI drone systems.
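As a simplified illustration of the 3D LiDAR aided NLOS mitigation theme in direction 1, the Python sketch below builds a per-azimuth elevation mask (a "skymask") from a LiDAR point cloud expressed in a local ENU frame around the GNSS antenna, then flags satellites whose elevation falls below the surrounding building boundary as likely NLOS. All names (`Satellite`, `build_skymask`, `classify_los`) and the 5-degree azimuth binning are illustrative assumptions, not our actual implementation.

```python
# Hypothetical sketch: exclude likely-NLOS satellites using a LiDAR-derived
# skymask. Names and parameters are illustrative only.
import numpy as np
from dataclasses import dataclass


@dataclass
class Satellite:
    prn: str
    azimuth_deg: float    # azimuth of the satellite seen from the receiver
    elevation_deg: float  # elevation of the satellite seen from the receiver


def build_skymask(points_enu: np.ndarray, az_bin_deg: float = 5.0) -> np.ndarray:
    """Build a per-azimuth elevation mask from LiDAR points in a local ENU
    frame centered at the GNSS antenna.

    points_enu: (N, 3) array of east/north/up coordinates in meters.
    Returns an array of mask elevations (deg), one per azimuth bin.
    """
    east, north, up = points_enu[:, 0], points_enu[:, 1], points_enu[:, 2]
    horiz = np.hypot(east, north)
    azimuth = np.degrees(np.arctan2(east, north)) % 360.0   # 0 deg = north
    elevation = np.degrees(np.arctan2(up, horiz))

    n_bins = int(360.0 / az_bin_deg)
    mask = np.zeros(n_bins)
    bins = (azimuth // az_bin_deg).astype(int) % n_bins
    for b in range(n_bins):
        elev_in_bin = elevation[bins == b]
        if elev_in_bin.size:
            mask[b] = max(0.0, elev_in_bin.max())  # highest obstruction in this direction
    return mask


def classify_los(sats, skymask, az_bin_deg: float = 5.0):
    """Split satellites into likely-LOS and likely-NLOS sets using the skymask."""
    los, nlos = [], []
    for s in sats:
        b = int(s.azimuth_deg // az_bin_deg) % len(skymask)
        (los if s.elevation_deg > skymask[b] else nlos).append(s)
    return los, nlos
```

Flagged satellites could then be excluded or down-weighted in the positioning solution; a real system would additionally have to handle reflections, dynamic objects, and uncertainty in the mask itself.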
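In the same spirit, for the AI-enabled multi-sensor fusion and integrity monitoring theme in direction 2, a minimal loosely coupled filter that predicts with odometry/IMU-derived velocity and corrects with GNSS position fixes might look like the following sketch. The state layout, noise values, and class name `LooseFusionKF` are hypothetical; the returned innovation and its covariance are the quantities a chi-square consistency test would use as a first step toward integrity monitoring.

```python
# Minimal sketch of loosely coupled fusion: a constant-velocity Kalman filter
# that predicts with odometry/IMU-derived motion and corrects with GNSS
# position fixes. State is [x, y, vx, vy]; all values are illustrative.
import numpy as np


class LooseFusionKF:
    def __init__(self, pos0, vel0, pos_var=25.0, vel_var=4.0):
        self.x = np.hstack([pos0, vel0]).astype(float)        # [x, y, vx, vy]
        self.P = np.diag([pos_var, pos_var, vel_var, vel_var])
        self.H = np.hstack([np.eye(2), np.zeros((2, 2))])     # GNSS observes position only

    def predict(self, dt, accel_noise=0.5):
        """Propagate the state with a constant-velocity model over dt seconds."""
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        q = accel_noise ** 2
        # Process noise for a white-acceleration model
        Q = q * np.array([[dt**4 / 4, 0,         dt**3 / 2, 0        ],
                          [0,         dt**4 / 4, 0,         dt**3 / 2],
                          [dt**3 / 2, 0,         dt**2,     0        ],
                          [0,         dt**3 / 2, 0,         dt**2    ]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update_gnss(self, z_pos, meas_var=4.0):
        """Correct with a GNSS position fix; return innovation and its covariance."""
        R = meas_var * np.eye(2)
        y = np.asarray(z_pos, float) - self.H @ self.x         # innovation
        S = self.H @ self.P @ self.H.T + R                     # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)               # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return y, S   # usable for a chi-square consistency/integrity check
```

A tightly coupled or factor-graph formulation, as is common in LiDAR/Camera/IMU/GNSS fusion research, would replace this single linear filter, but the predict, correct, and check pattern sketched here stays the same.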