TAS LAB: Trustworthy AI and Autonomous Systems Laboratory

Our Autonomous Platforms

A demonstration of our autonomous driving vehicles (ADVs) and their onboard sensor platforms.

Introduction

An autonomous car, also known as a self-driving vehicle, is a sophisticated mode of transportation that can perceive its environment and navigate without human intervention. These vehicles employ a variety of advanced technologies to achieve safe and efficient driving, making them a significant innovation in modern transportation.

A critical aspect of autonomous vehicles is their ability to sense and localize themselves within their surroundings. This capability is essential for navigating complex environments, avoiding obstacles, and making real-time driving decisions. Accurate sensing and localization allow autonomous cars to interpret data from their surroundings and respond appropriately to dynamic conditions.

The autonomous driving vehicle operates under the comprehensive control of a CAN bus system. The host computer connects to the vehicle's MCU through an integrated ROS messaging interface: control commands published as ROS messages are converted into CAN signals and then transmitted to the MCU.
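
To make this bridge concrete, here is a minimal sketch of a ROS node that turns velocity commands into CAN frames using the python-can library. The CAN ID (0x101), the payload encoding, and the /cmd_vel topic are illustrative assumptions, not the vehicle's actual protocol.

    import can
    import rospy
    from geometry_msgs.msg import Twist

    # Hypothetical CAN ID for the velocity command frame (not the real protocol).
    CAN_ID_SPEED = 0x101

    # Open the SocketCAN interface that links the host computer to the MCU.
    bus = can.interface.Bus(channel='can0', bustype='socketcan')

    def cmd_vel_callback(msg):
        # Assumed encoding: signed 16-bit speed in 0.01 m/s units, big-endian,
        # zero-padded to the 8-byte CAN payload.
        speed = int(msg.linear.x * 100)
        payload = speed.to_bytes(2, 'big', signed=True) + bytes(6)
        bus.send(can.Message(arbitration_id=CAN_ID_SPEED,
                             data=payload, is_extended_id=False))

    rospy.init_node('ros_to_can_bridge')
    rospy.Subscriber('/cmd_vel', Twist, cmd_vel_callback)
    rospy.spin()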

This architecture gives us extensive access to the vehicle's functions. We can not only relay velocity commands but also manage gear settings, including Drive (D), Park (P), Reverse (R), and Neutral (N). The system also enables control of various lighting functions, enhancing both safety and operational efficiency. Overall, this setup ensures seamless communication between components, enabling precise control and monitoring of the vehicle's performance.
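
Gear selection can be sketched in the same way. The frame ID (0x102) and the one-byte gear encoding below are assumptions for illustration; the real mapping is defined by the vehicle's CAN protocol specification.

    import can

    # Hypothetical CAN ID and gear codes; actual values come from the
    # vehicle's CAN protocol specification.
    GEAR_CAN_ID = 0x102
    GEARS = {'P': 0x00, 'R': 0x01, 'N': 0x02, 'D': 0x03}

    def send_gear(bus, gear):
        # Gear code goes in byte 0; the remaining payload bytes are zero-padded.
        bus.send(can.Message(arbitration_id=GEAR_CAN_ID,
                             data=[GEARS[gear]] + [0] * 7,
                             is_extended_id=False))

    bus = can.interface.Bus(channel='can0', bustype='socketcan')
    send_gear(bus, 'D')  # shift into Drive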

Sensor Platform

Currently, our lab has two autonomous vehicles deployed on the PolyU Main Campus and at the PolyU-Wuxi Research Institute. Both vehicles are equipped with a full sensor suite, including LiDAR, cameras, and an integrated GNSS/INS, for localization and navigation.

Here is the sensor suite:

Sensor Type | Brand/Model           | Parameters
LiDAR       | Robosense RS-LiDAR-32 | 32 laser channels, 200 m range, 360° horizontal FOV, 30° vertical FOV, 10-20 Hz scanning frequency
Cameras     | HikRobot event camera | 1280x720 resolution, 120 dB dynamic range, 60 fps frame rate, global shutter
GNSS/INS    | CHCNav GNSS/INS       | Dual-frequency GNSS receiver, integrated IMU, centimeter-level accuracy, real-time kinematic (RTK) support
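
On the software side, a minimal ROS listener for this suite could look like the sketch below. The topic names follow common driver defaults (e.g., /rslidar_points for the Robosense driver) but are assumptions, not our exact configuration.

    import rospy
    from sensor_msgs.msg import Image, NavSatFix, PointCloud2

    def lidar_cb(cloud):
        # PointCloud2 stores the scan as a height x width grid of points.
        rospy.loginfo("LiDAR scan: %d points", cloud.width * cloud.height)

    def gnss_cb(fix):
        rospy.loginfo("GNSS fix: lat %.7f, lon %.7f", fix.latitude, fix.longitude)

    rospy.init_node('sensor_suite_listener')
    rospy.Subscriber('/rslidar_points', PointCloud2, lidar_cb)      # Robosense default (assumed)
    rospy.Subscriber('/camera/image_raw', Image, lambda msg: None)  # camera stream (assumed topic)
    rospy.Subscriber('/gnss/fix', NavSatFix, gnss_cb)               # CHCNav fix (assumed topic)
    rospy.spin()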

ADV Demo Video

[Video: Testing]

[Video: ADV in PolyU Campus]

[Video: ADV in PolyU-Wuxi Research Institute]

Carla Simulation Video

[Video: Carla Simulation]

Researchers

Dr. Weisong Wen, Mr. Zhang Ziqi, Mr. Huang Feng
