Assisted and Automated Driving Research Platform
A research platform for experimentation across the full assisted and automated driving stack: perception, planning, control, connectivity, and remote operation. The perception layer fuses data from LiDAR, radar, and cameras and provides real-time object detection and tracking for situational awareness. Planning and control modules support research in motion prediction, trajectory generation, and adaptive vehicle control for both assisted and fully automated driving. Connectivity features enable experiments in cooperative driving via Vehicle-to-Everything (V2X) communication, and the tele-driving module supports studies of remote vehicle operation and supervision. Together, these capabilities make the platform a testbed for research in autonomous mobility, safety, and human–vehicle interaction.
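To make the object-tracking capability concrete, here is a minimal sketch of the kind of filter a fusion/tracking module might run per tracked object: a constant-velocity Kalman filter over a 2-D position measurement. The class name, state layout, and noise values are illustrative assumptions, not the platform's actual implementation.

```python
import numpy as np

class CVKalmanTracker:
    """Hypothetical constant-velocity tracker; state = [x, y, vx, vy]."""

    def __init__(self, dt=0.1, q=0.1, r=0.5):
        self.x = np.zeros(4)               # state estimate
        self.P = np.eye(4) * 10.0          # state covariance (uncertain start)
        self.F = np.eye(4)                 # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))          # we observe position only
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * q             # process noise (illustrative value)
        self.R = np.eye(2) * r             # measurement noise (illustrative value)

    def step(self, z):
        # Predict with the motion model, then correct with measurement z = [x, y].
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x.copy()
```

Fed a stream of position detections (e.g. fused LiDAR/radar/camera centroids), the filter smooths the track and estimates velocity, which downstream prediction and planning can consume.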
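As a sketch of the trajectory-generation capability, the snippet below fits a quintic polynomial to position/velocity/acceleration boundary conditions, a standard building block in motion-planning research (minimum-jerk style profiles). Function names and the 1-D formulation are illustrative assumptions, not the platform's actual planner.

```python
import numpy as np

def quintic_coeffs(s0, v0, a0, sT, vT, aT, T):
    """Coefficients of s(t) = c0 + c1*t + ... + c5*t^5 that satisfy the
    position, velocity, and acceleration constraints at t=0 and t=T."""
    A = np.array([
        [1, 0, 0,    0,       0,        0       ],  # s(0)
        [0, 1, 0,    0,       0,        0       ],  # s'(0)
        [0, 0, 2,    0,       0,        0       ],  # s''(0)
        [1, T, T**2, T**3,    T**4,     T**5    ],  # s(T)
        [0, 1, 2*T,  3*T**2,  4*T**3,   5*T**4  ],  # s'(T)
        [0, 0, 2,    6*T,     12*T**2,  20*T**3 ],  # s''(T)
    ])
    b = np.array([s0, v0, a0, sT, vT, aT])
    return np.linalg.solve(A, b)

def eval_poly(c, t):
    """Evaluate the polynomial with coefficients c at time t."""
    return sum(ci * t**i for i, ci in enumerate(c))
```

Sampling such a polynomial per axis yields a smooth reference trajectory that an adaptive controller can track; lateral and longitudinal profiles are typically generated separately and combined.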
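For the V2X capability, a cooperative-driving experiment typically exchanges periodic vehicle-status broadcasts. The sketch below packs and unpacks a minimal CAM-like status payload with Python's `struct`; note this is not the ETSI ITS CAM wire format (which is ASN.1 UPER-encoded), and the field layout and scaling factors here are illustrative assumptions only.

```python
import struct

# Illustrative layout: station_id uint32, timestamp_ms uint32,
# lat/lon int32 in 1e-7 degrees, speed uint16 in 0.01 m/s,
# heading uint16 in 0.1 degrees; network byte order.
CAM_FMT = "!IIiiHH"

def pack_cam(station_id, timestamp_ms, lat_deg, lon_deg, speed_mps, heading_deg):
    """Serialize one status message into a fixed-size binary payload."""
    return struct.pack(CAM_FMT, station_id, timestamp_ms,
                       int(lat_deg * 1e7), int(lon_deg * 1e7),
                       int(speed_mps * 100), int(heading_deg * 10))

def unpack_cam(payload):
    """Deserialize a payload produced by pack_cam back into engineering units."""
    sid, ts, lat, lon, spd, hdg = struct.unpack(CAM_FMT, payload)
    return {"station_id": sid, "timestamp_ms": ts,
            "lat_deg": lat / 1e7, "lon_deg": lon / 1e7,
            "speed_mps": spd / 100, "heading_deg": hdg / 10}
```

In a cooperative-driving study, each vehicle would broadcast such a payload several times per second (e.g. over UDP on an ITS-G5 or C-V2X link), and receivers would feed the decoded states into their own situational-awareness picture.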