2023 FFG Success Story: THE EYES OF TOMORROW: FORWARD-LOOKING SAFETY IN ROAD TRAFFIC

CURRENT DRIVER ASSISTANCE SYSTEMS MAINLY ADDRESS PREDICTABLE SCENARIOS. BUT WHAT IF A VULNERABLE ROAD USER (VRU) BEHAVES UNPREDICTABLY?

A child suddenly darts onto the street, a cyclist seemingly materializes out of thin air, a pedestrian abruptly changes direction at the crosswalk – the unexpected scenarios in real-world traffic are numerous. This underscores the critical need to equip self-driving vehicles with technology capable of recognizing and evaluating such situations in real time, ideally faster than a human ever could.

Advancing 360° Perception in Vehicles

Vehicles equipped with LiDAR sensors can already perceive their surroundings in full (360° perception).

Further advancements are necessary to improve safety-relevant predictions, particularly in urban environments.

Achieving significant enhancements in field of view and resolution will require innovations across all components of LiDAR sensors: a hybrid laser source for shorter, more intense pulses at higher repetition rates, a redesigned mirror and packaging with increased surface area and deflection angles, a receiver featuring a larger detector array, and improved pulse detection and time measurement for greater efficiency and accuracy.
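To make the role of pulse detection and time measurement concrete: a pulsed LiDAR converts the round-trip time of each laser pulse into range, so timing precision translates directly into range precision. The following Python sketch is purely illustrative (it is not project code) and shows only this basic time-of-flight relation.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_time_s: float) -> float:
    """One-way range from a pulsed-LiDAR round-trip time measurement."""
    return C * round_trip_time_s / 2.0  # the pulse travels out and back

def range_resolution(timer_resolution_s: float) -> float:
    """Smallest range step distinguishable with a given timer resolution."""
    return C * timer_resolution_s / 2.0

# A 100 ns round trip corresponds to ~15 m of range;
# a 100 ps timer step corresponds to ~1.5 cm range steps.
print(f"{tof_range(100e-9):.2f} m")                 # 14.99 m
print(f"{range_resolution(100e-12) * 100:.2f} cm")  # 1.50 cm
```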

3D Object Detection and AI-Supported LiDAR Data

The resulting point clouds feed into 3D object detection and classification. In a secure, second data-evaluation stage, the objects are segmented, and deep learning algorithms identify and categorize them as vehicles, pedestrians, cyclists, stationary objects, and more. The LiDAR data are then merged with radar and camera data through sensor fusion.
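As a rough illustration of such a pipeline (the project's actual algorithms are not detailed in this story), the sketch below segments a synthetic point cloud with DBSCAN, a common density-based clustering method, and uses a deliberately crude bounding-box heuristic as a placeholder where the deep-learning classifier would sit.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def segment_point_cloud(points: np.ndarray) -> list[np.ndarray]:
    """Group an (N, 3) LiDAR point cloud into object candidates.

    eps/min_samples are illustrative values, not tuned project parameters.
    """
    labels = DBSCAN(eps=0.7, min_samples=10).fit(points).labels_
    return [points[labels == k] for k in set(labels) if k != -1]  # -1 = noise

def classify(cluster: np.ndarray) -> str:
    """Crude placeholder for the deep-learning classifier: a real system
    would feed the cluster (or a voxelized form) to a trained network."""
    extent = cluster.max(axis=0) - cluster.min(axis=0)
    return "vehicle" if extent[:2].max() > 2.5 else "pedestrian"

# Synthetic frame with a car-sized and a pedestrian-sized blob:
rng = np.random.default_rng(0)
car = rng.normal([12.0, 3.0, 0.8], 0.5, size=(400, 3))
person = rng.normal([8.0, -2.0, 0.9], 0.2, size=(80, 3))
frame = np.vstack([car, person])
detections = [(classify(c), len(c)) for c in segment_point_cloud(frame)]
print(detections)  # e.g. [('vehicle', 400), ('pedestrian', 80)]
```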

The hardware for data evaluation and sensor fusion includes a computer with standardized sensor interfaces for LiDAR, radar, and ultrasonic sensors, along with cameras and network connectivity. This setup enables comprehensive data collection from all sensors during test drives. The collected data are annotated and utilized for training, testing, and evaluating object classifiers and algorithms. Plans are in place to publicly release resulting datasets to support further research, with access details outlined in a Data Management Plan.
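In its simplest form, the evaluation step on such annotated data could look like the sketch below; the labels are invented for illustration, and the project's actual evaluation is presumably much richer.

```python
from sklearn.metrics import classification_report

# Invented labels: human annotation (ground truth) vs. classifier output.
y_true = ["vehicle", "pedestrian", "cyclist", "vehicle", "stationary", "cyclist"]
y_pred = ["vehicle", "pedestrian", "vehicle", "vehicle", "stationary", "cyclist"]

# Per-class precision, recall, and F1 for the object classifier under test.
print(classification_report(y_true, y_pred, zero_division=0))
```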

Moreover, the sensor fusion hardware is intended as a development platform for future research projects. Via the Open Simulation Interface (OSI), fused objects, free-space information, and more are passed to scene-understanding algorithms, which track objects and predict their behavior, contributing to predictive hazard assessment.
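As a simple illustration of tracking-based behavior prediction (the project's scene-understanding algorithms are not described in this story), the sketch below extrapolates a fused object under a constant-velocity assumption and flags it if its predicted position enters the ego lane; the FusedObject type is a made-up stand-in, loosely modeled on an OSI detected moving object.

```python
from dataclasses import dataclass

@dataclass
class FusedObject:
    """Made-up stand-in for a fused track (loosely modeled on OSI moving objects)."""
    obj_id: int
    x: float   # longitudinal position ahead of ego, m
    y: float   # lateral offset from ego lane center, m
    vx: float  # velocity components, m/s
    vy: float

def predict(obj: FusedObject, dt: float) -> tuple[float, float]:
    """Constant-velocity extrapolation of the object's position dt seconds ahead."""
    return obj.x + obj.vx * dt, obj.y + obj.vy * dt

def crosses_ego_path(obj: FusedObject, horizon_s: float = 2.0,
                     lane_half_width: float = 1.5) -> bool:
    """Flag objects whose predicted position lies in the ego lane within the horizon."""
    x_pred, y_pred = predict(obj, horizon_s)
    return x_pred > 0.0 and abs(y_pred) < lane_half_width

# A pedestrian 10 m ahead and 4 m to the side, stepping toward the lane at 2 m/s:
ped = FusedObject(obj_id=7, x=10.0, y=4.0, vx=0.0, vy=-2.0)
print(crosses_ego_path(ped))  # True: predicted at (10.0, 0.0) after 2 s
```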

Use Cases and Results

In support of simulation and validation efforts for driver assistance and autonomous systems in urban environments, new test and reference systems were developed based on high-resolution LiDAR sensors. Selected use cases, spanning road and rail vehicles in urban settings as well as agricultural applications, were then implemented to demonstrate the relevance and effectiveness of this approach in practice.

Conclusion

Vulnerable Road Users (VRUs) are road participants who require special protection, and they often behave unpredictably. Protecting them requires intelligent sensor technology that perceives its surroundings in real time, identifies potential hazards, and thereby enables automated driving in urban traffic. The iLIDS4SAM project contributes significantly to this goal.

Project coordination (Story)

Dr. Thomas Gölles

Senior Researcher

Autonomous Systems


Project coordination

Infineon Technologies Austria AG

Project Partners

Silicon Austria Labs GmbH

Virtual Vehicle Research GmbH

AVL List GmbH

Technische Universität Graz

ams-OSRAM AG

EV Group E.Thallner GmbH

FH Campus Wien Forschungs- und Entwicklungs GmbH

RIEGL Research & Defense GmbH

IDeAS GmbH & Co KG

TTTech Auto AG

The project was funded by the program “IKT der Zukunft” (ICT of the Future) of the Austrian Federal Ministry for Climate Action (BMK).
The program is managed by the Austrian Research Promotion Agency (FFG).

https://projekte.ffg.at/projekt/3759710