Tech Focus: Raising the Bar on Autonomous Vehicle Safety

The fully autonomous vehicles of the not-so-distant future promise tremendous gains in automotive safety and transportation efficiency.

In this guest commentary, Thomas Goetzl from Keysight Technologies shares his insights on how automotive OEMs must move beyond contemporary levels of vehicle autonomy to fulfill this promise.

Keysight’s Radar Scene Emulator (RSE) closes the gap between software simulation and roadway testing, helping train ADAS and autonomous driving algorithms for real-world conditions.

SAE International (formerly the Society of Automotive Engineers) defines six levels of vehicle autonomy, with Level 0 representing fully manual and Level 5 representing fully autonomous.

Today’s most advanced autonomous vehicle systems rate only Level 3, which means they are capable of making some decisions such as acceleration or braking without human intervention.
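
For reference, the six levels can be written down as a simple enumeration. The sketch below is illustrative Python using the standard SAE J3016 level names, not SAE or Keysight tooling.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, Level 0 (fully manual) to Level 5 (fully autonomous)."""
    NO_AUTOMATION = 0           # fully manual
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3  # roughly where today's most advanced systems sit
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5         # fully autonomous

# The gap this commentary is concerned with: getting from Level 3 to Level 5.
print(SAELevel.FULL_AUTOMATION - SAELevel.CONDITIONAL_AUTOMATION)  # -> 2 levels to go
```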

“In order to make the leap to the tremendous gains in automotive safety and transportation efficiency that fully autonomous vehicles promise, OEMs will need to overcome a unique set of challenges for testing automotive radar sensors in advanced driver assistance systems (ADAS) and autonomous driving systems, as well as developing new methodologies for training algorithms that conventional solutions are ill-equipped to address,” says Thomas Goetzl, vice president of automotive and energy solutions at Keysight Technologies.

Getting from Level 3 to Level 5 will require many breakthroughs, including closing the gap between software simulation and roadway testing, and training ADAS and autonomous driving algorithms to real-world conditions.

Keysight’s latest innovation, the Radar Scene Emulator (RSE), goes a long way toward bridging these gaps.

Software simulation plays an important role in autonomous vehicle development.

Simulating environments through software can help validate the capabilities of ADAS and autonomous driving systems.

But simulation cannot fully replicate real-world driving conditions or the potential for imperfect sensor response — something that fully autonomous vehicles will inevitably have to contend with.

OEMs rely on road testing to validate ADAS and autonomous driving systems prior to bringing them to market.

While road testing is and will continue to be a vital and necessary component of the development process, it is time-consuming, costly, and difficult to repeat, particularly when it comes to controlling environmental conditions.

Relying on road testing alone to develop vehicles reliable enough to navigate urban and rural roadways safely 100% of the time would take decades.

For development to occur within a realistic timeframe, new methodologies for training algorithms are needed.

Validating radar-based autonomous driving algorithms is a crucial task.

The sensors capture information about road and traffic conditions and feed it to processors and algorithms that determine how the vehicle should respond in any given situation.
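
As a rough illustration of that sensors-to-decision flow, here is a deliberately simplified sketch; the detection type, the single time-to-collision rule, and the 2-second threshold are assumptions for illustration, not any OEM's or Keysight's actual logic.

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    """One object reported by a radar sensor (hypothetical, simplified)."""
    range_m: float            # distance to the object in meters
    closing_speed_mps: float  # positive when the object is getting closer

def decide_response(detections: list[RadarDetection],
                    ttc_threshold_s: float = 2.0) -> str:
    """Toy decision logic: brake if any object would be reached too soon.

    Real ADAS/autonomous-driving stacks fuse many sensors and use far richer
    models; this only sketches the sensors -> processing -> decision flow
    described above.
    """
    for d in detections:
        if d.closing_speed_mps > 0:
            time_to_collision_s = d.range_m / d.closing_speed_mps
            if time_to_collision_s < ttc_threshold_s:
                return "brake"
    return "maintain speed"

# A stopped vehicle 30 m ahead while closing at 20 m/s gives a 1.5 s TTC.
print(decide_response([RadarDetection(range_m=30.0, closing_speed_mps=20.0)]))  # -> brake
```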

Without proper training, autonomous vehicles could make decisions that undermine driver, passenger, or pedestrian safety.

About the Author: Thomas Goetzl is vice president and general manager for Automotive and Energy Solutions (AES) of the Electronic Industrial Solutions Group at Keysight Technologies as well as managing director of Keysight Technologies in Germany.

Just as people become better drivers with time and experience, autonomous driving systems improve their ability to deal with real-world driving conditions with time and training.

And achieving Level 5 autonomy will require complex systems that exceed the abilities of the best human drivers.

Premature road testing of unproven ADAS and autonomous driving systems also creates risks. OEMs need the ability to emulate real-world scenarios that enable validation of actual sensors, electronic control unit code, artificial intelligence, and more.

Current lab-based simulation solutions do not provide a true approximation of real-world driving scenarios.

They have a limited field of view and cannot resolve objects at distances of less than 4 meters.

Some of these systems use multiple radar target simulators, each presenting point targets to radar sensors and emulating horizontal and vertical position by mechanically moving antennas around.

This mechanical automation increases overall test time.

Other solutions create a wall of antennas driven by only a few target simulators, enabling an object to appear anywhere in the scene, but not multiple objects concurrently.

In a static or quasi-static environment, this approach enables testing with a handful of targets moving laterally at speeds that are limited by the speed of robotic arms.
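
A back-of-the-envelope calculation shows why mechanically positioned antennas cap lateral target speed; the 90-degrees-per-second arm rate assumed below is purely illustrative, not a published specification.

```python
import math

def required_angular_rate_deg_s(lateral_speed_mps: float, range_m: float) -> float:
    """Angular rate (deg/s) the antenna must sweep so the radar sees a target
    crossing laterally at `lateral_speed_mps` while `range_m` away."""
    return math.degrees(lateral_speed_mps / range_m)

ARM_MAX_RATE_DEG_S = 90.0  # hypothetical robotic-arm limit, for illustration only

# A pedestrian crossing at 1.5 m/s, emulated just 5 m away: manageable.
print(f"{required_angular_rate_deg_s(1.5, 5.0):.0f} deg/s needed")   # ~17 deg/s

# A vehicle cutting across at 15 m/s at the same emulated range: not so much.
needed = required_angular_rate_deg_s(15.0, 5.0)
print(f"{needed:.0f} deg/s needed -> "
      f"{'OK' if needed <= ARM_MAX_RATE_DEG_S else 'beyond the arm'}")  # ~172 deg/s
```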

Current simulators can emulate a maximum of just 32 objects at a time, including vehicles, infrastructure, pedestrians, and other obstacles.

This is far fewer objects than a vehicle traveling on the road may encounter at any given time.

Testing radar sensors against a limited number of objects delivers an incomplete view of driving scenarios and masks the complexity of the real world.

To advance autonomous driving technology to Level 4 and Level 5 autonomy, automotive OEMs need solutions capable of rendering more objects faster and at closer distances.

To help bridge these gaps, Keysight developed a proprietary scalable emulation screen that combines hundreds of miniature radar target simulators and can emulate up to 512 objects at distances as close as 1.5 meters.

The result is a deterministic real-world environment for lab testing complex scenes that previously could only be tested on the road.
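
Using only the figures quoted in this article (32 objects and a 4-meter minimum distance for conventional setups, 512 objects and 1.5 meters for the RSE), here is a hedged sketch of the kind of pre-test check an engineer might run on a planned scene; the class and the example scene are illustrative, not Keysight's API.

```python
from dataclasses import dataclass

@dataclass
class EmulatorLimits:
    max_objects: int
    min_range_m: float

# Figures quoted in this article.
CONVENTIONAL_SIMULATOR = EmulatorLimits(max_objects=32, min_range_m=4.0)
RADAR_SCENE_EMULATOR = EmulatorLimits(max_objects=512, min_range_m=1.5)

def scene_fits(object_ranges_m: list[float], limits: EmulatorLimits) -> bool:
    """True if every object in the planned scene can be rendered at once."""
    return (len(object_ranges_m) <= limits.max_objects
            and all(r >= limits.min_range_m for r in object_ranges_m))

# A dense urban scene: 200 objects, the nearest only 2 m from the radar.
scene = [2.0 + 0.5 * i for i in range(200)]
print(scene_fits(scene, CONVENTIONAL_SIMULATOR))  # False: too many objects
print(scene_fits(scene, RADAR_SCENE_EMULATOR))    # True
```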

2 Responses to “Tech Focus: Raising the Bar on Autonomous Vehicle Safety”

  1. Sam Spiegtzel says:

    Interesting. When will we reach level 5 vehicle autonomy?

    • tech4tea says:

      That’s a good question. There are many guestimates out there on when we’d hit Level 5. Personally, I’d hazard to speculate that optimistically, we’re probably looking at around 2040; but realistically, we’re more likely only going to see Level 5 driverless cars around 2050. Let me ask Thomas Goetzl, the author of this commentary, for his opinion. 🙂
