Thomas Goetzl is vice president of automotive and energy solutions at Keysight.
The fully autonomous vehicles of the not-so-distant future promise tremendous gains in automotive safety and transportation efficiency. But to fulfill this promise, automotive OEMs must move beyond contemporary levels of vehicle autonomy. Making that leap will require overcoming a unique set of challenges in testing the automotive radar sensors used in advanced driver assistance systems (ADAS) and autonomous driving systems, and developing new methodologies for training algorithms, challenges that conventional solutions are ill-equipped to address.
SAE International (formerly the Society of Automotive Engineers) defines six levels of vehicle autonomy, with Level 0 representing fully manual and Level 5 representing fully autonomous.
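As a rough illustration, the SAE taxonomy can be sketched as a simple lookup. Only Levels 0 and 5 are described in this article; the short labels for Levels 1 through 4 below follow the commonly cited SAE J3016 terminology and are filled in here as an assumption.

```python
# Sketch of the six SAE J3016 driving-automation levels.
# Levels 0 and 5 match the article's description; the labels for
# Levels 1-4 follow commonly cited J3016 terminology (assumption).
SAE_LEVELS = {
    0: "No automation (fully manual)",
    1: "Driver assistance",
    2: "Partial automation",
    3: "Conditional automation (system makes some decisions, e.g. braking)",
    4: "High automation",
    5: "Full automation (fully autonomous)",
}

def is_fully_autonomous(level: int) -> bool:
    """Only Level 5 requires no human driver or fallback."""
    return level == 5
```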
Today’s most advanced autonomous vehicle systems rate only Level 3, which means they can make some decisions, such as accelerating or braking, without human intervention. Getting from Level 3 to Level 5 will require many breakthroughs, including closing the gap between software simulation and roadway testing, and training ADAS and autonomous driving algorithms to real-world conditions.
Keysight’s latest innovation, the Radar Scene Emulator (RSE), goes a long way toward bridging these gaps.
Software simulation plays an important role in autonomous vehicle development. Simulating environments through software can help validate the capabilities of ADAS and autonomous driving systems. But simulation cannot fully replicate real-world driving conditions or the potential for imperfect sensor response — something that fully autonomous vehicles will inevitably have to contend with.
OEMs rely on road testing to validate ADAS and autonomous driving systems before bringing them to market. While road testing is and will remain a vital part of the development process, it is time-consuming, costly, and difficult to repeat, particularly when it comes to controlling environmental conditions. Relying on road testing alone to develop vehicles reliable enough to navigate urban and rural roadways safely 100% of the time would take decades. For development to occur in a realistic timeframe, OEMs also need ways to train algorithms outside the vehicle.
Validating radar-based autonomous driving algorithms is a crucial task. Radar sensors capture information about road and traffic conditions and feed it to processors and algorithms that decide how the vehicle should respond to any given situation. Without proper training, autonomous vehicles could make decisions that undermine driver, passenger, or pedestrian safety.
Just as people become better drivers with time and experience, autonomous driving systems improve their ability to deal with real-world driving conditions with time and training. And achieving Level 5 autonomy will require complex systems that exceed the abilities of the best human drivers.
Premature road testing of unproven ADAS and autonomous driving systems also creates risks. OEMs need the ability to emulate real-world scenarios that enable validation of actual sensors, electronic control unit code, artificial intelligence, and more.
Current lab-based simulation solutions do not provide a true approximation of real-world driving scenarios. They have a limited field of view and cannot resolve objects at distances of less than 4 meters. Some of these systems use multiple radar target simulators, each presenting a point target to the radar sensor and emulating horizontal and vertical position by mechanically moving antennas. This mechanical automation slows overall test time. Other solutions create a wall of antennas driven by only a few target simulators, enabling an object to appear anywhere in the scene, but not multiple objects concurrently. In a static or quasi-static environment, this approach enables testing with a handful of targets moving laterally at speeds limited by the speed of robotic arms.
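To give a sense of what a point-target simulator does, the sketch below computes the round-trip delay and Doppler shift it must synthesize to make one target appear at a given range and radial velocity. The 77 GHz carrier and the numeric values are illustrative assumptions (77 GHz is a common automotive radar band), not figures from this article.

```python
# Minimal sketch of the signal parameters a radar target simulator
# synthesizes for one emulated point target. Carrier frequency and
# target values are illustrative assumptions.
C = 3.0e8  # speed of light, m/s

def point_target(range_m: float, velocity_mps: float, carrier_hz: float = 77e9):
    """Return (round-trip delay in s, Doppler shift in Hz) for a point target."""
    delay = 2.0 * range_m / C                   # signal travels out and back
    wavelength = C / carrier_hz
    doppler = 2.0 * velocity_mps / wavelength   # closing target -> positive shift
    return delay, doppler

# A target 1.5 m away, closing at 30 m/s (~108 km/h):
delay, doppler = point_target(1.5, 30.0)
print(f"delay = {delay*1e9:.1f} ns, Doppler = {doppler/1e3:.1f} kHz")
# → delay = 10.0 ns, Doppler = 15.4 kHz
```

Note how short the timescales get at close range: a target at 1.5 m corresponds to only 10 ns of round-trip delay, which suggests why emulating objects closer than a few meters is hard for conventional simulators.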
Current simulators can emulate a maximum of just 32 objects, including vehicles, infrastructure, pedestrians, and obstacles. That is far fewer than a vehicle traveling on the road may encounter at any given time. Testing radar sensors against so few objects delivers an incomplete view of driving scenarios and masks the complexity of the real world.
To advance autonomous driving technology to Level 4 and Level 5 autonomy, automotive OEMs need solutions capable of rendering more objects faster and at closer distances. To help bridge these gaps, Keysight developed a proprietary scalable emulation screen that combines hundreds of miniature radar target simulators and can emulate up to 512 objects at distances as close as 1.5 meters. The result is a deterministic real-world environment for lab testing complex scenes that previously could only be tested on the road.
We at Keysight are extremely proud of these technology breakthroughs and the resulting Radar Scene Emulator, a key part of our Autonomous Drive Emulation (ADE) platform. We believe this technology goes a long way toward bridging two important gaps on the road to Level 5 vehicle autonomy, bringing safer and more efficient transportation with fewer fatal accidents and less time wasted in traffic.