Highlights

Test Radar-based Autonomous Driving Features with the Radar Scene Emulator

Achieving the next level in vehicle autonomy demands robust algorithms trained to interpret radar reflections detected by automotive radar sensors. Keysight’s first-to-market technology combines hundreds of miniature radar target simulators into a scalable screen that can emulate objects with up to 512-pixel resolution and at distances as close as 1.5 meters. This breakthrough radar scene emulation solution overcomes the limitations of conventional radar sensor test solutions, which have a limited field of view (FOV) and cannot simulate objects at distances of less than 4 meters.

Utilizing “total scene generation,” the radar scene emulation solution exercises your automated drive systems and algorithms by applying time-synchronized inputs to the actual sensors. Its open architecture also closes the loop with your existing hardware-in-the-loop (HIL) systems and 3D modelers. These capabilities create a solution that complements, and fills the gap between, software simulation and on-road testing. It thereby overcomes a key limitation of software simulation, which does not exercise the real radar sensor response, while providing the repeatable radar-scene testing that cannot be achieved on a test track.
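For readers who want a concrete picture of that closed loop, the sketch below shows one way a test script might step a scenario, convert the objects visible to the ego vehicle into emulated radar targets, and feed the sensor’s detections back to the HIL system. It is a minimal illustration only: the scenario player, emulator, radar ECU, and HIL bridge objects are hypothetical placeholders, not Keysight’s actual API.

```python
# Hypothetical closed-loop sketch: scenario player -> radar scene emulator -> HIL feedback.
# None of these classes represent a real Keysight API; they only illustrate the data flow.
from dataclasses import dataclass

@dataclass
class Target:
    range_m: float       # radial distance from the ego vehicle's radar
    azimuth_deg: float    # angle of arrival within the emulator's horizontal FOV
    velocity_mps: float   # relative radial velocity (Doppler)
    rcs_dbsm: float       # radar cross-section of the emulated object

def run_scenario(player, emulator, radar_ecu, hil, dt=0.05):
    """Advance the 3D scenario and the emulator in lock-step with the HIL loop."""
    while not player.finished():
        ego_state = hil.read_ego_state()          # current speed/steering from the HIL model
        objects = player.step(ego_state, dt)      # objects visible to the ego vehicle this tick
        targets = [Target(o.range_m, o.azimuth_deg, o.rel_velocity_mps, o.rcs_dbsm)
                   for o in objects]
        emulator.apply_targets(targets)           # time-synchronized stimulus to the real sensor
        detections = radar_ecu.read_detections()  # what the sensor/ECU actually reported
        hil.publish_detections(detections)        # close the loop: feed perception back to HIL
```

In practice the scenario player, emulator, and HIL rig would advance on a shared time base so the radar stimulus stays synchronized with the rest of the emulated scene.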

The Radar Scene Emulator allows you to emulate real-world driving scenarios, varying speed, distance, and number of targets, across a contiguous FOV. With radar sensors and back-end software confidently tested against the complexity of real-world driving scenarios, you’ll achieve your vision of ADAS and next-generation vehicle autonomy sooner, with less risk.

In-lab full radar scene emulation

  • Thoroughly exercise radar sensors and systems with up to 512-pixel resolution across a contiguous horizontal FOV of ±70 degrees and a vertical FOV of ±15 degrees
  • Support short-, medium-, and long-range mmWave radars by generating static and dynamic targets at ranges of 1.5 to 300 meters and velocities of 0 to 400 km/h (see the worked example after this list)
  • Address multi-target, multi-angle scenarios with mechanically fixed RF front ends that provide repeatable angle-of-arrival (AoA) accuracy
  • Emulate complex, RF-dense urban scenes with realistic interference testing
  • Improve object detection and differentiation by using 3D point clouds and multiple reflections
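
To put these figures in RF terms, the short calculation below converts the extremes of the range and speed specifications into the round-trip delay and Doppler shift that the emulator must reproduce. The 77 GHz carrier is an assumed, typical automotive radar frequency; the formulas are the standard delay = 2R/c and Doppler ≈ 2v/λ relations.

```python
# Back-of-the-envelope check of what the emulator must reproduce for a 77 GHz radar.
# The carrier frequency is an assumption; the formulas are standard radar physics.
C = 3.0e8                     # speed of light, m/s
F_CARRIER = 77e9              # assumed automotive radar carrier, Hz
WAVELENGTH = C / F_CARRIER    # ~3.9 mm

def round_trip_delay_s(range_m: float) -> float:
    """Two-way propagation delay for a target at the given range."""
    return 2.0 * range_m / C

def doppler_shift_hz(speed_kmh: float) -> float:
    """Doppler shift for a target closing at the given radial speed."""
    v = speed_kmh / 3.6
    return 2.0 * v / WAVELENGTH

print(f"1.5 m target    -> delay {round_trip_delay_s(1.5) * 1e9:.0f} ns")      # ~10 ns
print(f"300 m target    -> delay {round_trip_delay_s(300) * 1e6:.1f} us")      # ~2.0 us
print(f"400 km/h target -> Doppler {doppler_shift_hz(400) / 1e3:.0f} kHz")     # ~57 kHz
```

At the minimum distance of 1.5 meters the round-trip delay is only about 10 ns, which illustrates why emulating very close objects is difficult for conventional target simulators.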

Keysight Radar Scene Emulator Honored with Industry Awards

  • AutoSens Awards 2022 – Silver Winner: Best Validation/Simulation Tool
  • The Electronics Industry Awards 2022: Automotive Product of the Year – Highly Commended

Key Specifications

  • Horizontal FOV: ±70 degrees
  • Vertical FOV: ±15 degrees
  • Maximum Target Distance: 300 meters
  • Minimum Target Distance: 1.5 meters
  • Speed Range: ±400 km/h

Rev Up Your Knowledge On Radar Scene Emulation

White Paper, 2021.04.08

Accelerate the Development of Advanced Driver-Assistance Systems


Keysight’s Autonomous Drive Emulation (ADE) platform is an environment emulator for in-lab testing against realistic roadway scenarios, from the mundane to the one-in-a-million. Using total scene generation, the platform exercises ADAS software by applying time-synchronized inputs to the actual sensors, and its open architecture closes the loop with your existing HIL systems and 3D modelers, enabling you to keep pushing ADAS toward Level 5.

ADE generates total roadway scenes with multi-faceted environment emulation, filling the gap between software simulation and on-road testing of ADAS capabilities through holistic validation of the actual sensors, ECU software, AI logic, and more. This accelerates the development of new ADAS software features and provides deeper insight into ADAS software behavior earlier in the development cycle.

Develop greater confidence in the validation process by thoroughly validating line-of-sight sensors such as radar and cameras, together with synchronous testing of communication-based systems (e.g., C-V2X). Cover dangerous situations, risky corner cases, and more in the lab by emulating complex situations with time-synchronized stimulation of the real sensors that will be used in vehicles. Find potential issues earlier in the development process, reduce the likelihood of post-release failures, and ultimately test more scenarios sooner with greater confidence in required ADAS functionality.


Want help or have questions?