Vayyar's radar-on-a-chip seeks to provide higher levels of automation for driver assistance.
Efforts to improve the “eyes” of driver assistance systems are extending beyond cameras and lidar to new sensors capable of handling complex driving scenarios, or what the auto industry calls Level 4, “high” automation.
Among the developers is Vayyar Imaging, the Israeli sensor specialist. Its XRR platform, aimed at advanced driver assistance systems (ADAS), is a single 4D imaging radar chip with a range of up to 300 meters. The chip also provides a 180-degree field of view and operates without the need for an external processor.
The 4D feature refers to the chip’s ability to measure distance and relative velocity along with the azimuth of objects and their height relative to the road level.
In an interview, Ian Podkamien, vice president and head of Vayyar’s automotive sector, said a 48-antenna MIMO array supports the new platform, which is also AEC-Q100-qualified and ASIL-B compliant. “The multi-functionality of the RFIC eliminates the need for external devices such as lidar sensors, reducing not only cabling costs but also power consumption and integration efforts,” Podkamien claimed.
The multi-range XRR chip operates in the 76-81 GHz radar bands and can differentiate static obstacles such as dividers, curbs and parked vehicles from moving vehicles and other hazards.
In low-speed environments such as parking lots, the chip scans the surroundings for pedestrians and obstacles using ultra-short and short-range radar imaging detection. At longer ranges, the radar chip enables ADAS applications such as adaptive cruise control, blind-spot detection, collision warning, cross traffic alerts and autonomous emergency braking.
Radar in its most rudimentary form transmits a signal that bounces off an object, revealing its presence and range. The system transmits at a known frequency and analyzes the frequency of the returned signal. For ADAS, the difference between the two, including any Doppler shift, determines the position, distance and speed of obstacles.
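That frequency arithmetic can be sketched in a few lines. The chirp parameters and carrier below are illustrative assumptions for a generic FMCW automotive radar, not Vayyar's actual configuration:

```python
# Illustrative FMCW radar math (generic sketch, not Vayyar's implementation).
C = 3.0e8            # speed of light, m/s
F_CARRIER = 77e9     # carrier within the 76-81 GHz automotive band
BANDWIDTH = 300e6    # assumed chirp bandwidth, Hz
CHIRP_TIME = 40e-6   # assumed chirp duration, s

def target_range(beat_freq_hz: float) -> float:
    """Range from the beat frequency between the transmitted and received chirps."""
    return C * beat_freq_hz * CHIRP_TIME / (2 * BANDWIDTH)

def radial_velocity(doppler_shift_hz: float) -> float:
    """Relative (radial) speed from the Doppler shift of the return."""
    wavelength = C / F_CARRIER
    return doppler_shift_hz * wavelength / 2

print(target_range(1e6))      # a 1 MHz beat frequency -> ~20 m at these settings
print(radial_velocity(10e3))  # a 10 kHz Doppler shift -> ~19.5 m/s (~70 km/h)
```

In practice the angle of each return (azimuth and elevation) comes from comparing phases across the antenna array, which is where the MIMO design described above comes in.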
With integrated electronics, radar can also scan the surrounding environment. It has therefore become a critical sensor for applications like collision avoidance: it works in the dark, functions in adverse weather and is relatively inexpensive.
Podkamien said 4D imaging radar promises much more, providing nearly 500 virtual channels (as opposed to one channel in traditional radar). Vayyar’s single-chip 4D imaging radar is expected to be incorporated into vehicles in 2023, playing a growing role in ADAS applications ranging from cabin monitoring systems and child presence detection to seat belt reminders and intruder alerts.
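The jump from one channel to hundreds follows from the MIMO principle that each transmit-receive antenna pair forms a distinct virtual channel. A minimal sketch of the arithmetic, with an assumed TX/RX split (Vayyar has not published the exact partition of its 48-antenna array):

```python
# MIMO virtual-array arithmetic; the antenna splits below are assumptions.
def virtual_channels(n_tx: int, n_rx: int) -> int:
    """A MIMO radar with n_tx transmitters and n_rx receivers forms
    n_tx * n_rx unique TX-RX pairs, i.e. virtual channels."""
    return n_tx * n_rx

# A conventional automotive radar with 3 TX and 4 RX antennas:
print(virtual_channels(3, 4))    # 12
# A 48-element array split evenly into 24 TX and 24 RX would yield:
print(virtual_channels(24, 24))  # 576
```

A larger virtual array means finer angular resolution in both azimuth and elevation, which is what makes the "4D" point cloud possible.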
Unlike cameras and lidar, 4D imaging radar works in all conditions, including fog, heavy rain, snowstorms and at night, when optical imaging is difficult. Its longer range meets requirements for higher levels of vehicle automation, and it captures Doppler shifts, revealing whether an object is moving toward the vehicle or away from it. Like lidar, it scans its surroundings using the principle of time-of-flight measurement, but with radio waves rather than light.
The fourth dimension is elevation: in addition to range, azimuth and velocity, the sensor measures the height of returns relative to road level. This helps detect and identify stationary objects along the roadway, distinguishing, say, an overhead sign from an obstacle on the road surface.
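One common way elevation is recovered (a generic sketch; the antenna geometry below is an assumption, not Vayyar's design) is from the phase difference between two vertically spaced receive antennas:

```python
# Elevation from inter-antenna phase difference (generic sketch, assumed geometry).
import math

C = 3.0e8
F_CARRIER = 77e9                  # carrier within the 76-81 GHz automotive band
WAVELENGTH = C / F_CARRIER
ANTENNA_SPACING = WAVELENGTH / 2  # assumed half-wavelength vertical spacing

def elevation_angle(phase_diff_rad: float) -> float:
    """Elevation angle (radians) of a return, derived from the phase
    difference between two vertically spaced receive antennas."""
    return math.asin(phase_diff_rad * WAVELENGTH /
                     (2 * math.pi * ANTENNA_SPACING))

# A quarter-cycle phase difference at half-wavelength spacing:
print(math.degrees(elevation_angle(math.pi / 2)))  # ~30 degrees
```

The same phase comparison across horizontally spaced antennas yields azimuth, which is why a large MIMO array improves resolution in both dimensions.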
Scanning the roadside environment around the vehicle with greater precision and definition gives onboard electronics more data to interpret, which in turn demands higher processing speeds for ADAS applications. The result, Vayyar asserts, is greater reliability.
The company’s radar-on-chip also incorporates an internal DSP and MCU for real-time signal processing without the need for an external CPU.
New European New Car Assessment Programme (Euro NCAP) specifications call for improved detection of pedestrians. Vayyar said its platform supports nine Euro NCAP ADAS requirements for protecting pedestrians, cyclists and motorcyclists.
“No longer will it be necessary to fit a sensor for every single function and expensive onboard lidar and cameras,” Podkamien claimed. “Just one chip will suffice, meeting all Euro NCAP 2023 and 2025 requirements.”
The move to greater autonomy, which would remove the driver from direct control of the vehicle, will require sensing systems that provide a real-time, 360-degree view around a car. Developing these capabilities will require continued innovation in semiconductor technology, RF system operation and signal processing, areas Vayyar further claims it is pioneering with its 4D imaging radar.
This article was originally published on EE Times.
Maurizio Di Paolo Emilio holds a Ph.D. in Physics and is a telecommunication engineer and journalist. He has worked on various international projects in the field of gravitational wave research. He collaborates with research institutions to design data acquisition and control systems for space applications. He is the author of several books published by Springer, as well as numerous scientific and technical publications on electronics design.