ADAS Sensors Create a Cleaning Nightmare

Article By : Brian Santo

Vehicle sensors are going to get occluded by heavy rain, spattered by mud, and coated with splattered bugs. Now what?

The unintended consequence of equipping cars with multiple types of sensors for driver assistance or for autonomous driving is that automakers are almost certainly going to have to add yet more subsystems to keep those sensors clear of obstructions.

The automotive industry is aware of the situation, but few automakers have decided how to address the problem. While a blocked sensor could potentially cripple an autonomous vehicle (AV), human motorists are apt to experience the same problem merely as the inconvenient disruption of an advanced driver-assistance system (ADAS) feature. Modern vehicles all have some version of an error message warning that a driver-assistance feature isn’t available at the moment.

Furthermore, in the case of ADAS, it is possible that simply having multiple sensor types available will address the problem in the short term. We recently discussed that with Willard Tu, the senior director of the Automotive Business Unit at Xilinx. We’ll get to our discussion with Tu in a moment.

Meanwhile, although there are likely to be few autonomous passenger vehicles in the near future, there will be some (plus a growing number of delivery and freight vehicles, drones, and other autonomous systems), so the problem needs to be solved sooner rather than later.

Whatever the solution, it is going to add to overall vehicle cost: possibly in the form of more equipment, possibly in redesigning vehicles to make room for additional subsystems, perhaps in processing power, but certainly in dollars.

At one extreme is the opinion that sensors must remain spotless, a view prominent among the auto parts suppliers that have developed sensor-cleaning devices. The other extreme is occupied by some lidar and radar engineers who dismiss the problem of partially covered sensors as something that artificial intelligence will eventually handle with ease, given sufficient training.

…Given sufficient training. 

Keep in mind that most companies testing ADAS and AV technologies are logging most of their actual road test miles in urban/suburban areas in places like Arizona and California. They know they’re living in a fantasy world where almost every day is clear, and it’s rare to encounter anything that might obscure their sensors.

The obvious cleaning mechanism is the one that automakers have employed for decades to keep windshields clear: dispense cleaning solutions through spray nozzles, and then use wipers to sweep away the cleaning liquid and dirt. Waymo, for example, uses this approach for its sensor domes.

But adding wiper blades becomes increasingly impractical as the number of sensors (or sensor clusters) that must be cleaned increases.

Resorting to a simple rinsing mechanism without wipers is problematic, though, because beads of cleaning liquid could be just as obstructive as the dirt being cleaned away, especially with optical cameras. Some auto parts suppliers (including dlhBowles and Kendrion) are already of the opinion that the cleaning mechanism will have to combine a liquid sprayer with an airjet drying component.

One proposal is to design vehicle bodies in such a manner that airflow naturally keeps dust and grime away from the sensor suites most of the time. “Most of the time” of course implies that some of the time sensors will inevitably get dirty — then what? Yet another option being explored is the use of ultrasonic cleaning.
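
For illustration only, here is a minimal sketch of how a spray-then-dry cleaning cycle of the kind suppliers such as dlhBowles and Kendrion describe might be sequenced. Every name in it (SensorCleaner, blockage_score, and so on) is hypothetical; no real automotive API is implied.

import time

class SensorCleaner:
    def __init__(self, spray_ms=300, dry_ms=500, max_attempts=3):
        self.spray_ms = spray_ms          # washer-fluid burst duration
        self.dry_ms = dry_ms              # air-jet pass duration
        self.max_attempts = max_attempts  # flag a fault after this many tries
        self._occlusion = 0.8             # simulated dirt level (0 = clear)

    def blockage_score(self):
        """Stand-in for a real estimator, which might use image contrast,
        lidar return density, or radar signal-to-noise."""
        return self._occlusion

    def spray(self):
        self._occlusion *= 0.3            # pretend the wash removes most dirt
        time.sleep(self.spray_ms / 1000)

    def air_jet(self):
        self._occlusion *= 0.5            # dry so droplets don't occlude optics
        time.sleep(self.dry_ms / 1000)

    def clean_if_needed(self, threshold=0.1):
        """Run spray/dry cycles until clear; False means alert the driver."""
        for _ in range(self.max_attempts):
            if self.blockage_score() < threshold:
                return True
            self.spray()
            self.air_jet()
        return self.blockage_score() < threshold

print(SensorCleaner().clean_if_needed())  # True after two simulated cycles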

Automakers are still evaluating all these possible solutions — even as they’re in the process of adding more sensors to their vehicles every year.

The typical rationale stated for equipping vehicles with a combination of different sensors is that multiple streams of different sensor data complement each other, together providing better ADAS and AV performance than any single sensor mode.

But there’s another reason why adding different kinds of sensors is important. When the performance of one sensor type is degraded or even completely curtailed by rain, mud, spattered bugs, or other environmental schmutz, it is critical to have a back-up stream of incoming data from a different sensor that might be less impaired by (or even impervious to) whatever the obstruction is.
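
To make the redundancy argument concrete, here is a rough, hypothetical Python sketch: fuse detections from several sensor streams, discounting any stream that reports itself degraded. The health scores and the fuse() policy are illustrative assumptions, not any vendor’s actual algorithm.

from dataclasses import dataclass

@dataclass
class SensorStream:
    name: str
    health: float     # 1.0 = clear, 0.0 = fully occluded
    detections: list  # e.g., tracked objects from this sensor

def fuse(streams, min_health=0.3):
    """Keep only streams healthy enough to trust; weight what remains."""
    usable = [s for s in streams if s.health >= min_health]
    if not usable:
        raise RuntimeError("all sensors occluded: degrade ADAS, alert driver")
    total = sum(s.health for s in usable)
    # Each stream's vote counts in proportion to how unobstructed it is.
    return [(s.name, s.health / total, s.detections) for s in usable]

# A mud-splashed camera drops out, but an unobstructed radar keeps the
# feature alive:
streams = [
    SensorStream("camera", health=0.1, detections=[]),
    SensorStream("radar", health=0.9, detections=["vehicle_ahead"]),
]
print(fuse(streams))  # only radar survives the health cut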

This is part of the reason why Tesla is getting so much attention for saying in May that it is going to give up on radar and rely completely on cameras in its vehicles. Anybody who thinks it through for a few minutes knows there will inevitably be situations in which Tesla vehicles get splashed with mud and go blind. That’s why few automotive experts interpret Tesla’s decision as anything but temporary.

We asked Willard Tu from Xilinx about how different sensors behave with different types of obstructions (Xilinx FPGAs are being used by some automotive companies for processing sensor data).

The advantage of radar over vision systems, for example, is that radar can cut through rain and fog, even if those conditions might limit a radar system’s range, he explained.

Meanwhile, “mud is unique,” Tu said. Radar can penetrate dried mud easily, “but wet mud? Water is conductive and it can distort the signal,” Tu noted.

Below is a diagram showing how multiple cameras, radars, and lidars can complement each other. Some vehicle manufacturers are evaluating all three; many are trying to determine whether they can get away with two of the three (and in which combination). Few other than Tesla are considering a single sensor type.

Radar applications.

“Me? I think you should use two types,” Tu said. “Lidar and radar are both good at ranging. A camera is good for color, and lidar and radar can’t do that. On the other hand, lidar can read a sign sometimes, if there’s enough of an edge on the writing. There are always trade-offs in cost and performance.

“Most car companies want a camera. That’s cheap and easy,” Tu continued, “then maybe a lidar or a radar.”

But the decision is more complicated than which sensor to use, or which combination, Tu explained.

The more sensor data a vehicle collects, the more processing power is required. But auto manufacturers have a lot of leeway in balancing the volume of data they collect, the processing power they need, and the sophistication of their artificial intelligence.

“Good AI is key,” Tu said. He shared an image, a point cloud from an actual sensor reading that a Xilinx customer assured him had been used to correctly identify everything in view. It was so profoundly sparse that neither of us could even guess what might have been in the scene. The point is that sophisticated AI can do more than might be expected with what seems to be minimal data.

This is what lidar and radar engineers are getting at when they say AI can be used to compensate for partially obscured sensors. Just as AI can function with a relatively small amount of data, it can, with sufficient training, cope with getting less data than usual.
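
As a rough illustration of what that training might involve, the toy Python snippet below randomly drops lidar returns so that a model would see heavily occluded frames during training. The masking rate and array shapes are assumptions made for the sketch, not anyone’s production pipeline.

import numpy as np

rng = np.random.default_rng(0)

def occlude(points, drop_fraction):
    """Drop a random fraction of lidar returns, as mud covering part of the
    aperture might. points has shape (N, 3) for x, y, z."""
    keep = rng.random(len(points)) > drop_fraction
    return points[keep]

full_scan = rng.normal(size=(2048, 3))  # stand-in for one lidar frame
for drop in (0.0, 0.5, 0.9):
    sparse = occlude(full_scan, drop)
    # In a real pipeline, `sparse` would be fed to the classifier here so it
    # learns to identify objects from degraded frames.
    print(f"drop={drop:.0%}: {len(sparse)} of {len(full_scan)} points remain")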

Xilinx silicon can occupy several functions in automobiles.

This article was originally published on EE Times.

Brian Santo is Editor-in-Chief of EE Times. He has been writing about technology for over 30 years, for a number of publications including Electronic News, IEEE Spectrum, and CED; this is his second stint with EE Times (the first was 1989-1997). A former holder of a Radio Telephone Third Class Operator license, he once worked as an engineer at WWWG-AM. He is based in Portland, OR.
