We Really Need to Talk About Mode Confusion

Article By : Colin Barnden

We hear repeated claims from AV advocates that using technology to replace humans as drivers will lead to safer roads. But where is the evidence? Moreover, where is the progress?

Privately owned passenger vehicles are not going to be autonomous, or self-driving, any time soon. As engineers, we know this, so why aren’t more of us speaking up and having this conversation?

Common sense tells us that bold autonomous vehicle (AV) promises goosed with starry-eyed optimism cannot magically solve a problem of more than 1.3 million road deaths globally each year. Technology is one part of the solution but is not a panacea.

Today, no proponent of AV technology has a definitive answer to the question of “how safe is safe enough?” that isn’t based on comparisons with an “average human driver.” Who is liable when a machine driver kills? We don’t know. Is more driving automation demonstrably safer than more driver monitoring? We don’t know that either.

Let’s get back to first principles and describe the road ahead in the simplest terms. Technology offers two routes to safer driving. These are:

  • To make human drivers into safer drivers
  • To replace humans as drivers

For years, many technology companies have hyped the possibilities of AVs but then under-delivered on their promises. Timescales for mass AV deployments have steadily slipped from 2018 to 2020 and now to 2024 and beyond. The situation would be comical if the subject were not so serious.

No trend toward autonomous operation

For too long, technology suppliers have relied on references to the levels of driving automation described in standard SAE J3016 as a substitute for substantive progress. First, this was a race to Level 5, and then to Level 4. Now, headlines are made with announcements for Level 3.

For all the inherent reassurance of a numbering system, the levels described in J3016 tell us nothing about the journey to safer roads. Phil Koopman, associate professor at Carnegie Mellon University, recently proposed the adoption of four categories of vehicle operation to replace J3016. These are driver assistance, supervised automation, autonomous operation, and vehicle testing.

Driver assistance comprises the familiar and well-established warning and intervention technologies, such as automatic emergency braking, forward-collision warning, lane-keep assist, and lane-departure warning. Supervised automation describes technology that provides automated longitudinal and lateral vehicle control, such as Ford’s BlueCruise, GM’s Super Cruise, and Tesla’s Autopilot.

Koopman defines autonomous operation as “the whole vehicle is completely capable of operation with no human monitoring.” Vehicle testing is defined as “a trained safety driver supervises the operation of an automation testing platform.”
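
For readers who prefer code to prose, Koopman’s four categories can be captured in a simple enumeration. This is a minimal sketch; the names and one-line summaries are mine, paraphrasing the definitions above:

```python
from enum import Enum

class VehicleOperation(Enum):
    """Koopman's proposed categories, paraphrased from the text above."""
    DRIVER_ASSISTANCE = "warnings and interventions; the human drives"
    SUPERVISED_AUTOMATION = "automated steering and speed; a human supervises"
    AUTONOMOUS_OPERATION = "complete operation with no human monitoring"
    VEHICLE_TESTING = "a trained safety driver supervises a testing platform"

# Print the taxonomy
for category in VehicleOperation:
    print(f"{category.name}: {category.value}")
```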

Mapping Koopman’s definitions of vehicle operation onto a forecast for world light vehicle production produces the trend shown below, which includes a category called “No Assist” for completeness.

The conclusion is obvious: There is no trend to autonomous operation in privately owned passenger vehicles, at least throughout this decade. Driver assistance, yes. Supervised automation, shown above in blue, yes. But autonomous operation in consumer vehicles? No.

Who is the driver?

In privately owned passenger vehicles, an unambiguous declaration of liability must be added to the categories of vehicle operation.

  • For driver assistance and supervised automation, the human driver must be legally liable at all times.
  • For autonomous operation, the human driver (the person sitting in the driver’s seat) must bear no legal liability whatsoever for decisions made by the autonomous driving system.
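
Stated as data, the rule is a total, unambiguous mapping: every category names exactly one liable party, and no category shares or conditions liability. A minimal sketch (the table and function names are illustrative, not from any statute):

```python
# The article's liability rule as a lookup table: one category, one
# liable party -- never both, never "it depends."
LIABLE_PARTY = {
    "driver assistance": "human driver",
    "supervised automation": "human driver",
    "autonomous operation": "autonomous driving system (manufacturer)",
}

def who_is_liable(category: str) -> str:
    # A KeyError for any other input is deliberate: a system that cannot
    # name its category cannot name its liable party.
    return LIABLE_PARTY[category]

print(who_is_liable("supervised automation"))  # human driver
```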

From a safety perspective, either the human is driving and is liable, or the machine is driving and is liable. There is no way to mix these two modes safely in real–world conditions. However, in a world of technology, innovation, and marketing, who cares about road safety?

The first automaker to launch a system that blurs driving liability is Mercedes, with the introduction of Drive Pilot. Mercedes has declared it will assume legal liability for crashes that occur when Drive Pilot is engaged. Let’s look at that claim in further detail.

An article by Road & Track describes the operation of the system:

“Once you engage Drive Pilot, you are no longer legally liable for the car’s operation until it disengages. You can look away, watch a movie, or zone out. If the car crashes while Drive Pilot is operating, that’s Mercedes’s problem, not yours.”

Great, but what if the crash results in the death of another road user? Who is liable then? That legal situation is unclear and evidently will vary by jurisdiction and circumstance.

The article continues:

“Since Drive Pilot (and its manufacturer) are legally responsible for the vehicle’s operation, the software has to follow the law. The complexities of clearing a path for an approaching ambulance, fire truck, or police vehicle are beyond the system’s abilities; instead, the software uses microphones and cameras to detect emergency lights and sirens far enough in advance to issue the full 10-second warning before manual takeover.”

Now we can see the caveats behind the headline. What happens after the 10-second machine-to-human handover? Who is liable then? That isn’t clear either. The caveats continue:

“There’s a catch, of course: Handing over driving responsibility completely requires extremely particular circumstances. Right now, Drive Pilot can only engage at speeds under 40 mph (60 km/h in Germany) on limited-access divided highways with no stoplights, roundabouts, or other traffic control systems, and no construction zones.”

Drive Pilot has been approved for use on 13,191 km of German highways, including autobahns, but only at speeds up to 37 mph (60 km/h). The caveats go on:

“The system will only operate during daytime, in reasonably clear weather, without overhead obstructions. Inclement weather, construction zones, tunnels, and emergency vehicles will all trigger a handover warning. And no, you can’t close your eyes or go to sleep while it operates.”

Thus, Mercedes is only liable for Drive Pilot until it isn’t, and then liability is handed right back to the human driver with a 10-second handover. This creates a situation known as “mode confusion.”
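
To make the tangle of caveats concrete, here is a rough sketch of the engagement and liability logic as the quoted passages describe it. The condition names and thresholds are assumptions drawn from the article, not Mercedes’ implementation:

```python
from dataclasses import dataclass

@dataclass
class Conditions:
    """Operating conditions named in the quoted caveats.
    Field names are hypothetical, not Mercedes' terminology."""
    speed_kph: float
    limited_access_divided_highway: bool
    construction_zone: bool
    daytime: bool
    clear_weather: bool
    tunnel_or_overhead_obstruction: bool
    emergency_vehicle_detected: bool

SPEED_CAP_KPH = 60.0        # engagement cap under the German approval
HANDOVER_WARNING_S = 10.0   # warning issued before manual takeover

def may_engage(c: Conditions) -> bool:
    """True only inside the narrow operational design domain."""
    return (c.speed_kph <= SPEED_CAP_KPH
            and c.limited_access_divided_highway
            and c.daytime
            and c.clear_weather
            and not c.construction_zone
            and not c.tunnel_or_overhead_obstruction
            and not c.emergency_vehicle_detected)

def liable_party(engaged: bool, c: Conditions) -> str:
    """Who is liable at this instant. Note the flip: the moment any
    caveat trips, liability snaps back to the human -- mode confusion."""
    return "manufacturer" if engaged and may_engage(c) else "human driver"

# A siren is detected: liability changes hands mid-journey.
before = Conditions(55.0, True, False, True, True, False, False)
after = Conditions(55.0, True, False, True, True, False, True)
print(liable_party(True, before))  # manufacturer
print(liable_party(True, after))   # human driver
```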

Further, watch a clock for a period of precisely 10 seconds. How often does a safety-critical driving situation come with that much advance notice? Ten seconds might constitute a sufficient warning in a complicated transport system such as aviation or rail, but roads are complex, not complicated.
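
A quick back-of-the-envelope calculation shows what a 10-second warning implies at Drive Pilot’s 60-km/h engagement cap:

```python
speed_kph = 60.0                     # Drive Pilot's approximate speed cap
speed_ms = speed_kph * 1000 / 3600   # ~16.7 m/s
handover_s = 10.0                    # warning period before manual takeover
distance_m = speed_ms * handover_s   # ~167 m
print(f"{distance_m:.0f} m traveled during the handover window")
```

In other words, for the warning to arrive in full, the hazard must effectively be foreseeable some 167 m ahead. The sudden events described below offer no such notice.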

Complex systems demonstrate phase transitions, from order to disorder and back to order. Any experienced driver knows that random events happen suddenly, without warning, and may or may not lead to a crash or loss of life.

An alert, engaged, and unimpaired human driver is currently vastly better than a machine driver at demonstrating judgment under uncertainty in a safety-critical situation. Now add in mode confusion, with the associated instantaneous uncertainty about who is liable for the driving task, and road safety is compromised, not enhanced.

In an Ojo-Yoshida Report article entitled “Who’s Driving — and Crashing — My Robocar?” Daniel Hinkle, senior state affairs counsel at the American Association for Justice, said, “Defining ‘who is the driver’ is the most salient and pressing question whenever you’re drafting legislation regarding automated vehicles.”

In the same article, William Widen, professor at the University of Miami School of Law, said, “The resolution on the issue [of accountability] should not be left for litigants and the courts.”

Commenting on Drive Pilot, Matthew Avery, chief research strategy officer at Thatcham Research, said, “Trust will diminish if confusion reigns and drawn-out legal cases become common, hampering adoption of the technology and the realization of its many societal benefits.”

Perhaps an honest, straightforward description of a driving automation system compromised by mode confusion and blurred liability would say something like this:

Designed to throw you under the bus, assuming it doesn’t crash into it first.

Fight like hell for safer roads

In the closing speech at the 2022 Lifesavers National Conference on Highway Safety, National Transportation Safety Board (NTSB) chair Jennifer Homendy spoke with both passion and conviction as she encouraged safety advocates to “fight like hell for safer roads.”

The video of the speech is below, and if you work in the C-suite, legal, or marketing department at an automaker or tech company, take the time to watch it in full.

Really watch it. Listen to the tone of Homendy’s voice and observe her facial expressions as she mentions two children killed on roads in Washington, D.C. Look right into her eyes as she describes the loss of her friend Larry, “who died in 2020 doing the thing he loved, riding his bike in Montgomery County.” Know that Homendy is serious when she says the goal for deaths on U.S. roads “has to be zero.”

Six months into the role, Homendy has made little specific mention of technology but has consistently called for the adoption of a Safe System Approach and has aggressively called out the statistic so loved by AV technology companies: that 94% of road deaths are the result of human error.

The term “automation complacency” comes from NTSB’s multiple highway investigations into Tesla Autopilot crashes and the repeated safety recommendations for improved driver monitoring. With decades of experience investigating incidents in the aviation sector, NTSB is perhaps the world’s foremost expert agency on mode confusion in automation system design.

The introduction of Mercedes Drive Pilot represents a significant backward step for road safety and continues a worrying trend among automakers toward unsafe innovation. Although currently approved for use only in Germany, a U.S. launch of Drive Pilot would place Mercedes on an inevitable collision course with Homendy.

As engineers, we can fight like hell for safer roads by joining our colleagues in safety advocacy and leading the conversation on difficult issues like mode confusion. The safe, autonomous future promised by automakers and the tech industry is going to be dependent on humans acting as automation babysitters for many years to come, both in AV test vehicles and consumer vehicles.

We know that it is easier, faster, and cheaper to use technology to make human drivers into safer drivers than to replace humans as drivers. It is time to start saying so.

This article was originally published on EE Times.

Colin Barnden is principal analyst at Semicast Research and has over 25 years of experience as an industry analyst. He is considered a world expert on market trends for automotive vision-based driver monitoring systems (DMS). He holds a B.Eng. (Hons) in Electrical & Electronic Engineering from Aston University in England and has covered the automotive electronics market since 1999.

 
