Junko Yoshida examines the first fatal autonomous vehicle incident: are new rules and third-party testing needed?
A self-driving Uber vehicle, driving in autonomous mode with a safety driver behind the wheel, hit and killed a pedestrian Sunday night in Tempe, Arizona.
The preliminary investigation shows that Uber’s robocar, identified as a 2017 Volvo XC90, was driving “at approximately 40 miles per hour,” said Tempe police during a press briefing Monday.
Police told reporters that they found “no significant signs of [the Uber vehicle] slowing down” as it approached the pedestrian with a bicycle crossing the street outside a crosswalk.
It remains unclear if the accident, which appears to be the first reported fatal crash involving a self-driving vehicle and a pedestrian in the United States, might trigger a slowdown in the autonomous vehicle race.
Heated discussions, however, have already erupted on social media forums. One investor on LinkedIn deemed the death the price of progress: “We lose one to save thousands; it’s called acceptable casualties.” An insurance analyst saw the accident as a novel — and unwelcome — hazard: “… you did sign up to face drunk drivers and cars with flat tires careening out of control and a host of other risks from cars on the roads because those risks are part and parcel of vehicles being driven by people … but no one signed up to face AVs.”
This tragedy is certain to affect consumer perceptions of autonomous vehicles and, more significantly, the AV discussions unfolding in Washington, D.C.
Phil Magney, founder and principal of VSI Labs, told us, “This accident comes at an unfortunate time for the AV START Act, which is trying to make its way through the U.S. Senate.” He explained that the House of Representatives passed a similar bill, the SELF DRIVE Act, unanimously in 2017.
Magney suspects, “Opponents in the Senate could use this incident as ‘proof’ that self-driving cars are too dangerous.”
Both the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) have been called in to investigate the latest Uber accident. Magney cautioned, “Any conclusions before the NHTSA and NTSB reports would be premature.” Referring to the new bill, he noted, “Politicians may not want to vote on the SELF DRIVE Act before hearing the reports.”
Police acknowledged the existence of “video evidence” from the multiple cameras mounted in the vehicle. The footage captures “several angles,” said the police, showing both “the driver and outside the vehicle to the front of the vehicle.” The video evidence will not be released to the public while the investigation is ongoing, according to the police.
Uber, meanwhile, announced Monday that it was suspending its self-driving car operations in Phoenix, Pittsburgh, San Francisco, and Toronto. The company said that it will fully cooperate with police.
While cautioning against a rush to judgment, VSI Labs’ Magney said, “The main question should be whether the automated vehicle is able to drive more or less safely than a human driver.”
Scene of the accident
The crash occurred near Mill Avenue and Curry Road around 10 p.m. The Uber vehicle, driving northbound, hit a woman with a bicycle. She was crossing the street but outside the crosswalk.
Magney said, “Mill Avenue is a divided road with two lanes of traffic flowing in each direction with a good-sized bike lane or shoulder on the right. I don’t know the speed limit but would guess 40+ mph.”
The Uber AV is assumed to have been driving within its lane and within the speed limit. Magney said that, in this case, Uber had both AV sensors and software, as well as a human safety driver who could have slammed on the brakes or swerved. Despite all of that available input, the car still hit the pedestrian, leading Magney to hypothesize that “the accident was very hard to predict or didn’t allow much reaction time.”
Because Uber AVs have numerous cameras, radar, and LiDAR, Magney is confident that “investigators will be able to see exactly what happened.” He added, “They’ll be able to see how much time the Uber AV or human had to react. Unlike many other auto accidents, AVs should be able to quickly provide all pertinent unbiased information about an incident.”
New road rules needed?
Carlos Holguin, the CEO of AutoKab, based in Paris, maintains that complete safety for autonomous vehicles is implausible with today’s road rules. Holguin said that safety boils down to “a trade-off between speed and risk.”
In his mind, the question that must be asked is whether Uber, the software designers, the state of Arizona, and the governor have all been too casual about safety or too eager “to use the population as guinea pigs.” He noted that everyone, not just Uber, “should step back and think if the software that drives these vehicles is functioning correctly.” It’s time for “due diligence to avoid another fatality.”
AutoKab is a startup focused on developing “safety-assured automation for commercial vehicle fleets.” Holguin himself is an expert on safety and urban integration for highly automated vehicles. He was a member of the French team that developed the world’s first automated pilot services on public streets in 2011 and then went on to define safety guidelines for the second, third, and fourth pilot programs.
Asked about “acceptable-casualties” comments such as “We lose one to save thousands,” Holguin recoiled. “My position is whether I would accept losing a loved one in the name of this argument,” he said. “I would not.” Sacrificing a child, spouse, or parent in the name of “progress” shouldn’t be acceptable to anyone, he explained.
Why no third-party testing?
Many technologists assert that all of the sensor technologies loaded into robocars will make them much safer than human-driven vehicles.
How do we know, though, that each of those AVs is safe without subjecting it to a rigorous round of third-party safety testing?
Regulating robotic cars is an issue that neither tech companies in Silicon Valley nor traditional automakers in Detroit are prepared to discuss. Thus far, the public sector has left AV safety issues up to private companies. Consumers, largely by default, are trusting vehicle vendors’ word when it comes to their safety — much like they’ve been trusting their privacy to Facebook and Twitter.
Holguin said that the testing of AVs by a third party “is much needed.” But politicians need to be trained, he explained, “to design the laws that request this,” aided by “technicians in the Department of Transportation to design the procedures.”
Holguin added, “We’re not in the 1920s, when the air transport industry was nascent. We are in the 21st century, and we have a century of experience in making transport safe.”
So the claim that occasional fatalities are inevitable because autonomous vehicles are so new “is not an excuse to not implement well-known best practices,” he concluded.
— Junko Yoshida, Chief International Correspondent, EE Times