China crash puts Tesla under fresh scrutiny

By Junko Yoshida

If reports are true, China’s crash fatality in January presents a problem for Tesla.

A fatal accident in China has thrust Tesla’s transparency into sharp focus, raising fresh, daunting questions about how safe Tesla’s Autopilot really is.

New reports surfaced in China about a crash that killed a 23-year-old man driving a Tesla Model S in Handan, a city about 300 miles south of Beijing. The crash took place on January 20, 2016, four months before Joshua Brown died in Florida at the wheel of a Tesla Model S on Autopilot.

The Chinese government news channel CCTV reported that the driver, Gao Yaning, had borrowed his father’s Tesla Model S. He was driving on the highway when his car struck a street-sweeper truck on the side of the road at highway speed.

CCTV aired video footage of the accident captured by the Model S’s dash camera.

The police found no sign that the vehicle applied the brakes before hitting the truck, and local media reported that Autopilot was engaged at the time of the accident. According to the Chinese reports, the crash was under investigation for the first half of this year; in July, the victim’s family filed a lawsuit against Tesla China.

Tesla’s credibility and transparency in question


The obvious questions: When did Tesla learn of the fatality in China, and did the company report the crash to the United States safety officials who are still investigating the fatal accident in Florida?

More important are questions about the company’s credibility and transparency on safety matters.

After all, until the China revelation, the only Model S fatality disclosed by Tesla was the Joshua Brown case in Florida.

Tesla CEO Elon Musk has always maintained that there was only one known fatality in a Tesla on Autopilot. Noting that the fatal accident came only after 130 million miles driven on Autopilot, Musk has argued that Autopilot is “safer than manual driving.” He pointed to the fact that in the United States, one traffic fatality occurs for every 94 million miles driven.

‘Jury is still out’

However, Philip Koopman, a professor at Carnegie Mellon University, wrote earlier this week on his blog that Musk’s conclusion is “incorrect.” The comparison is tempting, he noted, but “far too simplistic an approach.”

In Koopman’s opinion, “the jury is still very much out on whether Tesla Autopilot will prove safer than everyday vehicles.”

He noted that “from a purely statistical approach, the question is: how many miles does Tesla have to drive to demonstrate they are at least as good as a human (94 million mile mean time to fatality).”

He posted this tool on his blog: “[This] tells you how long you need to test to get a 94M mile Mean Time Between Failure (MTBF) with 95% confidence. Assuming that a failure is a fatal mishap in this case, you need to test 282M miles with no fatalities to be 95% sure that the mean time is at least 94 million miles.

“Yes, it's hard to get that lucky. But if you do have a mishap, you have to do more testing to distinguish between 130 million miles having been lucky, or just reflecting normal statistical fluctuations in having achieved a target average mishap rate. If you only have one mishap, you need a total of 446M miles to reach 95% confidence. If another mishap occurs, that extends to 592M miles for a failure budget of two mishaps, and so on. It would be no surprise if Tesla needs about 1-2 billion miles of accumulated experience for the statistical fluctuations to settle out, assuming the system actually is as good as a human driver.”
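Koopman’s figures follow from a standard chi-squared bound for a time-terminated reliability test: to demonstrate an MTBF of m at confidence C while allowing r failures, the required exposure is T = m · χ²(C, 2r+2) / 2. A minimal Python sketch of that calculation, with scipy standing in for Koopman’s own calculator (the names below are our illustration, not his code), reproduces the numbers he cites:

```python
from scipy.stats import chi2

# Standard chi-squared bound for a time-terminated reliability test.
# NOTE: scipy stands in for Koopman's own tool; names are illustrative.
TARGET_MTBF = 94e6  # miles: one US traffic fatality per 94 million miles
CONFIDENCE = 0.95

def required_miles(failures, target_mtbf=TARGET_MTBF, confidence=CONFIDENCE):
    """Test exposure needed to demonstrate `target_mtbf` at the given
    confidence while allowing `failures` observed mishaps."""
    return target_mtbf * chi2.ppf(confidence, 2 * failures + 2) / 2

for r in (0, 1, 2):
    print(f"{r} mishaps: {required_miles(r) / 1e6:.0f}M miles")
# 0 mishaps: 282M miles
# 1 mishaps: 446M miles
# 2 mishaps: 592M miles
```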

Looking at it another way, he noted, given the current data (one mishap in 130 million miles), the same tool tells us that Tesla has so far demonstrated only an MTBF of 27.4M miles or better at 95% confidence, which is less than a third of the way to break-even with a human driver.

Koopman made it clear, though, that he is not calling Tesla cars only a third as safe as human-driven vehicles. “What I am saying is that the data available only supports a claim of about 29.1% as good. Beyond that, the jury is still out.”
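Inverting the same bound yields the 27.4M-mile figure: after T miles of exposure with r observed mishaps, the demonstrated (lower-bound) MTBF at confidence C is 2T / χ²(C, 2r+2). A short sketch under the same assumptions:

```python
from scipy.stats import chi2

def demonstrated_mtbf(miles, failures, confidence=0.95):
    """One-sided lower confidence bound on MTBF after observing
    `failures` mishaps in `miles` of driving."""
    return 2 * miles / chi2.ppf(confidence, 2 * failures + 2)

bound = demonstrated_mtbf(130e6, 1)  # one fatality in 130 million miles
print(f"{bound / 1e6:.1f}M miles")                            # 27.4M miles
print(f"{bound / 94e6:.0%} of the 94M-mile human benchmark")  # 29%
```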

Koopman concluded that [Tesla] cars “need a whole lot more testing to make strong claims about safety.”

Tesla’s responses

After the Chinese report on the Tesla fatality emerged this week, the news site Electrek was the first to reach out to Tesla for comment.

Apparently, Tesla now maintains that it cannot access the vehicle’s logs and therefore cannot confirm whether Autopilot was engaged when the accident happened.

The statement Tesla issued this week is as follows:
We were saddened to learn of the death of our customer’s son. We take any incident with our vehicles very seriously and immediately reached out to our customer when we learned of the crash. Because of the damage caused by the collision, the car was physically incapable of transmitting log data to our servers and we therefore have no way of knowing whether or not Autopilot was engaged at the time of the crash. We have tried repeatedly to work with our customer to investigate the cause of the crash, but he has not provided us with any additional information that would allow us to do so.

Whether the fatality in China involved Autopilot is far from clear at this point.

EE Times exchanged a few messages on the matter with Koopman, who also wondered why no Event Data Recorder holds this data. EE Times reached out to Tesla with further questions, but the company has yet to respond.

This article first appeared on EE Times U.S.
