Toyota miscalculates in safety-critical considerations

Posted: 02 Apr 2014

Keywords: Toyota, Michael Barr, safety-critical system

A U.S.-based software consultant has argued in favour of government regulation and supervision of safety-critical embedded systems, faulting Toyota Motors for its responsibility in a runaway acceleration accident involving a 2005 Toyota Camry.

Michael Barr, co-founder and CTO of the Barr Group, told an audience of embedded systems engineers at the EE Live! conference that as automobile manufacturers have pushed each other into a race to fit cars with complex electronic control systems, watchdogs at the National Highway Traffic Safety Administration (NHTSA) have failed to keep pace. Lacking a team of experienced experts to test and monitor today's flood of automotive software designs, NHTSA is failing in its mission to oversee "safety-critical systems."

Despite assurances by companies like Toyota that their software undergoes rigorous testing, said Barr, the rush to get cars on the road means that "You, the users, have been testing the software."

In some cases, like that of Jean Bookout, who was seriously injured when her 2005 Toyota Camry accelerated unintentionally, that sort of ad hoc consumer testing can result in catastrophe. A passenger in the Bookout car, Barbara Schwarz, was killed. After Barr testified at length for the plaintiffs—in the only software-focused Toyota case that has been tried—an Oklahoma City jury agreed to award $3 million to Ms. Bookout and to Ms. Schwarz's family.

Commitment to a culture of safety

While insisting on tighter NHTSA regulation, Barr did not absolve carmakers, whose current passion has been described as turning every new car model into a giant, apps-loaded smartphone.

Barr said that Toyota, and by implication other auto companies eager to load their products with electronic controls, lack a "mature design process, done right, documented, and peer reviewed."

He called for carmakers—regardless of the government's role—to adopt a "company culture and an engineering culture of wanting to know what can go wrong, and wanting to fix what can go wrong, from the outset," rather than after-the-fact with apologies and million-dollar settlements.

Since the problem of "unintended acceleration" in Toyotas burst into headlines after a ghastly California crash that killed Mark Saylor, a 19-year California Highway Patrol veteran, and three family members, Toyota has recalled millions of cars and paid billions in penalties and settlements. Among these was a $1.2 billion criminal fine imposed last month by the Department of Justice—for lying to government regulators.

Using an exhaustive 56-slide PowerPoint presentation and citing his 18 months examining Toyota's automotive software "source code," Barr convinced the Oklahoma jury that Toyota had deployed dangerously flawed software in its cars. Despite Barr's findings, Toyota continues to claim that all its unintended acceleration problems were mechanical, the result of misplaced floor mats and "sticky" gas pedals.

Neither NHTSA, with its absence of software expertise, nor the NASA Engineering and Safety Center, to which NHTSA turned to study the Toyota problem, was able to pinpoint a software cause for unintended acceleration. Nor was either able to rule out the possibility.

The NASA researchers, who were both on a deadline and not allowed to study Toyota's source code, simply ran out of time, noted Barr.

Under court order, a team from the Barr Group was allowed into a specially built "code room" provided by Toyota. They were able to pinpoint at least one anomaly that could have caused Toyota accelerators to build up speed while disabling the brake system. Barr also found numerous Toyota violations of software design standards. Toyota, in many instances, even broke its own rules for safe design and system redundancy.
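
Barr has not published the code he reviewed, so the following is a hypothetical C sketch of the general failure class he described, not Toyota's actual software: a safety-critical throttle command held in a single, unmirrored global variable that the control loop trusts blindly, so one corrupting write, or the silent death of the task that refreshes it, leaves the throttle latched open. The variable names and the set_throttle_hw() stand-in are invented for illustration.

/* Hypothetical illustration only -- not Toyota's code. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

static void set_throttle_hw(uint16_t pct)        /* stand-in for a hardware register write */
{
    printf("throttle command = %u%%\n", (unsigned)pct);
}

static volatile uint16_t target_pct;             /* single copy, no redundancy          */
static volatile uint16_t target_pct_mirror;      /* bit-inverted redundant copy         */
static volatile bool     throttle_task_alive;    /* re-asserted by the task each cycle  */

/* Naive version: trusts the lone global, so a corrupted value or a dead
 * refresh task leaves whatever throttle command is in memory in force. */
static void actuate_naive(void)
{
    set_throttle_hw(target_pct);
}

/* Defensive version: cross-checks the mirror and the liveness flag and
 * commands idle throttle on any inconsistency. */
static void actuate_defensive(void)
{
    bool corrupted = (target_pct != (uint16_t)~target_pct_mirror);
    if (corrupted || !throttle_task_alive)
        set_throttle_hw(0);                      /* fail safe: close the throttle       */
    else
        set_throttle_hw(target_pct);
    throttle_task_alive = false;                 /* refresh task must set it again      */
}

int main(void)
{
    target_pct = 15;
    target_pct_mirror = (uint16_t)~(uint16_t)15;
    throttle_task_alive = true;
    actuate_naive();                             /* 15% -- normal operation             */
    actuate_defensive();                         /* 15% -- checks pass                  */

    target_pct = 100;                            /* simulated memory corruption         */
    throttle_task_alive = true;
    actuate_naive();                             /* latches the throttle wide open      */
    actuate_defensive();                         /* mirror mismatch -> 0%, fail safe    */
    return 0;
}

The defensive variant's mirrored copy and liveness flag illustrate the kind of redundancy that, as noted above, Toyota's own rules for safe design and system redundancy call for.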

Patriot missiles, Therac-25, and others that failed

Many of these rules, and Toyota's subsequent actions, were either buried in corporate secrecy or covered over by corporate denial. "The answer is not to say it can't be the software, stick our heads in the sand," said Barr. If companies like Toyota examined themselves more rigorously, he added, and allowed "less code confidentiality," they wouldn't require as much regulatory scrutiny.

Barr cited past cases of "safety-critical systems" that failed but then were corrected when regulators stepped up their intensity and capabilities. After a series of radiation overexposures—including two fatalities—caused by a software glitch in a radiotherapy machine called the Therac-25, the Food and Drug Administration created an in-house team of software engineers to review every electronic medical device before its approval for use on patients.

In the case of the Therac-25, in the case of the Patriot missile system whose software fault allowed a Scud strike that killed 28 US troops during the Gulf War, and in Toyota's case, the companies responsible have invariably issued assurances about their exhaustive testing and cited "no other instances of similar damage."

Such assurances disregard the bugs that exist in every complicated system and the harm they can cause. "If you are overconfident of your software in a safety-critical system, that could be deadly," said Barr.

- David Benjamin
  EE Times




