After more than a year of suspense, the U.S. National Transportation Safety Board (NTSB) has released a 500-page document about the fatal highway crash involving a Tesla Model S and a tractor-semitrailer truck.

The NTSB probe is a treasure trove of data, particularly for the automotive industry that is currently fixated on the development of self-driving cars. The docket includes reports on highway design, vehicle performance, human performance and motor carrier factors. The crash reconstruction report, included in the docket, describes the crash sequence, along with interview transcripts and summaries, photographs and other investigative details. However, the investigation fell short of determining who, or what, is to blame for the death of the driver.

The NTSB emphasized that the docket contains “only factual information collected by NTSB investigators” and it “does not provide analysis.” It said no conclusions about how or why the crash occurred should be drawn from the docket. The NTSB is planning to release its own analysis, findings, recommendations and a judgment about probable cause “at a later date.”

Using system performance data downloaded from the passenger car, the report states that its speed just prior to impact with the trailer was 119 km/h (74 mph). The data also revealed that the driver was operating the car using its automated vehicle control systems: Traffic-Aware Cruise Control (TACC) and the Autosteer lane-keeping system.

Among the documents released last week, the one entitled “Driver Assistance System” holds particular interest for EE Times readers.

The report delves into the details of how Tesla’s driver assistance system—consisting of a Bosch radar system, a Mobileye image capture and processing system, an ultrasonic sensor system and a gateway electronic control unit—works. It reads like a teardown of the Model S driver-assistance system.

It also contains a few surprises, including how and where Tesla’s captured data is stored, routed and saved inside a vehicle, and how it’s sent to Tesla’s server using a virtual private network connection established via Wi-Fi or using the vehicle’s 3G cellular data capability.
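The report describes a simple connectivity policy: logs are held in the car and uploaded to Tesla's server over a VPN reached either via Wi-Fi or the vehicle's 3G modem. As an illustration only (the function names and the exact fallback policy are assumptions, not Tesla's actual software), that kind of uplink selection can be sketched as:

```python
# Illustrative sketch only -- NOT Tesla's actual code. The names and the
# fallback policy are assumptions based on the report's description of
# uploads over a VPN established via Wi-Fi or the car's 3G modem.
from typing import List, Optional

def choose_uplink(wifi_up: bool, cellular_up: bool) -> Optional[str]:
    """Prefer Wi-Fi for bulk log upload; fall back to 3G cellular."""
    if wifi_up:
        return "wifi"
    if cellular_up:
        return "cellular-3g"
    return None  # no link: keep data queued in non-volatile storage

def upload_logs(records: List[bytes], wifi_up: bool, cellular_up: bool) -> int:
    """Return the number of records handed to the chosen uplink."""
    link = choose_uplink(wifi_up, cellular_up)
    if link is None:
        return 0  # nothing sent; records remain in the vehicle
    # A real implementation would bring up the VPN tunnel over `link`
    # and stream the records to the server here.
    return len(records)
```

The point of the fallback is simply that bulk telemetry waits in the car until some network path exists.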

The report says that the Tesla Model S stores non-geo-located data in-vehicle in non-volatile memory, on a removable SD card installed within the Gateway ECU.

Really, an SD card? What role does the SD card play?

No Event Data Recorder?

After reading the report, Mike Demler, a senior analyst at The Linley Group, told EE Times, “I find the description of some of Tesla’s control and data-recording systems to be interesting.” In particular, he said, “The statement in the report that says, ‘This SD card is large enough to typically maintain a complete record of all stored data for the lifetime of the vehicle’ is interesting.” He asked: “How could they determine how much data will be generated over the lifetime of the vehicle?”

Unfortunately, the NTSB report doesn’t answer such questions.

But one thing is clear. The NTSB apparently sees this “removable” SD card as a proxy for an event data recorder (EDR). Because current federal regulations do not require an EDR (installing one is entirely voluntary), the NTSB appears to conclude that Tesla did enough.

Danny Kim, director and partner of Vision Systems Intelligence (VSI), told EE Times, “The Tesla Model S involved in this crash did not, nor was it required to by regulation, contain an event data recorder.” Even had it included an EDR, he noted that “The current EDR, outlined by the regulation, is outdated and it does not reflect what autonomous vehicles can do.”

Kim said, however, that sooner or later there will probably be a new regulation including an EDR mandate, per NHTSA’s recommendations last fall.

To Tesla’s credit, when NHTSA sent Tesla an extensive list of questions, the automaker was able to offer much more than what a current EDR can provide, Kim observed. “The most notable points include a detailed list of all of Tesla’s Autopilot-capable vehicles, including the VIN, model, model year, total mileage with Autosteer on, total number of ‘Hands on Wheel’ Autosteer warning records, etc.”

Hands on Wheel

Now, of course, the most contentious issue to emerge from last year’s fatal crash was whether the driver kept his hands on the wheel while using Autopilot.

Based on recovered data, the NTSB confirmed that during a 37-minute period when the driver was required to have his hands on the wheel, he did so for just 25 seconds.
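Those figures from the docket work out to hands on the wheel for roughly one percent of the monitored period:

```python
# Figures from the NTSB docket: 37 minutes monitored, 25 seconds hands-on.
monitored_s = 37 * 60            # 2,220 seconds
hands_on_s = 25
fraction = hands_on_s / monitored_s
print(f"{fraction:.1%}")         # about 1.1% of the monitored time
```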

The report said Autopilot was active through most of the trip. On seven separate occasions, the system gave the driver a visual warning that said "Hands Required Not Detected."

Six times, the system sounded a chime along with the "Hands Required" warning.

Figure 1: Passenger car damage from impact with semitrailer (Source: Florida Highway Patrol)

Phil Magney, founder and principal at VSI, told EE Times, “Tesla Auto Pilot monitors hands-on state continuously. If the driver is in active autopilot mode and does not have his/her hands on the wheel, the vehicle will chime and flash to gain the attention of the driver. After repeated warnings, if the driver refuses to grab the wheel, the system will disable itself for the duration of the drive cycle.”

The problem, as Magney acknowledged, is that “It is easy for the Tesla driver to fool the system by casually resting their hands on the wheel.”
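The escalation Magney describes can be sketched as a simple state machine. This is a hypothetical illustration; the class name, the warning threshold and the reset-on-torque behavior are assumptions, not Tesla's implementation:

```python
# Hypothetical sketch of the escalation logic Magney describes.
# The threshold, names and structure are assumptions, not Tesla's code.

class HandsOnMonitor:
    def __init__(self, max_warnings: int = 6):
        self.warnings = 0
        self.max_warnings = max_warnings
        self.locked_out = False  # stays set until the end of the drive cycle

    def tick(self, hands_detected: bool) -> str:
        """Called periodically while Autopilot is active."""
        if self.locked_out:
            return "autopilot-disabled"
        if hands_detected:
            self.warnings = 0          # torque on the wheel resets escalation
            return "ok"
        self.warnings += 1
        if self.warnings > self.max_warnings:
            self.locked_out = True     # disabled for the rest of the drive
            return "autopilot-disabled"
        if self.warnings == 1:
            return "visual-warning"    # e.g., "Hands Required Not Detected"
        return "visual+chime"          # escalate with an audible chime
```

The weakness Magney points to lives in the `hands_detected` input: a hand resting lightly on the wheel can satisfy the torque sensor without the driver actually paying attention.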

Magney noted, “For Tesla Auto Pilot systems (AP 1 and AP 2) the primary method to measure driver engagement is through the steering wheel sensors. We don't think that is an adequate means of measuring driver attentiveness. And it is for this reason that GM (SuperCruise) is using driver monitoring ‘facial recognition’ via camera to determine if the driver is awake and attentive.”

Magney explained that the GM system does not require hands on the wheel. This is why GM claims to be the first “hands-free" automated solution. “In my opinion, the GM solution is a better method to handle the task of measuring driver engagement,” Magney said.

Strategy Analytics analyst Angelos Lakrintis also told us after reading the report, “It genuinely sounds like the autopilot failed to prevent a collision. Whether right or wrong, we would expect it to work before it is broadly deployed as Autopilot, DrivePilot, ProPilot etc.”

Linley Group's Demler concluded, “In any event, this reinforces the thinking that Level 3 (although Tesla describes Autopilot as Level 2) ADAS is fraught with danger, since it relies on handover to human drivers.”

Tesla last fall unveiled improvements to Autopilot, adding new limits on hands-off driving and other features that likely would have prevented the fatal crash. The updated system temporarily prevents drivers from using Autopilot if they do not respond to audible warnings to retake control. But the system still depends on steering wheel sensors to judge whether a driver is paying attention to the road.
