Definitely, Tesla is exonerated. For technical analysis, Yoshida went to Mike Demler of the Linley Group, a technology analysis and consulting firm in the electronics industry. Demler had a lot to say, and none of it was terribly flattering. He said NHTSA had essentially already decided the case through other decisions it made along the way.
Did NHTSA let Tesla off the hook too easily? Absolutely.
What lessons or guidance, if any, did NHTSA’s findings offer to the rest of the automotive industry? Very little.
“The critical issue here is that the Tesla fatality involved the combination of AEB and Autopilot,” said Demler. As the report shows, NHTSA “already accepts that AEB reduces accidents, and that the Tesla AEB performed according to the current state-of-the-art,” he explained. “But what happens when you combine AEB with Autopilot?”

This accident didn't just involve Autopilot's routine lane-centering, which is like cruise control for steering: "keep it between the dashed lines." It involved a truck turning left in front of the car, and the Tesla's inability to tell a truck from the sky. I'd argue that once a child learns the words for "truck" and "sky," at perhaps age three, no child would make that mistake, yet the car's mistake cost the life of the Tesla's owner/driver. Furthermore, NHTSA's investigation found that the truck should have been "observable" for a full 7 seconds before the crash, time in which the driver should have been able to apply the brakes and take control away from the car, if he knew he had to. NHTSA says:
He noted, “It would be different if Autopilot just added lane-centering to Adaptive Cruise Control (ACC) and AEB. [If that’s the case], Autopilot’s primary function isn’t safety, but it’s convenience.”
ODI (the Office of Defects Investigation)’s analysis of Tesla’s AEB system finds that 1) the system is designed to avoid or mitigate rear-end collisions; 2) the system’s capabilities are in-line with industry state of the art for AEB performance through MY 2016; and 3) braking for crossing path collisions, such as that present in the Florida fatal crash, are outside the expected performance capabilities of the system.

The question is whether the driver knew not to let the system drive in that circumstance, and whether the car tried to notify him at any point.
There really is a lot to this story, and it's worth your time. Realize that in absolving Tesla of responsibility for this accident, NHTSA is implicitly encouraging wider use of systems like this, systems that are vulnerable in the event of a crossing vehicle. NHTSA is saying Tesla can't be held liable in a lawsuit; the problem was the driver. And in accepting that AEB reduces accidents and that the Tesla AEB performed according to the current state of the art, the agency is committing to more cars with AEB systems. Furthermore, since agencies like NHTSA famously work at a snail's pace, Tesla was able to do an end-run around them by updating the car's software "over the air" before the investigation was complete, and NHTSA apparently accepted Tesla's account uncritically.
Closing words from Mike Demler, quoted in EE Times:
“NHTSA was obviously careful to apply a very narrow focus to this investigation,” Demler observed. “I’d say it was about 90% AEB, and only 10% Autopilot.”
NHTSA “could have dug into Autopilot more, at least as a warning to other manufacturers building similar systems,” he noted. In summary what they said was “we accept Tesla’s descriptions of how Autopilot works, the warnings they provide, and the fix they implemented.”
That’s pretty weak, Demler added.
See also: the Florida Highway Patrol accident report on the May fatality.