Tuesday, September 19, 2017

Tesla's Push for Self-Driving Cars Driving Off Engineers

Tesla CEO Elon Musk has famously been pushing for "auto pilot" and fully autonomous cars, saying in October of 2016 that all new Teslas would have self-driving capabilities.  I've been covering some of the Autopilot crashes of those self-driving cars as news comes out; by that I mean I don't hang out on Tesla user groups looking for stories of accidents.  On the other hand, the technology needs to have some light focused on it.  It's not ready for Prime Time, and Tesla finally seems to be acknowledging that.

This week, Design News presented an article with a few "fun facts" that indicate US consumers are really skeptical about autonomous cars.  First, a Gartner survey shows that 55% of respondents would not ride in an autonomous car.
The Gartner Consumer Trends in Automotive online survey, conducted from April 2017 through May 2017, polled 1,519 people in the U.S. and Germany and found that 55 percent of respondents will not consider riding in a fully autonomous vehicle, while 71 percent may consider riding in a partially autonomous vehicle.
It turns out people seem to have taken lessons from things like the Equifax data breach - along with so many others - to heart and are concerned about security along with the general category of "technology failures".
"Fear of autonomous vehicles getting confused by unexpected situations, safety concerns around equipment and system failures and vehicle and system security are top concerns around using fully autonomous vehicles," explains Mike Ramsey, research director at Gartner.
Gartner isn't unique in finding this; AAA found similar results.
A new report from AAA reveals that the majority of U.S. drivers seek autonomous technologies in their next vehicle, but they continue to fear the fully self-driving car. Despite the prospect that autonomous vehicles will be safer, more efficient and more convenient than their human-driven counterparts, three-quarters of U.S. drivers report feeling afraid to ride in a self-driving car, and only 10 percent report that they’d actually feel safer sharing the roads with driverless vehicles.
Three-quarters of drivers are afraid to ride in a self-driving car?  That's quite a bit more than the 55% Gartner found, but the two numbers point the same way.  There's a lot of anxiety about the technology.  Still, AAA reports that while the majority are afraid to ride in a fully self-driving vehicle, the survey also found that a majority (59%) wants to have autonomous features in their next vehicle.  This apparent contradiction suggests that drivers are ready to embrace autonomous technology, but they are not yet ready to give up full control.  People are responding with a skeptical "prove to me it works" attitude. 

I wasn't aware of the impact at Tesla itself until the Design News article.  As summarized in Inc. Magazine:
Musk's fervent vision of a driverless world repeatedly clashed with his team's view on the progress of the technology. The dispute led Sterling Anderson, then Autopilot director, to resign just two months after the October announcement. Since his departure, at least 10 other engineers and four top managers -- including Chris Lattner, who had been selected as Anderson's replacement and lasted just six months on the job -- have also left.

This isn't the first time dissent among Tesla engineers has been reported. In July 2016, after a fatal crash killed a driver using the Autopilot function, CNN reported that several employees tried to warn Musk about the reliability of the technology, and that it wasn't ready to be deployed to consumers.
The team working on Tesla's Autopilot feature felt uneasy about marketing the technology as "fully autonomous."  They felt they didn't yet have a product that could "safely and reliably control a car without human intervention."  In other words, they felt like the 55 or 75% of consumers surveyed who didn't want to trust their lives to the self-driving cars. 

This is called engineering ethics.  We're always required to voice our opinions and point out unsafe or improper things.  At some point, if the company doesn't respond, the only ethical thing left is to quit. 
The driver of this Tesla says that when she was parking in front of her gym, it suddenly accelerated, jumped the curb, and hit the wall.  Tesla flatly says "nope"; their monitors say she stepped on the gas.  Could be.  Hey - it's not your typical Tesla picture, that's all I was looking for.


  1. Tech advancements are great; I loved seeing it in aircraft: autopilot, auto braking, auto land. But I would not fly in an aircraft with no pilot. Technology can and will inevitably fail, and even with pilot error I will rely on a human pilot over an electronic pilot. I am still not sure why all the effort to have self-driving cars when I see bigger advantages to a far more automated rail system run on electric propulsion rather than the current diesel-electric. With modified diesel-electric engines, the only advantage I could see is using spare locomotives to provide backup power in disaster scenarios. - indyjonesouthere

  2. It will come, eventually. Patience and perseverance are called for.

  3. Unless Tesla's monitors include video proof of the driver in the pictured accident stomping on the gas, all they've done is document the exact unintended acceleration problem alleged by the driver and assume driver error.
    As has been proven in aviation, and written in blood, that approach is inherently suspect. (Airbus, call your office.)

    In short, video or it didn't happen, Tesla.

    "Cognitive dissonance" is a fancy technical term for being a dumbass: claiming, the same week 143 million people have their data hacked, that "this time, it'll be different" with self-crashing cars.

    And financially, even if they're safer than drivers, such a model shifts the liability for any fatalities afterwards entirely onto the manufacturers, in perpetuity. Well-played.
    Now imagine the Titanic or Hindenburg disasters in the current litigation environment, and predict the future of sea liners and dirigibles, and then extrapolate the future of personal automobile transportation.
    (Gee, almost like this was being pushed by the exact people who hate carbon emissions, and personal freedom, and want everyone herded into public transport. Hmmm...).

    But it's simply solved: require the dealers and CEOs to ride around strapped on the bumpers of self-driving cars for the first year after adoption. Then re-visit the issue in 12 months' time.

    1. I thought it was telling that Tesla immediately blamed the driver. "Our monitoring software said our driving software didn't make a mistake." Excuse me? So just how would I know if either of those software packages are good? Would they pass an independent software quality audit?

      But you bring up the big, good point. There has been at least one fatality in a Tesla on Autopilot. When do the liability suits start? I can only imagine all the Ambulance Chasers salivating over the deep pockets they can go after. I can't imagine the car makers are so stupid they haven't planned for that.

    2. Blaming the driver and coming to immediate concrete conclusions of human fault remind me of Toyota and how they pulled legal and international tricks to avoid being blamed for their acceleration problems.
      One thing rarely mentioned in the news is that Toyota uses a proprietary chipset and OS in their car computers; only they can read it and interpret the data (conveniently only in Japan, outside of US jurisdiction), and only they know how it is programmed. Not only did they present different data sets in different trials as the full recording of the system, they didn't program in basic safety checks such as 'if the brake pedal and throttle are both fully applied, back off the throttle'.
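The interlock the commenter describes is simple to state in code. A minimal sketch, assuming invented names and thresholds (this is an illustration of the general idea, not Toyota's or anyone's actual firmware):

```python
# Hypothetical brake/throttle interlock of the kind the commenter describes:
# if the driver is braking, the brake wins over whatever the accelerator
# sensor reports. Function name and threshold are invented for illustration.

def resolve_throttle(throttle_pct: float, brake_pct: float) -> float:
    """Return the throttle command to apply, given raw pedal readings (0-100)."""
    BRAKE_OVERRIDE_THRESHOLD = 20.0  # assumed: brake pressed this far overrides

    if brake_pct >= BRAKE_OVERRIDE_THRESHOLD:
        # Interlock: both pedals applied means cut the throttle, on the theory
        # that a stuck or misread accelerator is more likely than a driver who
        # genuinely wants full power while standing on the brake.
        return 0.0
    return throttle_pct

print(resolve_throttle(100.0, 100.0))  # both pedals floored: throttle is cut
print(resolve_throttle(100.0, 0.0))    # normal acceleration passes through
```

The point of a check like this is that it fails safe without needing to know *why* the sensors disagree with the driver's intent.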

  4. "their monitors say she stepped on the gas."

    You expected otherwise?
    The MFG's own data analysis of its legal problem?

    BTW, How do you "step on the gas" in an electric car?

  5. I have also been concerned about the hype of self-driving exceeding the reality; both unmanned aircraft (in popular parlance, 'drones' - almost exclusively remotely operated, augmented quadcopters) and self-driving cars are being hyped well beyond their capabilities by people who ignore or hand-wave away any concerns about safety and instead paint rosy pictures of future potential.
    Ignoring the real science and engineering issues is going to lead to liabilities, deaths, and setting the industry back compared to where it would have been with a realistic understanding of the current state of technology.
    In my experience, some of these 'assisted driving' abilities are too immature to be fielded at this point - for example, 2 years ago I had a rental car with automatic braking; when I went around a sharp curve, it applied the brakes unnecessarily because it saw items off the road. I dialed the feature back as far as I could because of how annoying it was.