It has happened: a class action lawsuit has been filed regarding Tesla Autopilot, Enhanced Autopilot, and Full Self-Driving Capability. The plaintiffs argue that Tesla misled buyers with deceptive marketing that made them think their cars would be better able to drive themselves.
“The lawsuit was filed in the Northern District of California, Case No. 3:22-cv-05240 by Plaintiff Briggs Matsko of Sacramento, California,” an email to CleanTechnica states.
Tesla Statements About Present Tech and Forecasts About the Future of Autopilot/Full Self-Driving
The email I received about the news states: “The complaint alleges that since at least 2016, Tesla has misleadingly and deceptively marketed its supposed autopilot and full self-driving technology as either already fully functional or just around the corner. As Tesla states in a video first published on its website in October 2016 that purports to show a Tesla car driving itself:
‘The person in the driver’s seat is only there for legal reasons. He is not driving anything. The car is driving itself.’
“The New York Times would later report that the video was doctored to exclude, among other things, the Tesla crashing into a barrier.”
I would like to separate this part from the rest of the argument for a moment. Indeed, Tesla — like most companies — highlighted one of its products/features in a way that made it seem a bit better than it was. I’ve seen other companies do it with driver-assist technology as well. However, did Tesla do it in an extreme way that truly misled buyers? Well, I think that’s a matter for the judge or jury to decide. The line ‘The person in the driver’s seat is only there for legal reasons. He is not driving anything. The car is driving itself.’ does seem a bit damning, since the car truly did need human supervision except in very limited cases, but I am not aware of how these kinds of statements, balanced by other statements and disclosures, are parsed legally.
Did buyers truly think their cars would be able to fully drive themselves at the time of purchase based on Tesla marketing? Given the various disclaimers on the website, in the car’s Autopilot settings, when engaging Autopilot, etc., I have a hard time believing that people were misled. Though, I also have a hard time understanding how so many people don’t consider Donald Trump a total con man. It’s hard to understand and believe. So, perhaps some buyers genuinely felt misled regarding Tesla’s self-driving capabilities because of things like the video above or the promise of a coast-to-coast, LA-to-NYC, fully autonomous road trip (something Elon Musk said Tesla would do in 2017).
There’s a different matter of future capabilities, including missed targets regarding a future that is now past. When I bought my Tesla Model 3 in August 2019, a big part of the purchase decision was that the car’s autonomous features were supposed to keep getting much better, the cost of the “Full Self Driving” suite would go up accordingly, and it was smarter to buy the car and this software package sooner rather than later. In particular, the last I had heard, Elon Musk was adamant that cars with this package would have “feature complete” Full Self Driving (FSD) by the end of the year. This had long been the goal, and Elon was convinced Tesla would achieve it. What that meant was that the car would be able to drive from address A to address B on its own, but would not be perfect at doing so and thus would still need human supervision until the system got much better than a human at making such drives. I definitely didn’t expect my car to be robotaxi capable by 2020, but I definitely thought it would have this feature-complete driving ability by then based on what Elon had said. He was sure they’d be at that stage by then.

It ended up that the Tesla Autopilot team (“Tesla Autopilot” is the broad umbrella term for all of Tesla’s semi-autonomous driving features, and a robust team of experts makes up the core Autopilot team) decided that they had hit a “local maximum” with their line of development and couldn’t achieve their end goal using the method they had been using. So, they rewrote the code in order to go another way — software-wise — which took several months. That new method also seemed harder than expected, and new targets for the same capability by the end of 2020 came and went. I got this feature-complete FSD in my car in late 2021 after passing a fairly difficult “safe driving” test (Safety Score). It took several more months before most others who had bought FSD got the features. In fact, many still don’t have FSD (beta).
Whereas Elon had predicted robotaxi-capable Teslas would be on the road in 2020, then 2021, then 2022, we still aren’t close to that.
Were Elon Musk and Tesla misleading buyers about the future capabilities of Tesla cars with FSD all through this period? I think it’s hard to argue they weren’t. However, were there disclaimers that they didn’t know for sure when they’d achieve certain capabilities? Yes, there were. Were Elon and team misleading people on purpose, or did they just have too much faith in their approach and their skills? I definitely believe it was the latter, but I know many people think it was the former, and I think I see why they believe that. Does any of this tell us who will win the class action lawsuit? Well, I can’t tell. I’m not a lawyer, and I definitely do not have expertise in this area of the law.
Some final notes on my story: I do not feel ripped off by this series of events. Again, I believe Elon expected to hit those targets that they missed. Also, as new features have been added, the cost of the FSD package I have has gone up. I bought it for $6,000, while it now costs $15,000. That’s $9,000 of appreciation. (That said, it reportedly doesn’t really add much of a premium on the used car market, and the “take rate” — the percentage of new Tesla buyers who buy the package — has reportedly dropped.)
There is a separate matter here. There were many videos of Tesla cars driving themselves while the driver slept — or “slept” — or while the driver was in the back seat or passenger seat. These kinds of videos may have misled many buyers, but I don’t recall ever seeing Tesla or Elon promote or endorse such videos.
The email about the lawsuit mentions a couple of investigations. “People have suffered fatal and other serious injuries as a result of Tesla’s autopilot and self-driving technology, triggering investigations by the National Highway Traffic Safety Administration, the National Transportation Safety Board, and other regulators.
“On July 28, 2022, the California Department of Motor Vehicles filed an accusation against Tesla for making statements that are untrue or misleading about vehicles equipped with its autopilot and FSD technology. They are seeking to suspend or revoke Tesla’s vehicle dealer and manufacturing licenses and potentially require Tesla to pay restitution.”
I haven’t yet seen proof that Tesla Autopilot or FSD are less safe than not using them, and Tesla has presented evidence (though a bit limited and one could say incomplete) that Teslas with Autopilot on are safer. In any case, I’m not sure what relevance a couple of open investigations have for this class action lawsuit. Investigations can be started for a variety of good and nefarious, or at least misguided, reasons, and until the investigations conclude, it doesn’t seem logical — especially legally — to jump to conclusions.
Current State of Tech
The last point from the plaintiffs: FSD currently sucks. “The lawsuit filed today alleges that Tesla has yet to produce a fully self-driving car. Tesla owners receiving the latest ‘updates’ to Tesla’s Autopilot software and FSD beta software have reported myriad problems, such as cars having difficulty making routine turns, running red lights, and steering into oncoming traffic. There have also been numerous collisions involving Tesla’s purportedly cutting-edge software, including vehicles crashing at high speeds into large stationary objects such as emergency vehicles and an overturned box truck.” In other words, buyers were sold the FSD suite on the premise that it could do things well that it actually struggles to do, and sometimes even does dangerously.
I’m again not so sure what kind of legal jeopardy actually exists here. There are plenty of statements from Tesla that FSD is not perfect and needs to be monitored carefully. In fact, Tesla has prioritized drivers who get good Safety Scores for access to the features. There are no claims I’m aware of that the cars can truly drive themselves from point A to point B unsupervised or that issues don’t crop up on many drives. “As alleged in the complaint, people have relied upon the representations of TESLA that the self-driving capabilities are completely safe when TESLA knew they had many problems,” Joe Cotchett, a partner at Cotchett, Pitre & McCarthy, says. I don’t recall any claim that Tesla’s self-driving capabilities are completely safe, but maybe I missed that. It would be interesting to see more of the evidence the legal team uses, though.
What Will Happen With This Class Action Lawsuit Against Tesla?
I am not at all surprised this class action lawsuit was initiated. I have expected something like this (though, honestly, something a little bit different) for the past couple of years. Many Tesla customers are disgruntled, and many have other reasons for criticizing Tesla and trying to harm it. But I have no idea who will win this case. I would assume Tesla had enough disclaimers and warnings in place, but Elon is also a very “shoot from the hip” guy when it comes to statements on this topic, and he has a tendency to push the edge of risk. Legal cases like these are often about legalese and parsing words or actions that may not seem like the biggest deal to a layperson. We’ll see.