SAN FRANCISCO, Sept 28 (Reuters) - Opening statements are set to begin on Thursday in the first U.S. trial over allegations that Tesla’s (TSLA.O) Autopilot driver assistant feature led to a death, and its results could help shape similar cases across the country.
The trial, in a California state court, stems from a civil lawsuit alleging the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour (105 kph), strike a palm tree and burst into flames, all in the span of seconds.
The 2019 crash killed Lee and seriously injured his two passengers, including a then-8-year-old boy who was disemboweled, according to court documents. The lawsuit, filed against Tesla by the passengers and Lee’s estate, accuses Tesla of knowing that Autopilot and other safety systems were defective when it sold the car.
Tesla has denied liability, saying Lee consumed alcohol before getting behind the wheel. The electric-vehicle maker also claims it was not clear whether Autopilot was engaged at the time of the crash.
Tesla has been testing and rolling out its Autopilot and more advanced Full Self-Driving (FSD) system, which Chief Executive Elon Musk has touted as crucial to his company’s future but which has drawn regulatory and legal scrutiny.
Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the “Autopilot” name. A Model S swerved into a curb in 2019 and injured its driver, and jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and that driver distraction was to blame.
The stakes are higher in this week’s trial, and in other cases, because people died. Tesla and plaintiffs’ attorneys jousted in the run-up over what evidence and arguments each side could present.
Tesla, for instance, won a bid to exclude some of Musk’s public statements about Autopilot. However, attorneys for the crash victims can argue that Lee’s blood alcohol content was below the legal limit, according to court filings.
The trial, in Riverside County Superior Court, is expected to last a few weeks.
Except that Tesla does claim its cars are autonomous and self-driving. They’re even among the group pushing to be allowed to sell cars with no driver controls.
Not only should Tesla’s executives be held personally liable, I’d probably also jail whichever regulator let them get away with it.
https://www.tesla.com/en_eu/support/autopilot
They really don’t say that. They advertise with it, sure, but when it actually comes down to it, Tesla admits it’s an assistive feature that requires constant attention.
And you get warnings when you first sign up, as well as reminders when the car detects that you aren’t following the requirements.
So the software doesn’t actually do anything, it just gives the illusion that it does. That sounds safe.
If you are relying on T&Cs as a get-out-of-jail-free card for your safety system, then it isn’t a safety system.
That’s how every safety system works.
You define the necessary conditions in which it works, and guarantee (with testing and validation) that in those conditions it does its job.
Nothing works unconditionally.
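A minimal sketch of that idea in code (every name and threshold here is made up for illustration, not taken from any real system’s safety manual): the system declares the envelope it was validated for and refuses to engage outside it, rather than degrading silently.

```python
from dataclasses import dataclass

@dataclass
class OperatingConditions:
    """Snapshot of the conditions the system was validated against."""
    speed_kph: float
    lane_markings_visible: bool
    weather_ok: bool

# Hypothetical limit, invented purely for this sketch.
MAX_VALIDATED_SPEED_KPH = 130.0

def may_engage(c: OperatingConditions) -> bool:
    """Engage only inside the validated envelope; refuse otherwise."""
    return (
        c.speed_kph <= MAX_VALIDATED_SPEED_KPH
        and c.lane_markings_visible
        and c.weather_ok
    )
```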
The conditions in this case are, in fact, that it is an assistance system and not a safety system, because everybody knows it can’t be relied upon. It probably works >99% of the time, which just isn’t (nearly) good enough for driving.
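To put a rough number on that (the one-decision-per-mile figure is invented purely for the arithmetic): a 99%-per-decision success rate compounds into near-certain failure over any realistic amount of driving.

```python
# Back-of-envelope arithmetic, assuming (for illustration only) one
# safety-relevant decision per mile, each handled correctly with
# independent probability 0.99.
p_correct = 0.99
miles = 1_000

p_trouble_free = p_correct ** miles
print(f"P(no failure over {miles} miles) = {p_trouble_free:.2e}")
# -> P(no failure over 1000 miles) = 4.32e-05
```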
Yeah, I’ve been working in aerospace, automotive, industrial and rail safety for over 20 years. You don’t get to say “this software does thing” and then in the safety manual say “you don’t get to trust that the software will actually do thing”.
Further, when you count the operator as a layer of protection in your safety system, the probability of a dangerous failure is a function of the time between the fault (the software doing something stupid) and the failure (the crash). The shorter that time, the less safe the system is.
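Here’s a toy Monte Carlo sketch of that relationship (the reaction-time distribution is my own assumption, not from any standard): the operator only saves the system if they react before the fault turns into a failure, so the probability of a dangerous failure climbs sharply as the fault-to-failure window shrinks.

```python
import math
import random

random.seed(0)

def p_dangerous_failure(window_s: float, trials: int = 100_000) -> float:
    """Fraction of faults the operator fails to catch before failure.

    Assumes (for illustration only) a lognormal reaction time with a
    ~1.5 s median and a heavy right tail for distracted drivers.
    """
    misses = 0
    for _ in range(trials):
        reaction_s = random.lognormvariate(math.log(1.5), 0.6)
        if reaction_s > window_s:
            misses += 1
    return misses / trials

for window_s in (0.5, 1.0, 2.0, 5.0):
    print(f"{window_s:>4.1f} s fault-to-failure window -> "
          f"P(dangerous failure) ~ {p_dangerous_failure(window_s):.2f}")
```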
Here’s a clue: Musk doesn’t know anything about software safety. Tesla’s lead in autonomous technology has less to do with technical innovation and more to do with cutting corners where they can get away with it.
My guy, the feature is literally named “Autopilot”. By definition they are advertising it as a system that can PILOT the car AUTOMATICALLY. It doesn’t matter what they put in the fine print, they are blatantly describing the system as autonomous.