Articles tagged with "Tesla-Autopilot"
Tesla Removed Autopilot. The Data Says Safety Wasn’t Lost - CleanTechnica
The article discusses Tesla's recent removal of Autopilot and Autosteer as standard features in North America, a change the author initially perceived as a potential step backward for safety and a move to push the Full Self-Driving subscription. While Autopilot has been widely regarded as a safety-enhancing feature that reduces driver workload and smooths control, the author emphasizes that such assumptions must be tested against large-scale, independent data rather than driver perception or small datasets. Traffic safety outcomes like fatalities are extremely rare events (about one per 100 million miles), so limited data rarely supports confident conclusions: under the "law of small numbers," small samples produce misleading results dominated by randomness. The author highlights the difficulty of evaluating Autopilot's safety using Tesla's own published statistics, which compare crash rates with and without Autopilot engagement but are not independently verified and lack normalization for important factors such as road type, driver behavior, and exposure context. Since Autopilot…
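To see why rare events defeat small samples, consider a minimal Poisson simulation (not from the article; the one-per-100-million-miles rate is the article's figure, while the fleet mileages and the Poisson model itself are illustrative assumptions):

```python
# Toy model of the "law of small numbers" for rare crash outcomes.
# Assumes fatalities arrive as a Poisson process at the article's
# quoted base rate; the fleet mileages below are made-up examples.
import numpy as np

rng = np.random.default_rng(0)
TRUE_RATE = 1 / 100_000_000  # fatalities per mile (article's figure)

for miles in (10_000_000, 100_000_000, 10_000_000_000):
    # Simulate 1,000 hypothetical fleets with this total exposure.
    counts = rng.poisson(TRUE_RATE * miles, size=1_000)
    est = counts / miles * 100_000_000  # estimated fatalities per 100M miles
    print(f"{miles:>14,} miles: mean estimate {est.mean():.2f}, "
          f"std {est.std():.2f} (true rate = 1.00)")
```

At 10 million miles the estimate is usually exactly zero and occasionally ten times the true rate; only at billions of miles does it stabilize near the true value, which is the author's point about needing large-scale data before making safety claims.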
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assistance-systems, traffic-safety, self-driving-technology, automotive-robotics

TechCrunch Mobility: RIP, Tesla Autopilot, and the NTSB investigates Waymo
The National Transportation Safety Board (NTSB) has launched an investigation into Waymo following reports that its robotaxis illegally passed stopped school buses multiple times in at least two states, adding scrutiny to Waymo's autonomous vehicle operations amid growing regulatory attention. Meanwhile, Tesla made significant moves in its automated driving technology ahead of its quarterly earnings report. Tesla began offering front-seat robotaxi rides in Austin using a fleet of modified Model Y vehicles running an advanced version of its Full Self-Driving (FSD) software, a step toward broader deployment, although human safety operators remain on board and chase vehicles still follow some cars. In a notable shift, Tesla discontinued its basic Autopilot system, which had been standard in all vehicles since 2014, to focus solely on its more advanced, subscription-based Full Self-Driving software. The change comes shortly after Tesla stopped charging a one-time $8,000 fee for FSD in favor of a monthly subscription model. The move appears aimed at increasing…
Tags: robot, autonomous-vehicles, Tesla-Autopilot, Waymo, AI-in-transportation, driver-assistance-systems, robotaxi

Tesla faces scrutiny after US judge flags deceptive Autopilot claims
A California administrative law judge ruled that Tesla misled consumers through its marketing of Autopilot and Full Self-Driving (FSD) features, stating that the terminology falsely implied vehicles could operate autonomously without driver attention. The California Department of Motor Vehicles (DMV) adopted the ruling and gave Tesla 60 days to correct its marketing claims. If Tesla fails to comply, the DMV will enforce a 30-day suspension of the company's license to sell vehicles in California, although factory operations will continue uninterrupted. The DMV emphasized that the action aims to protect consumers by ensuring clear and accurate communication about advanced driver assistance systems. Tesla responded by downplaying the ruling's impact, noting that no customer complaints were filed and that sales in California would continue without disruption. The DMV clarified that its case was based on how a reasonable consumer might interpret Tesla's advertising, not on individual complaints. Meanwhile, Tesla faces additional legal challenges, including a class action lawsuit in California alleging long-term deception about the capabilities of its self-driving…
Tags: robot, autonomous-vehicles, Tesla-Autopilot, self-driving-technology, driver-assistance-systems, automotive-regulation, consumer-protection

Tesla engaged in deceptive marketing for Autopilot and Full Self-Driving, judge rules
An administrative law judge ruled that Tesla engaged in deceptive marketing by giving customers a false impression of the capabilities of its Autopilot and Full Self-Driving driver assistance software. The ruling stems from a long-running case initiated by California's Department of Motor Vehicles (DMV), which accused Tesla of overstating the autonomy of its systems, leading to overconfidence that contributed to numerous crashes and fatalities. The judge agreed with the DMV's request to suspend Tesla's sales and manufacturing licenses for 30 days each but allowed Tesla 90 days to modify or remove misleading language before enforcing these penalties. Tesla has faced multiple investigations from California's Attorney General, the Department of Justice, and the Securities and Exchange Commission over similar allegations of misleading marketing. The company has also settled several personal injury lawsuits related to crashes involving Autopilot. The ruling comes as Tesla advances its Robotaxi service testing in Austin, Texas, where it recently removed safety monitors from its test vehicles, which run different software from the vehicles sold to customers. …
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assistance-systems, Robotaxi, automotive-technology, self-driving-cars

Tesla challenges $243 million verdict in Autopilot death trial
Tesla has filed a motion seeking to overturn, or retry, the $243 million verdict against the company in a lawsuit over a fatal 2019 crash involving its Autopilot system. The case arose after driver George McGee, operating a Tesla Model S with Autopilot engaged, failed to stop at a stop sign and collided with a parked SUV, killing 20-year-old Naibel Benavides Leon and severely injuring her boyfriend. The jury assigned two-thirds of the blame to McGee and one-third to Tesla. Tesla's lawyers argue that the verdict contradicts Florida tort law and due process, emphasizing that McGee's reckless behavior, specifically reaching for his phone at the time of the crash, was the primary cause. In their court filing, Tesla contends that product liability should apply only when a vehicle performs in ways that defy consumer expectations or are unreasonably dangerous, which they claim is not the case here. They warn that upholding the verdict could stifle innovation.
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assistance-systems, product-liability, transportation-technology, automotive-safety

Tesla could have avoided that $242.5M Autopilot verdict, filings show
In a recent federal court case in Miami, Tesla was found partially liable for a fatal 2019 crash involving its Autopilot system, resulting in a $242.5 million jury verdict against the company. The crash occurred when a Tesla Model S with Autopilot engaged failed to brake at an intersection and collided with a Chevrolet Tahoe, killing Naibel Benavides Leon and severely injuring Dillon Angulo, who were standing outside the vehicle. The jury apportioned two-thirds of the blame to the driver and one-third to Tesla. Tesla plans to appeal the verdict, citing significant legal errors and trial irregularities. Newly revealed legal filings show that Tesla had the opportunity to settle the case for $60 million months before the verdict but declined the offer. The lawsuit, filed in 2021 in the U.S. District Court for the Southern District of Florida, focused on the Autopilot system's failure to prevent the crash. Tesla's communications team has been disbanded, and…
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assistance-systems, automotive-technology, legal-issues-in-robotics, vehicle-safety-systems

Miami Jury Finds Tesla Liable For Deadly Crash — Awards $329 Million In Damages - CleanTechnica
A Miami jury found Tesla partially liable for a deadly 2019 crash involving a 2019 Tesla Model S driven by George McGee, who had activated the Autopilot system but was manually accelerating and distracted by searching for his phone when the vehicle ran through a stop-controlled T intersection. The Tesla crashed into a Chevy Tahoe, which then struck two pedestrians, killing one and severely injuring the other. The estate of the deceased sued Tesla, arguing that the Autopilot system failed to slow or stop the car at the intersection. Tesla countered that once the driver manually accelerated, many of Autopilot's safety features were overridden, and placed full blame on the driver's negligence. After a two-week trial, the jury apportioned fault as two-thirds to the driver and one-third to Tesla, awarding $129 million in actual damages plus $200 million in punitive damages, for a total of $329 million. The punitive damages were intended to punish Tesla for allegedly misleading marketing and for deploying Autopilot beyond controlled-access highways.
Tags: robot, autonomous-vehicles, Tesla-Autopilot, self-driving-technology, automotive-safety, AI-in-transportation, driver-assistance-systems

Tesla partly liable in Florida Autopilot trial, jury awards $200M in damages
A federal jury in Miami found Tesla partly liable for a fatal 2019 crash involving its Autopilot driver assistance system, assigning one-third of the blame to Tesla and two-thirds to the driver. The crash occurred when neither the driver nor Autopilot braked in time at an intersection, resulting in the death of 20-year-old Naibel Benavides Leon and severe injury to her boyfriend. The jury awarded approximately $242.5 million in total damages, including punitive damages solely against Tesla. This verdict marks one of the first major legal rulings against Tesla regarding its Autopilot technology, which the company has previously addressed through settlements. Plaintiffs' lead attorney criticized Tesla for marketing Autopilot as suitable beyond controlled-access highways without restricting its use, accusing the company and Elon Musk of fostering overconfidence in the system that endangered lives. Tesla announced plans to appeal, arguing the verdict was legally flawed and that no vehicle in 2019 could have prevented the crash, emphasizing that the…
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assistance-systems, automotive-safety, self-driving-technology, legal-liability

Jury orders Tesla to pay $243M in deadly Autopilot crash case
A federal jury in Miami has found Tesla partly liable for a 2019 crash in Key Largo, Florida, that killed 22-year-old Naibel Benavides Leon and severely injured her boyfriend. The crash occurred when driver George McGee, distracted by a dropped cell phone, ran a stop sign at 62 mph while relying heavily on Tesla's Autopilot system, which failed to warn or brake automatically. The jury ordered Tesla to pay $243 million in damages, marking a rare legal defeat for the company amid its efforts to launch a driverless taxi service. Tesla plans to appeal the verdict, maintaining that McGee's reckless behavior was solely to blame and emphasizing its repeated warnings for drivers to stay attentive. Plaintiffs' lawyers argued that Tesla enabled reckless use of Autopilot by not restricting its operation on unsuitable roads and failing to disengage the system when drivers were distracted. They also accused Tesla of misleading customers through branding and withholding or losing critical crash data, which was later recovered by…
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assist-systems, automotive-safety, semi-autonomous-technology, crash-liability

Tesla partly liable in Florida Autopilot trial, jury awards $329M in damages
A Miami federal jury found Tesla partly liable for a fatal 2019 crash involving its Autopilot driver assistance system, awarding $329 million in punitive and compensatory damages to the plaintiffs. The crash occurred when neither the driver nor Autopilot braked in time at an intersection, resulting in a collision that killed 20-year-old Naibel Benavides Leon and severely injured her boyfriend. The jury assigned two-thirds of the blame to the driver and one-third to Tesla. The verdict marks one of the first major legal rulings against Tesla regarding Autopilot, a technology over which the company has previously defended or settled lawsuits. Plaintiffs' lead attorney Brett Schreiber criticized Tesla for designing Autopilot primarily for controlled highways while allowing its use elsewhere, coupled with Elon Musk's public claims that Autopilot outperforms human drivers. Schreiber argued that Tesla's misleading promotion of the system endangered users and contributed to the fatal crash. Tesla announced plans to appeal the verdict, calling it legally flawed…
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assistance-systems, automotive-safety, self-driving-technology, legal-liability

Tesla partly liable in Florida Autopilot trial, jury awards $200M punitive damages
A federal jury in Miami found Tesla partially liable for a fatal 2019 crash involving its Autopilot driver assistance system. The crash occurred when neither the driver nor the Autopilot system braked in time at an intersection, resulting in a collision with an SUV that killed pedestrian Naibel Benavides Leon and severely injured her boyfriend. The jury assigned two-thirds of the blame to the driver and one-third to Tesla, awarding the plaintiffs $200 million in punitive damages along with compensatory damages for pain and suffering. This verdict marks one of the first significant legal rulings against Tesla concerning its Autopilot technology. The trial lasted three weeks and highlights growing scrutiny over the safety and accountability of driver assistance systems. The driver involved was sued separately, and the case is ongoing, with further developments expected.
Tags: robot, autonomous-vehicles, Tesla-Autopilot, driver-assistance-systems, transportation-technology, legal-issues-in-robotics, automotive-safety

Tesla Autopilot Crash Trial — Days 6 & 7 - CleanTechnica
The Tesla Autopilot crash trial, the first third-party wrongful death case against Tesla, is underway in Miami's federal courthouse, with significant testimony heard on days 6 and 7. On day 6, Tesla technician Michael Callafel testified that he was not qualified to retrieve Autopilot data from the crashed vehicle and had never done so before, admitting that no one in Tesla's service department is authorized to pull Autopilot logs. Callafel also acknowledged that an affidavit he signed, prepared by Tesla's legal team, contained inaccuracies due to his oversight. Tesla driver George McGee admitted to becoming overly comfortable with the Autopilot system, believing it would assist him and prevent accidents, but stated that the system failed to warn him or apply brakes before the crash. Medical testimony focused on Dillon Angulo, the crash victim. Dr. Danielle Horn, a pain management specialist, described Angulo's chronic pain conditions and diminished quality of life, noting that his pain was resistant to treatment and likely…
Tags: robot, autonomous-vehicles, Tesla-Autopilot, automotive-technology, AI-in-transportation, vehicle-safety-systems, self-driving-cars

Tesla Autopilot Crash Trial — Highlights from Opening Days - CleanTechnica
The trial in Florida concerning a fatal crash involving a Tesla vehicle operating on Autopilot began on July 14. The case centers on the claim by the estate of Benavides Leon, a bystander who died in the crash, that Tesla's Autopilot system malfunctioned and was a proximate cause of the incident. Over the first three days, testimonies were heard from emergency responders, a Florida Highway Patrol officer, and an expert statistician, alongside depositions from a Tesla Autopilot firmware engineer. Notably, Corporal David Riso, the lead investigator, testified that Tesla did not provide the autonomous driving data from the vehicle, with a technician claiming the file was corrupted—a statement Riso disputed as untrue. Expert witness Dr. Mendel Singer criticized Tesla's Vehicle Safety Report, highlighting a lack of independent validation and discrepancies in how Tesla counts crashes compared to non-Tesla vehicles. He pointed out that Tesla's data incorrectly lumps all vehicle types together and that the company…
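The normalization critique can be made concrete with a toy calculation (all numbers below are hypothetical, not Tesla's): when a driver-assistance system is engaged mostly on low-crash-rate highways, its raw crash rate looks better than the disengaged rate even if safety on each road type is identical.

```python
# Hypothetical illustration of the normalization problem raised in
# testimony: raw crash rates mislead when engaged and disengaged miles
# are driven on different road mixes. All numbers here are made up.
HIGHWAY_RATE, CITY_RATE = 0.2, 1.0  # crashes per million miles (assumed)

# Millions of miles by road type; engagement skews toward highways.
engaged_miles = {"highway": 9.0, "city": 1.0}
disengaged_miles = {"highway": 1.0, "city": 9.0}

def raw_rate(miles: dict) -> float:
    """Crashes per million miles, ignoring road type entirely."""
    crashes = miles["highway"] * HIGHWAY_RATE + miles["city"] * CITY_RATE
    return crashes / sum(miles.values())

print(f"engaged:    {raw_rate(engaged_miles):.2f} crashes per million miles")
print(f"disengaged: {raw_rate(disengaged_miles):.2f} crashes per million miles")
# The raw rates differ by roughly 3x even though per-road-type safety is
# identical, which is why unstratified comparisons need independent review.
```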
Tags: robot, autonomous-vehicles, Tesla-Autopilot, self-driving-technology, vehicle-safety, automotive-robotics, AI-in-transportation

Florida Judge Denies Tesla Motion For Summary Judgement In Wrongful Death Suit - CleanTechnica
On April 25, 2019, George McGee was driving his 2019 Tesla Model S near his home in Key Largo, Florida, when he failed to slow at a T intersection and crashed into a Chevy Tahoe, killing a bystander and severely injuring another. McGee had activated Tesla's Autopilot and set Traffic Aware Cruise Control (TACC) to 45 mph but manually increased the speed to 62 mph before the crash. He admitted to searching for his dropped cell phone at the time of the accident and reported no visual or audible warnings from the car prior to the collision. The estate of the deceased contends that Tesla's Autopilot malfunctioned and was a proximate cause of the crash, particularly focusing on the system's behavior after McGee manually overrode the TACC speed. Tesla filed a motion for summary judgment arguing it was not legally responsible, claiming that certain Autopilot features, like forward emergency braking, are disabled when the driver accelerates…
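The override behavior Tesla describes can be sketched as simple state logic (a hypothetical illustration of the filing's description, not Tesla's actual implementation; the class and field names are invented):

```python
# Hypothetical sketch of the override logic described in Tesla's motion:
# pressing the accelerator past the TACC set speed overrides the system
# and, per the filing, suppresses certain automated braking features.
# Not Tesla's actual code; names and structure are invented.
from dataclasses import dataclass

@dataclass
class AssistState:
    tacc_set_speed_mph: float   # speed the driver set for TACC
    pedal_speed_mph: float      # speed commanded via the accelerator pedal

    def driver_override(self) -> bool:
        # Accelerating beyond the set speed counts as a manual override.
        return self.pedal_speed_mph > self.tacc_set_speed_mph

    def auto_braking_enabled(self) -> bool:
        # In this sketch, automated braking is disabled during override,
        # mirroring the behavior claimed in the summary-judgment motion.
        return not self.driver_override()

# The scenario from the case: TACC set to 45 mph, driver holding 62 mph.
state = AssistState(tacc_set_speed_mph=45.0, pedal_speed_mph=62.0)
print(state.driver_override())       # True
print(state.auto_braking_enabled())  # False
```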
Tags: robot, autonomous-vehicles, Tesla-Autopilot, self-driving-cars, automotive-safety, traffic-aware-cruise-control, vehicle-automation