Critics argue that Tesla has published misleading safety claims about its Autopilot driver-assistance system, and that Tesla cars are actually less safe with Autopilot activated.[183][184][185][186] Tesla’s “public beta” release of Autopilot has been called unsafe and irresponsible, as critical safety features are not thoroughly tested before being released to consumers.[187] The National Transportation Safety Board has criticized Tesla for neglecting driver safety, calling certain Autopilot features “completely inadequate”,[188] and has cited Autopilot as the probable cause of multiple deadly crashes involving Tesla vehicles.[189] A 2019 study found that Autosteer increased the odds of airbag deployment by a factor of 2.4.[190] A 2020 study found that drivers were more distracted when they used Autopilot, and the researchers called on Tesla to take more steps to ensure that drivers stay attentive.[191] Another 2020 study identified significant inconsistencies, abnormalities, and unsafe behavior with Autopilot on three Tesla Model 3 cars.[192] Numerous videos have shown misuse and apparent malfunctions of Autopilot leading to collisions,[193][194] and between 2016 and 2022 at least fifteen fatalities involved the use of Autopilot, nine of them in the United States.[195]
The Center for Auto Safety and Consumer Watchdog have criticized Tesla for what they believe are deceptive marketing practices related to Autopilot.[196] Studies by AAA and the Insurance Institute for Highway Safety have shown the name “Autopilot” to be misleading, causing drivers to think the system is safer than it actually is.[197][198][199] A German court ruled in 2020 that Tesla had misled consumers by using the terms “Autopilot” and “Full Self-Driving”.[52]
As of March 2021, the NHTSA was investigating 23 recent crashes involving Tesla vehicles that may have been operating on Autopilot.[200] Tesla’s Autopilot technology has struggled to detect crossing traffic and stopped vehicles, including stationary emergency vehicles, which has led to multiple fatal crashes.[201][202] (Tesla released an “Emergency Light Detection” over-the-air update to Autopilot in September 2021, and the NHTSA questioned why Tesla did not issue a recall.[203]) Additionally, an MIT study published in September 2021 found that Autopilot is not as safe as Tesla claims and that regular use of the system leads to driver inattention.[204][205]
In February 2022, NHTSA began an investigation into phantom braking at highway speeds after receiving 354 complaints from customers, covering a group of about 416,000 Tesla vehicles.[206] The complaints describe rapid deceleration that can occur repeatedly, without warning, and apparently at random. One owner of a 2021 Tesla Model Y reported to the NHTSA a violent deceleration from 80 mph to 69 mph in less than a second.[207] In May 2022, the NHTSA said in a letter that it had received over 750 complaints about the issue.[208]
In June 2022, NHTSA announced it was investigating 16 instances in which Autopilot shut off less than a second before a collision. Fortune suggested this “might indicate the system was designed to shut off when it sensed an imminent accident”, and pointed out that Musk has frequently claimed that “accidents cannot be the fault of the company, as data it extracted invariably showed Autopilot was not active in the moment of the collision”.[209] Senator Ed Markey praised the NHTSA investigation, criticizing Tesla for disregarding safety rules and misleading the public about its “Autopilot” system.[210] In April 2024, NHTSA released the findings of its three-year investigation into 956 collisions in which Tesla Autopilot was thought to have been in use; it found that the system had contributed to at least 467 collisions, including 13 that resulted in fatalities.