Two men died on Saturday night when a 2019 Tesla Model S driving at high speed failed to negotiate a curve on a winding road in Spring, Texas.
Harris County Precinct 4 Constable Mark Herman said in an interview that, based on a preliminary investigation, there was no evidence anyone was at the wheel of the vehicle at the time of the crash.
“Our preliminary investigation is determining – but it’s not complete yet – that there was no one at the wheel of that vehicle. We’re almost 99.9% sure,” the constable said, per the Wall Street Journal.
The NHTSA has launched more than two dozen advanced driver-assistance-related investigations into crashes involving Tesla vehicles amid the company’s autonomous driving push.
The National Transportation Safety Board’s chairman, Robert L. Sumwalt, has also criticized Tesla on a number of occasions for its lack of compliance with regulators regarding self-driving tech.
In a 2020 NTSB meeting, Sumwalt said, “It is foreseeable that some drivers will attempt to inappropriately use driving automation systems. To counter this possibility, in 2017 we issued 2 recommendations to 6 automobile manufacturers. Five manufacturers responded favorably that they were working to implement these recommendations. Tesla ignored us.”
Just last week, Ford’s CEO also took a shot at Tesla on Twitter when announcing his company’s driverless tech, saying “BlueCruise! We tested it in the real world, so our customers don’t have to.”
Despite the recent crash, NHTSA investigations, and criticism from the competition, Tesla’s recent safety report shows its cars are some of the safest on the road.
In the first quarter of 2021, Tesla registered one accident for every 4.19 million miles driven with Autopilot engaged. For comparison, the NHTSA’s most recent data shows there is an automobile crash every 484,000 miles on average in the US.
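Taken at face value, those two figures imply Autopilot miles go several times farther between accidents than the national average. A quick back-of-the-envelope check, using only the mileage numbers cited above, makes the ratio explicit:

```python
# Back-of-the-envelope comparison of the two accident rates cited above.
# Figures: Tesla Q1 2021 (Autopilot engaged) vs. the NHTSA national average.
autopilot_miles_per_accident = 4_190_000  # one accident per 4.19 million miles
us_miles_per_crash = 484_000              # one crash per 484,000 miles (NHTSA)

ratio = autopilot_miles_per_accident / us_miles_per_crash
print(f"Autopilot covers ~{ratio:.1f}x more miles between accidents "
      "than the US average")
# Caveat: this is a rough comparison. Autopilot is engaged mostly on
# highways, where crash rates are already lower than on surface streets.
```

The caveat in the comment matters: the two rates are not measured over the same kinds of driving, so the ratio overstates any like-for-like safety gap.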
Even with a strong safety record, high-profile crashes like this one usually hurt Tesla’s stock.
The bearish fatal crash news also comes as Cathie Wood, one of Tesla’s biggest supporters, continues to sell shares.
When Tesla beamed out a prototype version of its “Full Self-Driving” (FSD) technology to select Tesla owners in October, videos of the driver-assistance system fumbling normal traffic situations – from failing to follow road rules to nearly steering into parked cars and cement barricades – flooded the web.
Now Elon Musk wants any Tesla owner who has paid for FSD to have access to the beta next month. But clips cropping up online continue to cast doubt on whether the technology is safe enough to test on public streets.
A March 12 video posted by YouTube user AI Addict shows a Model 3 running FSD beta version 8.2 clumsily navigating around downtown San Jose at dusk. With FSD switched on, the vehicle nearly crashes into a median, attempts to drive down railroad tracks, and almost plows down a row of pylons separating the road from a bike lane. All of those dicey situations were avoided only because the driver quickly took over control.
In another clip posted March 18, Model Y owner Chuck Cook tests the beta’s ability to make unprotected left turns. The software performs admirably a few times, waiting until a break in traffic to cross the three-lane road. More than once, however, Cook has to slam on the brakes to avoid coasting into oncoming traffic. And on his last go, the Tesla nearly drives headlong into a pickup truck with a boat in tow.
FSD testing videos have become an entire genre on YouTube. Many of them depict cars comfortably navigating lane changes, four-way stops, and busy intersections. Yet the buggy clips illustrate the potential dangers of letting amateur drivers experiment with prototype software on public roads.
Tesla is using its owners as “guinea pigs for the technology,” Jason Levine, executive director of the Center for Auto Safety, a consumer advocacy group, told Insider. “And what’s much more concerning, quite frankly, is they’re using consumers, bystanders, other passengers, pedestrians, and bicyclists as lab rats for an experiment for which none of these people signed up.”
FSD – a $10,000 add-on option – is a more advanced version of Tesla’s Autopilot, its standard driver-assistance feature that enables cars to maintain their lane and keep up with highway traffic using a system of cameras and sensors. FSD currently augments Autopilot with features like self-parking, traffic light and stop sign recognition, and the ability to take highway on-ramps and exits.
The limited beta software in question adds a capability critical for any system that aims to be called fully self-driving: the ability to navigate local streets, which, as opposed to highways, have a much more complex driving environment that includes left-hand turns across traffic, pedestrians, cyclists, and the like.
Even before it introduced the FSD beta last fall, Tesla faced scrutiny over Autopilot and its potential for abuse. The National Highway Traffic Safety Administration confirmed earlier this month that it is investigating Autopilot’s role in 23 recent crashes, including several in which Teslas barreled into stopped emergency vehicles. Over the years, numerous videos have surfaced on social media of drivers sleeping with Autopilot turned on or otherwise misusing the feature.
To make things safer, Levine said, Tesla could begin using vehicles’ internal cameras to monitor driver attention as many other carmakers do. Currently, Tesla only monitors whether a driver’s hands are on the steering wheel, while other systems, like GM’s Super Cruise, track a driver’s eyes to make sure they are paying attention to the road.
Changing the names of Autopilot and FSD – which are misleading since neither technology is autonomous and both require constant driver attention – would be a good start as well, Levine said.
“The insistence on utterly hyperbolic description really undermines any sort of good-faith effort to present this technology in a way that is going to not present an unreasonable risk,” he said.
For Tracy Pearl, a law professor at the University of Oklahoma who researches self-driving technology, the main problem isn’t so much the quality of Tesla’s driver-assistance systems, but rather the way drivers interact with them.
Although advanced driver-assistance suites like Tesla’s could make cars safer when used properly, research has shown that drivers on the whole don’t understand their capabilities and limitations, Pearl said. Moreover, drivers’ attention tends to wander when those features are switched on. Tesla exacerbates these issues by marketing its tech in ways that overstate the cars’ abilities, but the information gap between manufacturers and drivers extends to other carmakers as well, she said.
Problems with the way driver-assistance systems are marketed and the way drivers interact with them are heightened where a beta system is concerned, she said.
“I think calling it Full Self Driving is not just deceptive, I think it is an invitation for people to misuse their cars,” she said.
Tesla, though it consistently claims that self-driving cars are right around the corner, acknowledges that both Autopilot and FSD require constant driver attention and aren’t fully autonomous. Still, it’s much more blunt with regulators than it is with the general public.
In a series of emails to California’s Department of Motor Vehicles in late 2020, a Tesla lawyer said that FSD can’t handle certain driving scenarios and that it will never make cars drive themselves without any human input.
Tesla also makes an effort to inform users about the risks of using a system that isn’t completely ready for prime time. In a disclaimer beta testers received with the October update, Tesla urged drivers to use the software “only if you will pay constant attention to the road, and be prepared to act immediately.
“It may do the wrong thing at the worst time,” Tesla said.
Tesla did not respond to a request for comment for this story.
Tesla was asked by the National Highway Traffic Safety Administration to recall around 158,000 vehicles over faulty touchscreens, the agency said in a letter to the company Wednesday.
The NHTSA said the media control units on certain Tesla vehicles failed after their memory ran out, causing issues with the backup camera, defogging and defrosting settings, Autopilot system, and turn signals.
The issue impacted certain 2012-2018 Model S vehicles and 2016-2018 Model X vehicles, which used NVIDIA-based media control units whose flash memory failed after just five to six years on average.
Vice News first reported on the issue in October 2019, prompting the NHTSA to open an investigation in June 2020.
The National Highway Traffic Safety Administration sent a letter to Tesla on Wednesday asking the company to recall around 158,000 vehicles over faulty touchscreen hardware.
The agency said it was “investigating a potential safety-related defect concerning incidents of media control unit (“MCU”) failures” that had resulted in problems with the backup camera, defogging and defrosting settings, Autopilot, and turn signals.
The issue, which stemmed from the MCUs failing after exceeding their storage capacity, impacted certain 2012-2018 Model S and 2016-2018 Tesla Model X vehicles.
The touchscreens on those models are powered by an NVIDIA processor that stores data in an attached “flash memory device.” But those devices have finite storage capacity, and according to the NHTSA’s investigation, once they filled up – which happened after just five to six years, on average – they shut down, causing the MCUs to fail and creating other safety issues.
The MCU failures resulted in the rearview/backup camera screen going “black,” an inability to control defogging and defrosting settings, and the loss of some Autopilot alerts and turn signal functionality, which the agency said could “increase the risk of crash.”
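The failure mode the NHTSA describes is a simple budget problem: a fixed-size flash device accumulating data every day will run out of space on a predictable timeline. The sketch below is purely illustrative, not Tesla's firmware; the capacity and daily-write figures are hypothetical, chosen only because they land near the article's five-to-six-year failure window.

```python
# Illustrative sketch of capacity exhaustion on a fixed-size flash device.
# Both constants are hypothetical, NOT taken from Tesla's actual hardware;
# they are picked to mirror the roughly five-to-six-year timeline above.

FLASH_CAPACITY_MB = 8 * 1024  # hypothetical 8 GB eMMC flash device
DAILY_LOG_MB = 4              # hypothetical: ~4 MB of data retained per day

def days_until_full(capacity_mb: int, daily_mb: int) -> int:
    """Full days of writes before the device runs out of space."""
    return capacity_mb // daily_mb

days = days_until_full(FLASH_CAPACITY_MB, DAILY_LOG_MB)
print(f"Flash full after {days} days (~{days / 365:.1f} years)")
```

With these assumed numbers the device fills after about 2,000 days, i.e. roughly five and a half years, which is why a fleet of cars sold over several model years would start failing in a wave rather than all at once.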
The NHTSA said its Office of Defects Investigation had “tentatively concluded that the failure of the media control unit (MCU) constitutes a defect related to motor vehicle safety.” While the letter doesn’t formally require Tesla to order a recall, the automaker must submit additional justification if it decides not to, and the NHTSA can still take further action if it isn’t satisfied with Tesla’s response.
Vice News originally reported on the issue in October 2019, citing a Tesla repair expert who said: “When this burns out, you wake up to a black screen [in the car’s center console]. There’s nothing there. No climate control. You can generally drive the car, but it won’t charge.”
The NHTSA said it opened its own investigation on June 22, 2020.