Tesla wants owners to pay $1,500 for hardware they thought they already had

A white Tesla Model S is pictured at a Tesla facility in Littleton, Colorado.
One long-time Tesla owner felt like the company is “screwing over its earliest supporters.”

  • Tesla launched a subscription option for its Full Self-Driving driver-assistance feature.
  • Tesla said in 2016 that all cars moving forward would have the hardware for Full Self-Driving.
  • But some owners are being asked to pay $1,500 to upgrade their cars’ computers.

Tesla launched a long-awaited subscription option for its Full Self-Driving feature this month, slashing the system’s cost from a hefty $10,000 one-time payment to $199 per month.

The subscription, which Elon Musk had been promising for months, is huge for some Tesla owners as it allows them to test out Full Self-Driving without blowing the college fund. Moreover, buyers can cancel at any time if they don’t think the feature – which automates some driving tasks but doesn’t make cars autonomous – is worth the cost.

But there’s an expensive catch that’s peeving some long-time Tesla owners.

People who bought their Tesla before the middle of 2019 need to fork over $1,500 to upgrade their vehicle’s computer if they want to subscribe. That directly contradicts the automaker’s 2016 announcement that all vehicles built from that point on would come equipped with the hardware to run Full Self-Driving.

“It feels like Tesla is screwing over its earliest supporters,” Arjun, who bought a Model 3 when it launched in 2018 and described himself as a “long-time Tesla supporter, stockholder, and fan,” told Insider. “We are not asking for a quick buck or a discount, we are just asking for the hardware we were told came preinstalled on our vehicles.”

Arjun, who asked that Insider not use his full name, said it was a “huge selling point” that his Model 3 came with Full Self-Driving hardware, given that he intended to purchase the software at some point down the line. He called Tesla’s move a “blatant bait and switch.”

Other Tesla owners in similar situations aired their grievances on social media.

“Would be nice to get a cheaper installation or something, considering we were helping them when dying was a real possibility for Tesla. However, I expect nothing,” one Reddit user said.

“I have a 2018 M3 and I’m furious,” another person said, referring to the Model 3 sedan. “I’d love to see some legal action taken.”

Some also lamented that in order to subscribe they have to buy Autopilot, the company’s driver-assistance system that used to cost $3,000 but became standard on all Teslas in 2019.

Tesla didn’t return a request for comment.

When Tesla launched its latest-generation Full Self-Driving computer in 2019, it began offering free upgrades to owners who had paid full fare for the feature. But those who held out for the subscription are on their own.

Tesla has a shaky history of delivering on its automated-driving tech. Musk has been saying since at least 2016 that full autonomy is right around the corner. He promised that by 2020, owners would be able to generate passive income by turning their cars into robotaxis.

But Full Self-Driving is still far from living up to its branding. The most advanced beta version of the software – currently in the hands of a couple thousand Tesla owners – requires full driver attention and still has major flaws.

Arjun said Tesla should cover the hardware-upgrade cost for vehicles the company claimed had the right computers to begin with.

“I love my Model 3 and continue to believe in the tech behind Tesla, but they’re no longer a start up and they have to behave like a company who stands behind their word,” he said. “I believe Tesla should waive the fee/absorb the cost, and then I’d be happy to try out the service. Until then, I’ll stay on my obsolete hardware.”

From swerving into a median to narrowly missing poles, videos of Tesla’s latest Full Self-Driving update don’t inspire much confidence

The interior of a Tesla driving down the highway.

  • Tesla launched the latest version of its Full Self-Driving beta after a long delay this month.
  • Beta testers immediately started posting clips of the driver-assistance system.
  • The software is impressive and advanced, but still gets drivers into dangerous situations.

Earlier this month, Tesla rolled out a long-awaited update to its Full Self-Driving software for beta testers. It’s impressive – but it still doesn’t make cars autonomous.

The electric-vehicle maker first beamed out access to the pre-production tech in October, and it’s now in the hands of a couple thousand loyal Tesla owners. It takes Tesla’s existing driver-assistance system, which is mainly suited for predictable highway driving, and adds the ability to automate driving tasks on more complicated non-highway streets.

Videos of the new-and-improved software in action show that it can impressively navigate some tough driving situations, but there are plenty of dangerous flaws and glitches too.

In one clip, a Tesla confidently handles a tight, unmarked road with an oncoming car. The computer does pretty much exactly what a human would do: slow down and pull over to let the oncoming car go first, then pull forward once it’s clear that the other driver is giving right of way.

Another clip shows the system navigating stop-and-go traffic.

A third shows that it can see stop signs and make turns on dark – albeit empty – city streets. Some videos also show cars stopping for pedestrians and other vehicles.

But the system still struggles with utterly basic driving tasks, putting drivers and bystanders in dicey situations. In one clip documenting a drive in downtown San Francisco, the car drunkenly swerves into a striped median, forcing the driver to take control.

In the same video, the car stumbles through a left turn and nearly oversteers into a parked car.

In a clip set in Chicago, the car slowly creeps through intersections, comes to random stops, and only notices a road closure at the last second. A bunch of orange barricades is something any average human driver would recognize before actually attempting a turn.

All of these dangerous hiccups show just how far Tesla is from replicating human driving. But one particularly alarming clip out of Seattle takes the cake.

In the nighttime video, the beta fails to recognize the massive concrete columns supporting the city’s monorail – and the car nearly steers into them twice in an attempt to change lanes.

If a highly automated car should be able to do one thing, it’s recognizing large stationary objects and avoiding them. But it appears that the car had no idea the pillars were even there, judging by the visualization displayed on the center screen.

The people in the car wonder whether the failure is a result of Tesla shifting to a camera-only system that doesn’t use radar. And that’s certainly a possibility. Car companies, Tesla included, have relied on radar for years for features like emergency braking and cruise control. But Tesla in May decided to stop using the sensors and take them out of its future cars.

Tesla has adopted a fast-and-loose approach to its automated-driving tech that other automakers aren’t taking. And safety advocates have taken issue with Tesla’s strategy of having amateur drivers test unproven technology on public roads. Pedestrians, cyclists, and other drivers didn’t sign up to be subjects in this lab experiment, they argue.

But the company is under mounting pressure to deliver a final version of Full Self-Driving to customers, who have shelled out up to $10,000 over the years for the add-on under the promise that it would eventually enable Teslas to drive themselves. It’s increasingly looking like that’s not happening anytime soon.

It’s not just Tesla. Experts say the entire industry is struggling to safely introduce automated features to the masses.

Ford BlueCruise advertises “hands-free driving.” Experts say that gives drivers the wrong idea.

  • Tesla often catches heat for the fast-and-loose way it markets Autopilot.
  • But virtually all automakers are caught up in a messy transition to automated driving.
  • Better education, marketing, and engineering can make roads safer as tech takes over, experts say.

When a fiery Tesla Model S crash killed two people in April – with nobody behind the wheel, officials said – Elon Musk’s carmaker came under fire once more over the risks of its Autopilot tech. It wasn’t the first time, and it likely won’t be the last.

Over the years, critics have called out Tesla for the misleading and potentially dangerous way it brands its driver-assistance features, Autopilot and Full Self-Driving Capability, neither of which makes Teslas autonomous. And a growing number of high-profile crashes and countless viral videos show that even if Tesla doesn’t endorse reckless driving, its cars certainly allow it to happen.

But Tesla isn’t alone. The messy transition to automated driving is upon us, and the entire auto industry has work to do to make roads safer. Advanced driver-assistance systems aim to make driving safer and more comfortable with features like lane centering, blind-spot monitoring, and adaptive cruise control. Still, their rollout has been far from perfect, experts say.

Plenty of other automakers give drivers an imperfect impression of what their cars can do. General Motors and Ford market their systems as “hands-free driving” on approved stretches of highway. And although the companies warn drivers to keep their eyes on the road at all times, Missy Cummings, a Duke University engineering professor who studies automation, says “hands-free driving” is an invitation for people’s attention to wander.

“The message that you are sending to consumers when you say ‘hands-free’ – even though they don’t mean this – is ‘attention-free,'” Cummings told Insider. “There is a lot of confusion already in the customer’s mind about what cars are capable of. And by endorsing ‘hands-free,’ you are only going to see more distracted behavior.”

Representatives from Ford and GM pushed back, telling Insider that their vehicles’ internal cameras monitor driver awareness, while various alerts ensure they stay engaged.

Ford BlueCruise.

According to a 2018 AAA survey, 40% of drivers expect that systems with names like Autopilot, Nissan ProPilot, and Volvo Pilot Assist enable cars to drive themselves, despite there being no such car on the market. Today’s most advanced driver-assistance systems can reliably do things like keep up with traffic, maintain a lane, and park automatically. Still, they require full driver supervision in case something goes wrong, and they can’t handle more complex driving tasks. Lower-tier systems have safety features like collision detection, lane-departure warning, and blind-spot monitoring.

It’s more than just marketing and naming. A larger problem is that drivers don’t fully understand what their cars can and can’t do – regardless of their features.

It’s a given that practically no owners read the manual, where most detailed information about safety features resides. But research has also shown that many car salespeople don’t grasp driver-assistance tech enough to adequately educate buyers. Some even spread misinformation.

A 2021 Cadillac Escalade SUV with General Motors’ Super Cruise hands-free driving assistance.

Tracy Pearl, a law professor at the University of Oklahoma who researches self-driving technology, says the education gap creates two worrisome trends. More drivers will abuse systems like Autopilot by not paying attention to the road, while others won’t get the full benefits of advanced features because they either don’t trust the technology or don’t know how to use it.

“What we need are people in that middle category who are going to be steely-eyed realists about what exactly the system is capable of and who are willing to learn how to use their cars safely,” Pearl said. Expanding that group will be the key challenge facing automated driving in coming years, she continued.

Car companies also must improve the design of their systems, says Bryant Walker Smith, a professor at the University of South Carolina who coauthored the global standards for driving automation. Especially pressing areas include what happens when a system needs to disengage and how cars monitor driver attention.

A Tesla on Autopilot.

GM’s Super Cruise, for example, uses cameras to ensure drivers’ eyes are looking forward. That’s more effective than some other strategies but remains an imperfect way of measuring attentiveness, Smith said. More generally, car companies need to think about balancing safety and convenience features to reduce the risk of crashes without making drivers lazy or degrading their driving skills, he said.

Ultimately, Smith said, the federal government should take more action to better understand driver-assist systems, standardize them, mandate the most life-saving features, and hold automakers to a higher standard. Automated vehicle policy, he says, represents a huge opportunity to make roads safer in a country where some 40,000 people die in motor vehicle crashes annually, and major restrictions on conventional vehicles are unlikely.

Advanced driver-assist systems are a potential game-changer for road safety. But with the way things are going, experts say we could start to see more crashes as the complex technology gets into the hands of more undereducated drivers.

“I think the problem is going to get worse before it gets better,” Pearl said.

Tesla will now monitor drivers via in-car cameras to make sure they’re paying attention when Autopilot is on

Tesla CEO Elon Musk.

  • Tesla cars will now monitor drivers who use Autopilot via in-car cameras.
  • The cameras, above the rearview mirror, will check that drivers are paying attention while using Autopilot.
  • Previously, Tesla used sensors in the steering wheel to check drivers were paying attention.

Tesla cars will now monitor drivers who use Autopilot through in-car cameras, TechCrunch reports.

Tesla will activate the cameras, located above the rearview mirror, in Model 3 and Y cars to check that drivers are paying attention to the road while using Autopilot driver assist, it said in a message to drivers.

Until now, Tesla cars relied on steering-wheel sensors that detected whether drivers were holding on, but many drivers have shared their tricks to fool the sensors and go hands-free.

“The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled,” Tesla said in a release note to its drivers, which was shared by one Tesla owner on Twitter.

Another Twitter user shared a photo of the same update for their vehicle.

Last year, Tesla activated its cabin-facing cameras installed in its Model 3 and Y vehicles in a software update. The camera, if approved by the driver, would “help engineers develop safety features and enhancements in the future,” Tesla said in its release notes at the time.

Tesla CEO Elon Musk had previously rejected using cameras and infrared sensors to track drivers’ eye movements, saying that eye-tracking functions were ineffective.

Tesla has faced criticism over the safety of its self-driving features. The National Highway Traffic Safety Administration (NHTSA) has opened at least 27 investigations into Tesla car crashes, and Autopilot has been involved in at least three fatal crashes since 2016, Reuters reported.

Tesla did not immediately respond to a request for comment.

Watch police pull over a Tesla driver they say was asleep at the wheel going 82 mph with Autopilot switched on

Elon Musk, Tesla CEO.

  • A Tesla operating under Autopilot was stopped by police on Sunday, officials said Tuesday.
  • Police claim the Tesla driver was asleep at the wheel while driving 82 mph.
  • The driver told the deputy that he was tired, but denied being asleep.

Police on Sunday pulled over a man from Illinois who they say fell asleep behind the wheel of his Tesla, which officers say was operating under Autopilot.

A Kenosha County deputy said he saw the 38-year-old man with his head down and “not looking at the road,” according to a sheriff’s department statement posted on Facebook on Tuesday. The department said it appeared that the man was sleeping.

The man was driving a 2019 Tesla on Interstate 94. The car was operating under Autopilot, the company’s driver assistance feature, police said.

The deputy switched on his vehicle’s lights and siren as he followed the Tesla for around two miles at 82 mph through Kenosha County, according to the statement. The deputy was trying to pull the driver over, but the driver didn’t initially notice, the statement said.

Eventually, the driver noticed he was being stopped when the deputy drove alongside him, and he pulled over.

After the police pulled him over, the driver told them he was “a little bit tired” but denied being asleep.

The deputy told him: “I understand you have Autopilot … but you’re not able to make that conscious decision to stop in a hurry.”

The man was issued a traffic citation for inattentive driving, the sheriff’s department said.

In an interview with Fox 6 News on Wednesday, the deputy said he spotted on the Tesla’s front screen that Autopilot was engaged. He also said that officials attempted to pull over the same vehicle twice in February and once in August last year – two of these reports involved claims the driver was asleep, he added.

“Never let technology take over so that you take your hands and eyes off the road,” the deputy said in the interview.

Tesla’s website states that the current Autopilot system doesn’t make the vehicle autonomous, and that the driver should be active at the wheel. Experts told Insider that although Tesla says its cars on Autopilot are less likely to crash than average vehicles, there are major concerns about the safety of the feature.

A man arrested for riding in the backseat of his driverless Tesla got out of jail, bought a new one, and did it again

California Highway Patrol tows away Param Sharma’s Tesla.

  • A San Francisco man said he’ll keep riding in the back seat of his Tesla after getting arrested for it, KTVU reports.
  • Param Sharma said he is “very rich” and will keep buying Teslas as his cars are impounded.
  • Tesla sells a feature called Full Self-Driving Capability, but it doesn’t make cars autonomous.

A San Francisco man who was arrested for riding in the back seat of his Tesla as it drove on the highway says he’ll keep pulling the stunt after being released from jail – and he’ll keep buying more cars as they get impounded.

After getting booked on two counts of reckless driving, Param Sharma arrived for a Wednesday interview with Bay Area news station KTVU riding in the back seat of a Tesla again. But it wasn’t the same car that California Highway Patrol pulled him over in.

The day after being released from jail on Tuesday, Sharma told the channel, he bought a new Tesla Model 3 because his other one was impounded. Also, he is “very rich,” he told KTVU.

“I have unlimited money to blow on Teslas. If you take away my Tesla, I will get another Tesla. That’s how it works,” Sharma said.

The California Highway Patrol said Tuesday it had arrested Sharma for reckless driving and disobeying a peace officer. The arrest came after videos circulated online of Sharma riding down the highway in the back seat. He had been cited for the same offense in April, police said.

The incident is just the latest to spark scrutiny around how some Tesla drivers abuse the company’s driver-assistance features. The National Highway Traffic Safety Administration has opened investigations into more than two dozen Tesla crashes, including a fatal incident in April that police said occurred with nobody in the driver’s seat.

Sharma told KTVU he bought a Tesla with the Full Self-Driving package, but seemed overly confident in the feature’s abilities. The $10,000 advanced driver-assistance system – a step up from the standard Autopilot feature – enables a car to automatically change lanes, navigate highway on-ramps and exits, and recognize stop signs and traffic lights.

But it does not make Teslas autonomous, and the company says drivers need to pay full attention when using it.

“It’s like a living room back here. I’m relaxing in luxury while Elon Musk chauffeurs me,” he told KTVU.

Even in its most advanced iteration, the Full Self-Driving system has major flaws. Tesla tells the software’s beta testers to be vigilant, as the feature may “do the wrong thing at the worst time.” In tests, Consumer Reports said Full Self-Driving performed inconsistently and sometimes disengaged without warning.

Still, Sharma said he has no plans to stop riding in the back seat of his car, despite the clear dangers the stunt poses to pedestrians and other drivers.

“I feel like by mid-2022 the backseat thing will be normal. And I think right now people are just taking it out of proportion,” he said.

Autopilot couldn’t have been engaged during fatal Tesla crash, NTSB says

The remains of a Tesla vehicle are seen after it crashed, killing two people, in The Woodlands, Texas, on April 17, 2021.

  • The NTSB on Monday released a preliminary investigation into a fatal Tesla crash in April.
  • The agency said its tests indicate Autopilot cannot be used on the road where the crash happened.
  • Local police initially said there was no driver at the wheel when the accident occurred.

The National Transportation Safety Board on Monday released preliminary findings from its investigation into a fatal Tesla crash in April, offering new details about the incident but leaving key questions unanswered.

The agency said security camera footage from the owner’s Houston-area residence showed him entering the driver’s seat of the 2019 Model S. The passenger entered the front passenger’s seat. Video then showed the car traveling roughly 550 feet before skipping over the curb and crashing into a tree.

Autosteer, a key feature of Tesla’s Autopilot driver-assistance system, could not have been enabled on the stretch of road where the incident occurred, the agency said, confirming Tesla CEO Elon Musk’s assertions in the wake of the crash.

In tests, the NTSB said it was able to switch on Traffic-Aware Cruise Control – which can maintain speed and distance from other cars – but not Autosteer, the automatic steering feature that makes up the rest of Autopilot’s main capabilities.

The Texas incident attracted a large amount of media attention when local police said they believed nobody was driving the sedan when it barreled off the road and burst into flames. First responders found the victims’ bodies in the front passenger’s seat and the back seat of the car, according to police and the Harris County Fire Marshal’s Office.

But Tesla has disputed that initial characterization of the events. Musk tweeted that the car did not have Autopilot engaged and that the system can’t function on the street where the crash occurred because it does not have lane markings. A Tesla executive said on an April conference call that the company believes someone was driving the car when it crashed.

The NTSB’s investigation is still ongoing, and the agency said its report may be supplemented or corrected over time.

The crash and subsequent fire destroyed the car’s onboard storage device, the NTSB said. However, the car’s restraint-control module, which logs data about vehicle speed, acceleration, seatbelts, and airbags, was recovered and is being evaluated by investigators.

The National Highway Traffic Safety Administration also launched a probe into the crash but has not yet released any findings.

Tesla, which is cooperating with the investigation, did not immediately return a request for comment.

Lawmakers demand answers in fatal Tesla crash after Elon Musk and executives offer conflicting details

The remains of a Tesla vehicle are seen after it crashed, killing two people, in The Woodlands, Texas, on April 17, 2021.

  • Lawmakers demanded answers Wednesday about a fatal Tesla crash after executives gave conflicting statements.
  • Elon Musk said Autopilot wasn’t on, but a top Tesla exec said adaptive cruise control, an Autopilot feature, was.
  • Rep. Kevin Brady and Sen. Richard Blumenthal criticized Tesla’s public statements about the crash.

Lawmakers slammed Tesla’s public response to a deadly crash involving one of its Model S vehicles that killed two men near Houston, Texas, earlier this month, following conflicting statements from the company’s executives.

“Despite early claims by #Tesla #ElonMusk, autopilot WAS engaged in tragic crash in The Woodlands. We need answers,” Rep. Kevin Brady, a Texas Republican, tweeted Wednesday.

Earlier on Wednesday, Sen. Richard Blumenthal, a Democrat from Connecticut, said he was “disappointed” that Musk weighed in publicly at all, given that two federal agencies still have ongoing investigations into the incident.

Tesla did not respond to a request for comment.

Local authorities said following the crash that neither of the bodies they recovered was in the driver’s seat, prompting questions about whether the vehicle’s Autopilot system – a suite of AI-powered driver-assistance features – was engaged when the vehicle crashed.

Two days after the crash, Tesla CEO Elon Musk tweeted that early data obtained from the Model S showed “autopilot was not enabled,” and he doubled down on those claims in Tesla’s earnings call Monday, contradicting local authorities.

But in that same call, Tesla vice president of vehicle engineering Lars Moravy said that the vehicle’s traffic-aware, or adaptive, cruise control – part of the Autopilot system, according to Tesla’s Model S owner’s manual – was engaged during the crash.

“Our adaptive cruise control only engaged when the driver was buckled in above 5 miles per hour. And it only accelerated to 30 miles per hour with the distance before the car crashed,” Moravy said, adding that the feature also “disengaged the car slowly to complete to a stop when the driver’s seatbelt was unbuckled.”

Moravy also pushed back on Texas authorities’ statements that no one was driving the car when it crashed.

“Through further investigation of the vehicle and accident remains, we inspected the car with NTSB and NHTSA and the local police and were able to find that the steering wheel was indeed deformed,” he said, “leading to a likelihood that someone was in the driver’s seat at the time of the crash and all seatbelts post crash were found to be unbuckled.”

Despite misleading and unverified claims about Autopilot’s capabilities and possible safety advantages, the feature doesn’t make Tesla vehicles fully autonomous. At least three drivers have died while using Tesla’s Autopilot, and the National Transportation Safety Board has called for increased scrutiny of self-driving software.

Tesla said it’s likely somebody was in the driver’s seat during a deadly Model S crash in Texas, contradicting local law enforcement

Tesla CEO Elon Musk.

  • Elon Musk again denied that the Tesla that crashed in Texas on April 17, killing two people, was on Autopilot.
  • A Tesla exec added it was likely that someone was in the driver’s seat at the time of the crash.
  • This contradicts statements made by local law enforcement.

Tesla CEO Elon Musk said on Monday that the Model S that crashed just outside Houston, Texas, earlier this month, killing two people, wasn’t on Autopilot – and that any suggestion otherwise was “completely false.”

Lars Moravy, Tesla’s vice president of vehicle engineering, added that he thought it was likely someone was in the driver’s seat at the time of the deadly crash, contradicting local law enforcement.

On April 17, a Tesla Model S skipped over a curb, crashed into a tree, and burst into flames, killing two people.

A Harris County constable told local TV station KHOU on April 18 that investigators were “100% certain that no one was in the driver seat driving that vehicle at the time of impact.” A senior Harris County officer said on April 19 that witnesses had suggested nobody was driving the vehicle earlier in its journey.

Tesla’s electric vehicles come with Autopilot, a feature that allows the cars to brake, accelerate, and steer automatically. Tesla tells drivers using Autopilot to remain in the driver’s seat with their hands on the steering wheel – but earlier this month, Consumer Reports showed it was possible to turn on Autopilot with nobody in the driver’s seat.

Musk previously said that Autopilot was not being used at the time of the crash. Two days after the crash, he tweeted: “Data logs recovered so far show Autopilot was not enabled.”

During Tesla’s earnings call Monday, Musk said that “there were really just extremely deceptive media practices where it was claimed to be Autopilot but this is completely false.” He didn’t reference any specific media reports.

Moravy said that Tesla had been working with local authorities, the National Highway Traffic Safety Administration (NHTSA), and the National Transportation Safety Board (NTSB) to investigate the crash.

“The steering wheel was indeed deformed so we’re leaning to the likelihood that someone was in the driver’s seat at the time of the crash,” Moravy said.

“All seatbelts post-crash were found to be unbuckled,” he added. Tesla’s Autopilot only works when seatbelts are buckled in.

Moravy said that Tesla was unable to recover the data from the vehicle’s SD card at the time of impact, but that the local authorities were working on that.

“We continue to hold safety in a higher regard and look to improve products in the future through this kind of data and other information from the field,” he added.

Tesla also sells its Full Self-Driving software (FSD) as a $10,000 one-off add-on, which it plans to release widely in 2021. FSD allows cars to park themselves, change lanes, and identify both stop signs and traffic lights.

Neither Autopilot nor FSD makes a Tesla car fully autonomous.

At least three drivers have died while using Tesla’s Autopilot, and the National Transportation Safety Board has called for increased scrutiny of self-driving software.

Consumer Reports proved that a Tesla will drive with nobody behind the wheel following fatal crash

Consumer Reports demonstrated it’s not all that difficult to get around Tesla’s safety systems.

  • Consumer Reports showed that it’s possible to turn on Tesla Autopilot with nobody behind the wheel.
  • The firm’s car-testing director was able to sit in the passenger’s seat while the Tesla drove itself.
  • The demonstration comes after a fatal Tesla crash where authorities said nobody was driving.

Consumer Reports on Thursday proved just how simple it is to fool a Tesla into driving without anybody behind the wheel.

Engineers from the consumer-research organization bypassed Tesla’s safety systems on a closed test track to show that – without too much fuss – the carmaker’s Autopilot driver-assistance technology can be engaged without anybody in the driver’s seat. They posted a video explaining how they did it.

The report comes after authorities said nobody was behind the wheel when a Tesla Model S careened off the road and into a tree in Texas on Saturday, killing its two occupants. Tesla CEO Elon Musk said in a Monday tweet that data logs recovered “so far” show that Autopilot wasn’t enabled at the time of the crash. Local police, the National Highway Traffic Safety Administration, and the National Transportation Safety Board are all investigating the cause of the incident.

Tesla has two methods of ensuring that a driver is paying attention when using Autopilot, an advanced driver-assistance system that keeps a car between lane markings and maintains a set distance to other cars. Both safeguards were easily defeated by Consumer Reports, though the outlet urges that nobody replicate its findings under any circumstances.

Autopilot can only be engaged when the driver’s seatbelt is fastened. Consumer Reports engineers bypassed that by fastening the seatbelt before getting in the car. Autopilot also needs to feel some resistance from a driver’s hand resting on the steering wheel. Consumer Reports achieved that by hanging a small amount of weight from the wheel.

The result was that Jake Fisher, the outlet’s senior director of auto testing, was able to engage Autopilot, bring the car to a stop, climb into the passenger’s seat, and bring the car back up to speed from there using a dial on the steering wheel. The Tesla Model Y followed lane markings on Consumer Reports’ test track and didn’t issue any warning that nobody was behind the wheel.

“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” Fisher said in the report. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”

Tesla did not return Insider’s request for comment.

Consumer Reports’ controlled demonstration confirms what has already been displayed in numerous viral videos, like one from November in which a Tesla owner climbs into the back seat and closes his eyes while his car cruises down the highway. In a clip posted in September, a Tesla owner shows it’s possible to climb out of a car’s window with Autopilot engaged.

The outlet said that Tesla is “falling behind” other automakers when it comes to monitoring driver attention while advanced driver-assistance systems are operating. General Motors’ Super Cruise uses internal cameras to make sure a driver is looking at the road, and Ford’s upcoming BlueCruise will do the same.
