From swerving into a median to narrowly missing poles, videos of Tesla’s latest Full Self-Driving update don’t inspire much confidence

The interior of a Tesla driving down the highway.

  • Tesla launched the latest version of its Full Self-Driving beta this month after a long delay.
  • Beta testers immediately started posting clips of the driver-assistance system.
  • The software is impressive and advanced, but still gets drivers into dangerous situations.
  • See more stories on Insider’s business page.

Earlier this month, Tesla rolled out a long-awaited update to its Full Self-Driving software for beta testers. It’s impressive – but it still doesn’t make cars autonomous.

The electric-vehicle maker first beamed out access to the pre-production tech in October, and it’s now in the hands of a couple thousand loyal Tesla owners. It takes Tesla’s existing driver-assistance system, which is mainly suited for predictable highway driving, and adds the ability to automate driving tasks on more complicated non-highway streets.

Videos of the new-and-improved software in action show that it can impressively navigate some tough driving situations, but there are plenty of dangerous flaws and glitches too.

In one clip, a Tesla confidently handles a tight, unmarked road with an oncoming car. The computer does pretty much exactly what a human would do: slow down and pull over to let the oncoming car go first, then pull forward once it’s clear that the other driver is giving right of way.

Another shows the system navigating stop-and-go traffic:

Another shows that it can see stop signs and make turns on dark – albeit empty – city streets, too. Some videos also show cars stopping for pedestrians and other vehicles.

Read more: Fort Lauderdale asked Elon Musk to build a commuter train tunnel. So how did it end up considering a $30 million beach tunnel for Teslas instead?

But the system still struggles with utterly basic driving tasks, putting drivers and bystanders in dicey situations. In one clip documenting a drive in downtown San Francisco, the car drunkenly swerves into a striped median, forcing the driver to take control.

In the same video, the car stumbles through a left turn and nearly oversteers into a parked car.

In a clip set in Chicago, the car slowly creeps through intersections, comes to random stops, and notices a road closure only at the last second. The average human driver would spot a line of orange barricades well before attempting the turn.

All of these dangerous hiccups show just how far Tesla is from replicating human driving. But one particularly alarming clip out of Seattle takes the cake.

In the nighttime video, the beta fails to recognize the massive concrete columns supporting the city’s monorail – and the car nearly steers into them twice in an attempt to change lanes.

If a highly automated car should be able to do one thing, it's recognizing large stationary objects and avoiding them. But judging by the visualization displayed on the center screen, the car appeared to have no idea the pillars were even there.

The people in the car wonder whether the failure is a result of Tesla shifting to a camera-only system that doesn’t use radar. And that’s certainly a possibility. Car companies, Tesla included, have relied on radar for years for features like emergency braking and cruise control. But Tesla in May decided to stop using the sensors and take them out of its future cars.

Tesla has adopted a fast-and-loose approach to its automated-driving tech that other automakers aren’t taking. And safety advocates have taken issue with Tesla’s strategy of having amateur drivers test unproven technology on public roads. Pedestrians, cyclists, and other drivers didn’t sign up to be subjects in this lab experiment, they argue.

But the company is under mounting pressure to deliver a final version of Full Self-Driving to customers, who have shelled out up to $10,000 over the years for the add-on under the promise that it would eventually enable Teslas to drive themselves. It’s increasingly looking like that’s not happening anytime soon.

Read the original article on Business Insider

It’s not just Tesla. Experts say the entire industry is struggling to safely introduce automated features to the masses.

Ford BlueCruise advertises “hands-free driving.” Experts say that gives drivers the wrong idea.

  • Tesla often catches heat for the fast-and-loose way it markets Autopilot.
  • But virtually all automakers are caught up in a messy transition to automated driving.
  • Better education, marketing, and engineering can make roads safer as tech takes over, experts say.

When a fiery Tesla Model S crash killed two people in April – with nobody behind the wheel, officials said – Elon Musk’s carmaker came under fire once more over the risks of its Autopilot tech. It wasn’t the first time, and it likely won’t be the last.

Over the years, critics have called out Tesla for the misleading and potentially dangerous way it brands its driver-assistance features, Autopilot and Full Self-Driving Capability, neither of which makes Teslas autonomous. And a growing number of high-profile crashes and countless viral videos show that even if Tesla doesn't endorse reckless driving, its cars certainly allow it to happen.

But Tesla isn’t alone. The messy transition to automated driving is upon us, and the entire auto industry has work to do to make roads safer. Advanced driver-assistance systems aim to make driving safer and more comfortable with features like lane centering, blind-spot monitoring, and adaptive cruise control. Still, their rollout has been far from perfect, experts say.

Plenty of other automakers give drivers an imperfect impression of what their cars can do. General Motors and Ford market their systems as “hands-free driving” on approved stretches of highway. And although the companies warn drivers to keep their eyes on the road at all times, Missy Cummings, a Duke University engineering professor who studies automation, says “hands-free driving” is an invitation for people’s attention to wander.

“The message that you are sending to consumers when you say ‘hands-free’ – even though they don’t mean this – is ‘attention-free,'” Cummings told Insider. “There is a lot of confusion already in the customer’s mind about what cars are capable of. And by endorsing ‘hands-free,’ you are only going to see more distracted behavior.”

Representatives from Ford and GM pushed back, telling Insider that their vehicles' internal cameras monitor driver awareness, while various alerts ensure drivers stay engaged.

Ford BlueCruise.

According to a 2018 AAA survey, 40% of drivers expect that systems with names like Tesla Autopilot, Hyundai ProPilot, and Volvo Pilot Assist should enable cars to drive themselves, even though no such car is on the market. Today's most advanced driver-assistance systems can reliably do things like keep up with traffic, maintain a lane, and park automatically, but they require full driver supervision in case something goes wrong, and they can't handle more complex driving tasks. Lower-tier systems offer safety features like collision detection, lane-departure warning, and blind-spot monitoring.

Read more: Why the CEO of a BMW- and Ford-backed battery startup thinks his $1.2 billion SPAC deal will help revolutionize EVs

It’s more than just marketing and naming. A larger problem is that drivers don’t fully understand what their cars can and can’t do – regardless of their features.

It’s a given that practically no owners read the manual, where most detailed information about safety features resides. But research has also shown that many car salespeople don’t grasp driver-assistance tech enough to adequately educate buyers. Some even spread misinformation.

2021 Cadillac Escalade SUV with General Motors’ Super Cruise hands-free driving assistance.

Tracy Pearl, a law professor at the University of Oklahoma who researches self-driving technology, says the education gap creates two worrisome trends. More drivers will abuse systems like Autopilot by not paying attention to the road, while others won’t get the full benefits of advanced features because they either don’t trust the technology or don’t know how to use it.

“What we need are people in that middle category who are going to be steely-eyed realists about what exactly the system is capable of and who are willing to learn how to use their cars safely,” Pearl said. Expanding that group will be the key challenge facing automated driving in coming years, she continued.

Car companies also must improve the design of their systems, says Bryant Walker Smith, a professor at the University of South Carolina who coauthored the global standards for driving automation. Especially pressing areas include what happens when a system needs to disengage and how cars monitor driver attention.

A Tesla on Autopilot.

GM's Super Cruise, for example, uses cameras to ensure drivers' eyes are looking forward. That's more effective than some other strategies but remains an imperfect way of measuring attentiveness, Smith said. More generally, car companies need to think about balancing safety and convenience features to reduce the risk of crashes without making drivers lazy or degrading their driving skills, he said.

Ultimately, Smith said, the federal government should take more action to better understand driver-assist systems, standardize them, mandate the most life-saving features, and hold automakers to a higher standard. Automated vehicle policy, he says, represents a huge opportunity to make roads safer in a country where some 40,000 people die in motor vehicle crashes annually, and major restrictions on conventional vehicles are unlikely.

Advanced driver-assist systems are a potential game-changer for road safety. But with the way things are going, experts say we could start to see more crashes as the complex technology gets into the hands of more undereducated drivers.

“I think the problem is going to get worse before it gets better,” Pearl said.


Consumer Reports proved that a Tesla will drive with nobody behind the wheel following fatal crash

Consumer Reports demonstrated it’s not all that difficult to get around Tesla’s safety systems.

  • Consumer Reports showed that it’s possible to turn on Tesla Autopilot with nobody behind the wheel.
  • The firm’s car-testing director was able to sit in the passenger’s seat while the Tesla drove itself.
  • The demonstration comes after a fatal Tesla crash where authorities said nobody was driving.

Consumer Reports on Thursday proved just how simple it is to fool a Tesla into driving without anybody behind the wheel.

Engineers from the consumer-research organization bypassed Tesla’s safety systems on a closed test track to show that – without too much fuss – the carmaker’s Autopilot driver-assistance technology can be engaged without anybody in the driver’s seat. They posted a video explaining how they did it.

The report comes after authorities said nobody was behind the wheel when a Tesla Model S careened off the road and into a tree in Texas on Saturday, killing its two occupants. Tesla CEO Elon Musk said in a Monday tweet that data logs recovered “so far” show that Autopilot wasn’t enabled at the time of the crash. Local police, the National Highway Traffic Safety Administration, and the National Transportation Safety Board are all investigating the cause of the incident.

Tesla has two methods of ensuring that a driver is paying attention when using Autopilot, an advanced driver-assistance system that keeps a car between lane markings and maintains a set distance from other cars. Both safeguards were easily defeated by Consumer Reports, though the outlet urges that nobody replicate its findings under any circumstances.

Autopilot can only be engaged when the driver’s seatbelt is fastened. Consumer Reports engineers bypassed that by fastening the seatbelt before getting in the car. Autopilot also needs to feel some resistance from a driver’s hand resting on the steering wheel. Consumer Reports achieved that by hanging a small amount of weight from the wheel.

The result was that Jake Fisher, the outlet’s senior director of auto testing, was able to engage Autopilot, bring the car to a stop, climb into the passenger’s seat, and bring the car back up to speed from there using a dial on the steering wheel. The Tesla Model Y followed lane markings on Consumer Reports’ test track and didn’t issue any warning that nobody was behind the wheel.

Read more: Meet the 11 power players of the self-driving industry from leading companies like Tesla, Zoox, and Morgan Stanley

“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” Fisher said in the report. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”

Tesla did not return Insider’s request for comment.

Consumer Reports’ controlled demonstration confirms what has already been displayed in numerous viral videos, like one from November in which a Tesla owner climbs into the back seat and closes his eyes while his car cruises down the highway. In a clip posted in September, a Tesla owner shows it’s possible to climb out of a car’s window with Autopilot engaged.

The outlet said that Tesla is “falling behind” other automakers when it comes to monitoring driver attention while advanced driver-assistance systems are operating. General Motors’ Super Cruise uses internal cameras to make sure a driver is looking at the road, and Ford’s upcoming BlueCruise will do the same.

