By Joann Muller
October 23, 2020
Tesla is beta-testing its latest self-driving technology with a small group of early adopters, a move that alarms experts and makes every road user — including other motorists, pedestrians and cyclists — unwitting subjects in its ongoing safety experiment.
Why it matters: Tesla hailed the limited rollout of its “full self-driving” beta software as a key milestone, but the warnings on the car’s touchscreen underscore the risk in using its own customers — rather than trained safety drivers — to validate the technology.
- “It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road,” Tesla warns customers. “Do not become complacent.”
- “Be prepared to act immediately, especially around blind corners, crossing intersections, and in narrow driving situations.”
Reality check: Boeing’s 737 Max did the wrong thing at the worst time, resulting in two plane crashes and 346 people dead. So did Takata’s airbags and GM’s ignition switches, also with deadly consequences.
- Yet so far, regulators at the National Highway Traffic Safety Administration are taking a wait-and-see approach.
- “NHTSA has been briefed on Tesla’s new feature, which represents an expansion of its existing driver assistance system. The agency will monitor the new technology closely and will not hesitate to take action to protect the public against unreasonable risks to safety,” according to an NHTSA statement.
Be smart: Despite its name, Tesla’s “full self-driving” software isn’t fully autonomous, and — as its own warnings confirm — the car is not capable of driving itself.
- Yet in videos posted this week, some of those enthusiastic beta-testers didn’t always keep their hands on the wheel.
Details: The beta software was sent to a small number of “expert and careful drivers,” according to CEO Elon Musk, though it’s not clear how many car owners received access.
- In a tweet Tuesday night, Musk said the rollout would be “extremely slow and cautious, as it should.” The plan is to expand access gradually, with a “wide release” by the end of this year.
- Tesla didn’t respond to a request for more information.
What they’re saying: Tesla’s approach set off alarm bells among safety advocates, policymakers and even AV competitors.
- “Using untrained consumers to validate beta-level software on public roads is dangerous,” said Partners for Automated Vehicle Education, a coalition of AV companies, nonprofits and academics.
- Tesla’s “deceptive” use of the term “full self-driving” to describe driver-assistance technology will likely lead to more crashes and deaths, warned Jason Levine, executive director of the Center for Auto Safety.
Threat level: Despite warnings to be attentive, drivers don’t always stay engaged. In February, the National Transportation Safety Board blamed Tesla Autopilot and a driver who relied too heavily on it for a fatal crash in California.
- Even specially trained safety drivers get distracted. In September, a safety driver behind the wheel of an Uber autonomous test vehicle that struck and killed a pedestrian was charged with negligent homicide.
How it works: Autopilot enables your Tesla to steer, accelerate and brake automatically within its lane — assisted-driving features that are available on many cars today.
- Full Self-Driving is an $8,000 add-on package that introduces additional functions over time.
- Features like automatic highway lane changes and summoning the car from a parking spot have already been added.
- The new beta release, dubbed “Autosteer for City Streets,” introduces features like non-highway lane changes, navigation around other vehicles, and left and right turns in traffic.
- Along with the new functionality comes a price hike: Musk tweeted Thursday that the full self-driving option will increase to $10,000 starting next week.
The bottom line: Tesla has plotted its own path to autonomy and anyone sharing the road is strapped in for the ride.