Researchers reckon Tesla's FSD needs a human intervention every 13 miles
Owners of Tesla's Cybertruck are reporting that a software update enabling Full Self Driving (FSD) has become an option for their giant rolling wedges of stainless steel.
A post on the Cybertruck Owners Club forum on Sunday indicated that some lucky Cybertruck owners received an over-the-air software update labeled 2024.32.20, which included an early access build of FSD version 12.5.5. Multiple users reported receiving the update, and videos of purported FSD cruises in the Cybertruck have since appeared.
Unlike all the Tesla models that preceded it, the Cybertruck didn't ship with FSD or any other type of Autopilot technology, though buyers of the $99,990 vehicle were still able to pay for the feature with the promise that Tesla would deliver it in due course.
With the weekend release to early access invitees, the Cybertruck now has access to the newest version of FSD. Most Teslas are still running version 12.5.4, which only recently saw a general release. We've asked Tesla when FSD will be generally available for the Cybertruck, but haven't received a response at the time of writing.
There's plenty to criticize about the Cybertruck, which has been found to rust, jam fingers, and suffer from a flaw that can leave its accelerator pedal stuck.
Automotive safety experts have been especially critical of the vehicle's shape and excessive weight, which they've suggested make it unsafe for pedestrians, cyclists, and other motorists.
A recent report about all of Tesla's FSD wares offers another reason to worry about self-driving Cybertrucks.
That report came from automotive research firm AMCI Testing, which last week published research that it claimed is the "most extensive real-world test of Tesla's FSD ever conducted by an independent third party," covering more than 1,000 miles of real-world driving.
While Tesla FSD's performance was "impressive for a uniquely camera-based system ... our drivers had to intervene over 75 times during the evaluation; an average of once every 13 miles," AMCI found.
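As a rough sanity check (not part of AMCI's report), the headline rate follows from the two lower-bound figures the firm cites, treated here as exact for illustration:

```python
# Back-of-the-envelope check of AMCI's intervention rate.
# Both values are the lower bounds cited in the report, assumed exact for this sketch.
miles_driven = 1_000   # "more than 1,000 miles of real-world driving"
interventions = 75     # "over 75 times during the evaluation"

miles_per_intervention = miles_driven / interventions
print(f"Roughly one intervention every {miles_per_intervention:.0f} miles")
# Roughly one intervention every 13 miles
```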
The researchers described FSD's performance as being surprisingly capable, especially in the first few minutes of a drive.
"The confidence (and often, competence) with which [Tesla FSD] undertakes complex driving tasks lulls users into believing that it is a thinking machine - with its decisions and performance based on a sophisticated assessment of risk (and the user's wellbeing)," AMCI said.
But errors are frequent, the firm warned, and when they occur they're often "sudden, dramatic and dangerous."
"In those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident - or possibly a fatality," AMCI found.
The automaker has promised to reveal an autonomous robotaxi at an investor day scheduled for October 10 - after previous postponements.
However, AMCI observed that its findings lead it to suspect Tesla's autonomous driving capabilities may not be up to the task of safely operating a fleet of self-driving taxis.
"Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue as proven in the test results," argued AMCI Global CEO David Stokols. "Although [Tesla FSD] positively impresses in some circumstances, you simply cannot reliably rely on the accuracy or reasoning behind its responses." ®