Within the confines of a teenager’s darkened bedroom (what’s so bad about an open window?), beta testing ‘Age of Empires IV’ could see Genghis Khan and his Mongol hordes jeopardise the lives of more than a few Chinese peasant farmers.
But on US roads, beta testing the latest (9.0) version of Tesla’s ‘Full Self-Driving’ (FSD) function can put real, flesh-and-blood road users and pedestrians in mortal peril, in an experiment none of them consented to be a part of.
Yes, right now around 800 Tesla employees and close to 100 Tesla owners are running FSD 9-enabled cars in the USA (a relatively minor v9.1 update dropped at the end of July), across 37 states (the majority in California), feeding data back to Tesla’s ‘neural networks’, designed to learn from these experiences and help refine its ‘Autopilot’ and FSD systems. A drop in America’s vast automotive ocean, but enough to raise questions.
Autopilot is Tesla’s existing driver-assistance package built around adaptive cruise control, lane-keep assist, auto lane changing and self-parking.
That name has sparked heated debate. I understand that even in the context of a commercial jet, autopilot isn’t the ‘feet on the dash’, hands-free (and mind-free) experience Hollywood has helped make it out to be. But perception is everything, and using that name is at best naive, and at worst reckless.
Which makes marketing what is still an SAE Level 2 ‘Advanced Driver Assistance System’ (there are six levels) as Full Self-Driving, even more questionable.
FSD is based almost exclusively on cameras and microphones, with Tesla recently rejecting radar and never adopting the widely used ‘Light Detection and Ranging’ (Lidar) remote sensing tech, on the basis that it’s not necessary.
In fact, at an early 2019 Tesla Autonomy Day event, CEO Elon Musk said those pursuing Lidar in their quest for autonomous driving were on a “fool’s errand”.
Cynics might say compact cameras are a great way to keep per-unit costs down, but even if the approach is cheaper, potential integration of FLIR thermal imaging could strengthen the camera-only approach’s current Achilles’ heel… bad weather. Which brings us back to development of the system on public roads.
Sure, the Tesla employees running FSD 9 underwent an in-house quality and testing program, and owners were selected on the basis of their outstanding driving records, but they’re not development engineers, and they aren’t necessarily going to do the right thing all the time.
These cars do not have any specific systems in place to ensure the driver remains alert and attentive. And for the record, Argo AI, Cruise and Waymo test software updates at closed private facilities, with specially trained drivers monitoring the vehicles.

One of the biggest steps with FSD 9 is that the system can now (under driver supervision) navigate intersections and city streets.
Musk has suggested FSD drivers should be “paranoid” in their approach, assuming things could go wrong at any time.
Watching the highly regarded, Detroit-based engineer Sandy Munro take a ride with Chris from ‘Dirty Tesla’ (@DirtyTesla on social media, as well as President of the Tesla Owners Club of Michigan) in the latter’s FSD 9-equipped Model Y is illuminating.
Chris, an unabashed Tesla devotee, confirms “there’s still a lot to be done. It does make mistakes pretty often”.
He adds: “It’s a lot freer than the public build of Autopilot, that’s like, stuck in its lane. If this thinks it needs to move on the centre line to get out of a bicyclist’s way, it’ll do it. You have to be prepared for when it does that, and when it’s not supposed to.”
Chris says at times during the drive the system is not “confident” in what it sees. “Definitely times I take over when it’s getting too close to a wall, too close to some barrels or something like that,” he adds.
Talking to Consumer Reports on the subject of FSD 9 testing, Selika Josiah Talbott, a professor at the American University School of Public Affairs in Washington, DC, who studies autonomous vehicles, said the FSD Beta 9-equipped Teslas in videos she has seen act “almost like a drunk driver,” struggling to stay between lane lines.

“It’s meandering to the left, it’s meandering to the right,” she says. “While its right-hand turns appear to be fairly solid, the left-hand turns are almost wild.”
And it’s not as if these are early stage teething problems. This is tech that’s been ‘nearly ready’ for a long time. Musk famously said FSD would be “feature complete” by the end of 2019. For years Tesla has been charging for something it has over-promised and under-delivered, as in 100 per cent not delivered.
The idea is the Tesla you buy today is FSD-capable, and an over-the-air update will enable the functionality you’ve pre-paid for once it’s ready.
In 2018, FSD was priced at US$3000 at point-of-sale (or US$4000 post-purchase). An early 2019 drop to US$2000 surely thrilled those who had already coughed up, but the price steadily rose from there as development continued.
‘Autopilot’ became standard while the FSD option stepped up to US$5000, then in mid-2019, as Elon Musk announced full self-driving was “18 months away”, it rose to US$6000, then through US$7000, US$8000, and on to US$10,000 late last year.
Couple of things here. According to Chris from Dirty Tesla, the FSD release notes reinforce the point that “you always have to pay attention, keep your hands on the wheel.”
Even the SAE Level 3 standard (which is a huge step, and FSD 9 isn’t L3) says “the driver must remain alert and ready to take control”. Not autonomous. Not full self-driving.
So what’s the point? Tesla owners are testing a software product they’ve already paid for, and were meant to receive yonks ago. And the need for constant supervision surely makes the process more stressful, and arguably less safe, as the driver second-guesses the system’s next move.
In October 2019, Musk tweeted: "Next year for sure, we will have over a million robotaxis on the road. The fleet wakes up with an over-the-air update. That's all it takes."
The logic being there are already a lot of Teslas out there on the road (20 million is a stretch), and via a yet-to-be-released Tesla ride-hailing smartphone app, your investment in FSD unlocks the potential of an appreciating, income-earning, fully autonomous asset.
But in July this year, Musk had changed his tune markedly, tweeting: “Generalised self-driving is a hard problem, as it requires solving a large part of real-world AI. Didn’t expect it to be so hard, but the difficulty is obvious in retrospect. Nothing has more degrees of freedom than reality.”
Maybe that’s a case of better late than never, because no matter how it’s tested, a Level 5 autonomous Tesla, one that in the near future delivers on the ‘full self-driving’ promise, is as likely as a gentle dusting of fresh powder snow on Uluru.
And how long expectant Tesla owners will wait for the FSD they’ve paid for, in some cases years ago, and how satisfied they are when (if?) it finally arrives will be fascinating to watch.