The Literally Fatal Loophole Tesla Is Exploiting
Yet more damning evidence against Musk's AI has come to light.
Teslas are pretty damn cool. At least, that is the air that surrounds them. Having test-driven them, I get why. The range, the speed, the fast and accessible charging, and the ability of the car to drive itself are mind-boggling! But, for me, something has seriously marred any positive perception of Tesla. You see, there has been a worrying trend with Tesla over the past few years: one of low corporate responsibility and of endangering both Tesla customers and the general public, all to fuel greed rather than the betterment of humanity, as many would like to think. If you have read my articles for a while now, you probably know that I am talking about how Musk has managed Tesla’s self-driving push and the many lawsuits currently mounting against the company over it. But a recent report by the Washington Post has revealed even more morally bankrupt goings-on behind Musk’s self-driving obsession.
So, what did the Post find?
It found eight Tesla crashes, many of them fatal, that occurred while Autopilot was engaged on roads where Autopilot should never have been enabled in the first place.
That sounds damn confusing, so let me explain.
Let’s go through one of these crashes. In 2019, a Tesla driver was happily cruising along at 70 mph on Autopilot when he dropped his phone and took his eyes off the road. The vehicle failed to stop at a T-junction, barrelled through it at full speed and hit a parked pickup truck with two people inside. The Tesla driver survived, but one of the people in the pickup died, and the other was gravely injured. The Post obtained dash-cam footage of the incident from the Tesla, which shows the car blowing through a stop sign, a blinking light and five yellow signs warning that the road ends and drivers must turn left or right.
Similar crashes, again many of them fatal, happened as far back as 2016 and as recently as this March, including one in which a Tesla went under a semi-truck and another in which a Tesla, travelling at 45 mph, failed to slow down and hit a teenager stepping off a school bus.
All of these crashes had something in common. Autopilot should not have been in use.
You see, as the Post and even DoJ probes have pointed out, in user manuals, legal documents and communications with federal regulators, Tesla has stated that Autopilot (specifically the Autosteer feature) is “intended for use on controlled-access highways” with “a centre divider, clear lane markings, and no cross traffic.” Yet, in these eight crashes, this wasn’t the case. The roads were rural, weren’t highways, or had cross traffic. Tesla has even advised its customers, though not prominently, that Autopilot can fail on hills or sharp turns. Yet, as I found out during a recent test drive, you can easily use Autopilot in all of these situations.
This is despite the fact that Tesla has the ability to restrict Autopilot to the roads it is designed for. Such a change would be incredibly easy for Tesla to make, yet despite the deaths and crashes, you can still use Autopilot almost anywhere.
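Just to illustrate how conceptually simple such a gate could be, here is a minimal, purely hypothetical sketch in Python. None of this is Tesla’s actual code or API; the names are mine, and it assumes the car’s map data already labels roads with the attributes Tesla itself lists in its manuals.

```python
# Illustrative sketch only: gate a driver-assistance feature on road type.
# Hypothetical names (RoadSegment, autosteer_permitted); not Tesla's real API.
# Assumes map data exposes basic attributes for the current road segment.
from dataclasses import dataclass


@dataclass
class RoadSegment:
    controlled_access: bool    # limited-access highway, no at-grade junctions
    has_centre_divider: bool
    clear_lane_markings: bool
    has_cross_traffic: bool


def autosteer_permitted(segment: RoadSegment) -> bool:
    """Allow engagement only on roads matching Tesla's own stated design
    conditions: controlled-access highways with a centre divider, clear
    lane markings, and no cross traffic."""
    return (
        segment.controlled_access
        and segment.has_centre_divider
        and segment.clear_lane_markings
        and not segment.has_cross_traffic
    )


# Example: a rural road with a T-junction, like the one in the 2019 crash,
# would fail the check and Autopilot would simply refuse to engage.
rural_road = RoadSegment(
    controlled_access=False,
    has_centre_divider=False,
    clear_lane_markings=True,
    has_cross_traffic=True,
)
print(autosteer_permitted(rural_road))  # False
```

The point is not that a toy check like this would be hard to engineer; it is that Tesla chooses not to enforce anything like it.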
You might be asking why this isn’t illegal. I certainly did. Well, it has to do with how our transport legislation simply isn’t set up to regulate self-driving cars.
Take the 2016 crash in the Post’s findings that killed Tesla driver Joshua Brown. The National Transportation Safety Board (NTSB) rightly called for limits on where driver-assistance technology could be used to avoid crashes like this. However, the NTSB has no regulatory power and is more of an advisory body. The National Highway Traffic Safety Administration (NHTSA), on the other hand, does have regulatory power. Yet it has staunchly ignored the NTSB’s advice. That said, it is finally starting to act (more on that in a minute).
There is an argument for corporate negligence and manslaughter by both Tesla and the NHTSA here. It is entirely possible to buy a Tesla and use Autopilot without ever realising any of this is an issue. What’s more, Tesla’s and Musk’s marketing and communications around Autopilot since 2016 have painted it as an almost fully capable self-driving system. It clearly isn’t, and that gap has sparked a DoJ probe into Tesla’s self-driving systems. The fact that the NHTSA has allowed Autopilot to be used outside its design scope for so many years shows its inability to properly regulate the transport industry and, in the process, recklessly risks lives. It has created a legal loophole: not restricting Autopilot to appropriate roads should be illegal, yet it isn’t.
But why does Tesla let Autopilot be used outside its safe parameters and instead rely on the driver to determine when it should be used?
Well, Tesla’s self-driving AI uses data from drivers using Autopilot to train itself and get better at driving. In other words, if Tesla wants its self-driving AI to eventually cope with hills, sharp corners, cross traffic or non-highway roads, it needs its customers to use it in these unproven settings. That way, it can gather data and home in on correct driving patterns through trial and error.
However, notice how the risk and reward are split here. Tesla drivers are, mostly unknowingly, risking their lives to develop software for Tesla, which will benefit massively if it reaches a higher level of autonomy. The drivers carry all the risk and get none of the reward. And it isn’t just Tesla drivers; other road users are also at profound risk from this development gamble Tesla is playing.
Musk has repeatedly stated that Tesla’s high valuation is only justified by its self-driving AI. If this AI turns out to be a dud, then Tesla would be worth far, far less. And, despite the obvious danger these crashes demonstrate, Tesla can’t afford to restrict Autopilot to appropriate roads, as doing so would dramatically slow down (or possibly halt entirely) its development.
In other words, for Tesla to be worth so damn much, Musk and Tesla have to exploit this loophole and risk the lives of their customers and the general public. That should worry you deeply.
Now, why does Musk need Tesla to be worth so much? Well, he uses his shares in the company as collateral for multi-billion dollar loans to grow Tesla or buy social media platforms and destroy them from the inside out.
But it goes deeper. The Post’s evidence corroborates the DoJ’s investigation into Tesla. Investigators believe Musk and Tesla mis-sold Autopilot as a fully fledged self-driving system in all but name. This, in their eyes, led to people misusing it, such as using it on roads where it shouldn’t be, leading to multiple fatal crashes. Again, if the DoJ does charge Tesla, there is the potential for manslaughter and corporate negligence charges. At the very least, if Tesla is charged, the perceived value of Autopilot will diminish, and Musk’s precious money printer will disappear.
All of this is made even worse when you realise that Musk went against his engineers’ advice and purposely made Autopilot worse in 2021 by removing the ultrasonic sensors and radar to take Autopilot to “vision only” (so it solely used cameras to navigate). Multiple Tesla engineers have gone on record saying that this move increased the risk of crashes whilst using Autopilot, and that they saw data that proved it.
This was a blatant cost-saving measure by Musk to ensure Tesla’s outrageous profit margins remained intact despite competitors driving the price of Teslas down. These stupidly high profit margins are the other reason for Tesla’s high valuation. As such, Musk has to protect them, and he did so by making Autopilot cheaper to build. He carried on hyping up Autopilot’s capabilities despite its now kneecapped ability, which seems to have led to the series of crashes that sparked the DoJ investigation.
Read more about this DoJ investigation here.
But it seems the NHTSA has finally realised it can close this loophole and save lives. Tesla recently had to recall 2 million US vehicles to install Autopilot safeguards after the NHTSA repeatedly raised safety concerns with the company.
It turns out that the NHTSA has spent two years investigating whether Tesla ensures drivers are paying sufficient attention whilst using Autopilot. You see, Autopilot is classed as a driver-assistance system, so even on appropriate roads, the driver has to pay attention, and Tesla is legally required to ensure they do so through safeguards. As such, to avoid charges from the NHTSA, Tesla recently had to install additional safeguards in 2 million vehicles! However, Tesla itself has stated that even after the recall, Autopilot’s software controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash, which is a blatant legal safety net! Especially when you consider that the NHTSA will keep its Autopilot probe open as it monitors the recall to ensure it actually makes any difference. The regulator is on to Tesla and is looking to clamp down.
While this recall doesn’t completely resolve the “use on inappropriate roads” problem, it could improve it. After all, if Tesla is forced to put more solid safeguards in place to ensure the driver is paying attention and actively driving, accidents like the ones in these investigations would be far less likely to happen again. It also means that Autopilot might have to make it obvious to drivers whether the road they are on is appropriate for the system, as such information is crucial for the driver to understand what to expect from it. However, I wonder whether Tesla will implement this level of safeguarding without being forced to, as it would severely hurt Autopilot’s perceived value and could hamper its development.
This new evidence from the Washington Post, as well as the NHTSA’s crackdown on Tesla’s lax approach to road safety, shows that Tesla and Musk are happy to risk their customers’ and the general public’s lives to inflate their stock value and secure loans in the order of billions of dollars. What a damning indictment of modern capitalism…
Thanks for reading! Content like this doesn’t happen without your support. So, if you want to see more like this, don’t forget to Subscribe and follow me on BlueSky or X and help get the word out by hitting the share button below.
Sources: The Washington Post, Planet Earth & Beyond, Reuters, Reuters, Enterprise AI