Discussion about this post

DrBDH:

I appreciate substituting “error” for “hallucination,” and the use of “probability machine.” Let’s further eliminate inappropriate anthropomorphism of AI by saying “modeling” instead of “understanding.” We can only put AI in its rightful place (and deflate the AI bubble) by insisting it is nothing like human intelligence.

NottaScotta:

Manned spaceflight is another place where Musk has decided the rules need changing to accommodate the occasional fatality, which, as long as it's not Musk, is OK with Musk. He seems fine redefining safety for YOU.

The boy seems unacquainted with transients, like fog, visual blocks, camera saturation, sensor conflicts, ambiguous visual stimuli, reaction times, and most importantly, failsafe reactions.

People are already DEAD because his cars fail to have exit features when things break. Real people, really dead. Traceable to Tesla design decisions that involve NO motion.

I'm human, with 50+ years of driving experience, and every time I do a long interstate trip, SOMETHING comes up that defies reason. Objects, animals, debris, humans, and tires in the road are a thing. Newly placed Jersey barriers and highway workers are a thing. Sheets of semi-load tarps are a thing. Temporary lane reassignments are not only a thing, they are a thing with some serious latency on any conceivable map. And of course, the ever-popular orange cone being deployed as one drives by is a thing. Hell, the mere appearance of a Florida, NY, Massachusetts, or Connecticut tag alone qualifies as a potential lethal hazard on ANY highway!

Trusting software hobbled by unreasonable architectural constraints is asking for trouble. The highway is not a lab. It's not the simple task of transiting from A to B in perfect light on smooth pavement in excellent weather that is the trick. It's the billion permutations of transients that matter, and at the very least, determining the situation a moving car encounters is primary. That's what WE have to do as humans, and WE make mistakes. Sometimes fatal mistakes. Musk seems OK with merely automating fatal mistakes. With a trillion-dollar pay package in process, the occasional dead citizen struggles to compete for his attention.

Establishing the situation with negligible error requires sensor variety and fusion. FSD cannot even navigate a single town in Texas. It clearly breaks laws a teenaged student driver avoids.

And remember... the humans on the road with you are sometimes drunk, and even on the interstate, occasionally are going the wrong way with closure speeds of 140 MPH.

Anyone who has ever driven through Boston knows that even the signs can't be trusted, and that it is a complete crapshoot to navigate the uncertainties you will CERTAINLY encounter without dying in the process. At least in the city, the collision speeds are slower. Sadly, the million or so unpredictable drivers, pedestrians, and road conditions demand perfection, and reliance on a single sensor or sensor type will not provide adequate scene coverage in the time frames needed to arrest motion or redirect a guided projectile.
