A test involving a Tesla running its Full Self-Driving software is raising tough questions about whether this kind of self-driving technology is really ready for public streets.
The test, per FuelArc, showed the car blowing past a school bus's extended stop sign and running over a child-sized dummy in the road, despite clearly detecting it. With Tesla set to launch a fleet of self-driving taxis in just weeks, safety advocates say these issues can't be brushed aside. And for many people, it's hard to feel comfortable with the idea of cars making life-or-death decisions on their own.
Tesla FSD 13.2.9 will still run a kid down while illegally blowing past a stopped school bus with red lights flashing and stop sign extended. Elon Musk how many kids does FSD need to run over before you fix this? I warned about this danger at the 2023 Super Bowl!
— Dan O'Dowd (@realdanodowd.bsky.social) May 28, 2025 at 8:02 PM
What's happening?
In the video posted by the artificial intelligence safety group The Dawn Project, a Tesla Model Y equipped with FSD version 12.3.6 fails to stop for a school bus with its lights flashing and stop arm deployed. A child-sized dummy steps onto the road, and although the car labels it as a "pedestrian" on-screen, it neither slows down nor swerves to avoid it; it simply drives through.
The group said the test is meant to highlight real risks as Tesla prepares to roll out self-driving taxis in Austin, per KVUE.
Tesla said the Cybercab will operate within a limited zone, but critics say this kind of failure could easily happen in the real world. A similar incident in North Carolina, reported by WTVD, is under investigation after a Tesla hit a student who was crossing the road after getting off a school bus.
Why is reliable software important?
Not stopping for a school bus is a clear and dangerous failure. It's one of the most basic rules of the road, and the fact that the Tesla identified the pedestrian but didn't respond has some experts worried that the system still doesn't fully understand its surroundings. These mistakes can cost lives, and if the software isn't reliable in simple situations, it's hard to imagine it handling more complicated ones.
There's also concern that launching self-driving cars too quickly could shake public trust in the technology. Some companies, such as Waymo, pair cameras with additional sensors such as lidar, while Tesla relies on cameras alone, which may explain some of the differences in how the vehicles respond.
What's being done about the standards?
The Dawn Project is calling on regulators to stop the robotaxi launch until more thorough safety checks are done. Some researchers and engineers are urging federal agencies to set clearer standards for what "self-driving" cars need to demonstrate before reaching public roads, which would also help the public feel comfortable with them. Others say that even if the technology eventually gets there, it still needs more testing and that companies shouldn't use the public as their test track.
For now, these videos are prompting a growing number of people to ask whether safety is being treated as a top priority or if excitement about new tech is getting too far ahead of what the systems are ready for.