Tesla Fails School Bus Test In Disturbing Fashion As It Slams Into Child-Sized Dummies
The test, conducted by The Dawn Project in partnership with Tesla Takedown and ResistAustin, simulated a common road scenario: a school bus stopped with its red lights flashing and stop sign extended, and a child attempting to cross the street. Video from the demonstration shows a Tesla Model Y, running the latest FSD software (version 13.2.9), failing to recognize and react appropriately to the stopped bus. In each of the eight runs, the vehicle disregarded the flashing lights and stop sign, colliding with child-sized dummies dragged into its path.

Dan O'Dowd, founder of The Dawn Project, remarked: "Self-driving software that illegally blows past stopped school buses and runs down children crossing the road must be banned immediately." O'Dowd further emphasized that "Tesla's failure to address this demonstrates Elon Musk's utter negligence and contempt for public safety."
This isn't the first time Tesla's FSD has faced such scrutiny. The Dawn Project has been highlighting this specific safety concern for years, even running a Super Bowl commercial in February 2023 to draw attention to the issue. A month later, a real-world incident in North Carolina saw a self-driving Tesla illegally pass a stopped school bus and strike a child, leaving the student with a fractured neck and a broken leg.
While Tesla maintains that FSD requires active driver supervision, these tests illustrate a profound failure in the software's ability to handle critical, foreseeable hazards. Critics argue that relying on human intervention for such fundamental safety protocols undermines the very premise of "Full Self-Driving."
The timing of the June 12 test was hardly coincidental: Tesla had aimed to launch its robotaxi service in Austin that very day (a debut since pushed to June 22). The company has been testing driverless Model Ys in the city, fueling concerns that a system showing such critical flaws in a controlled environment could pose significant risks in uncontrolled, real-world conditions with no human driver to intervene.