
Officials open investigation into nearly 3 million Teslas over self-driving violations: 'Proceeding through red lights, and stopping at green lights'

"[Tesla] doesn't want to fix it, or even acknowledge the problem, even though they've done a test drive with me and seen the issue with their own eyes."


The controversies surrounding Tesla's self-driving technology keep on coming. 

The National Highway Traffic Safety Administration announced on Oct. 7 that it was initiating an investigation into Tesla's Full Self-Driving after receiving dozens of reports of vehicles engaging in moving violations while FSD was operational. 

"The Office of Defects Investigation ('ODI') is opening this Preliminary Evaluation (PE) to assess the scope, frequency, and potential safety consequences of FSD executing driving maneuvers that constitute traffic safety violations," NHTSA said. 

What's happening?

The investigation will involve all Teslas equipped with either FSD (Supervised) or FSD (Beta), which totals nearly 2.9 million vehicles, per NHTSA. 



The problems cited include potentially life-threatening violations such as "proceeding through red traffic signals and driving against the proper direction of travel on public roadways," according to NHTSA. 

Additionally, drivers have complained about FSD's failure to stop at railroad crossings, among other serious issues. 

According to CNBC, one of the complaints to NHTSA involved a Houston driver, who in 2024 complained to the agency that FSD "is not recognizing traffic signals," saying that "this results in the vehicle proceeding through red lights, and stopping at green lights." 

"Tesla doesn't want to fix it, or even acknowledge the problem, even though they've done a test drive with me and seen the issue with their own eyes," the complaining driver said, per CNBC. 

Despite long-standing issues with the technology, Tesla has boasted that its Full Self-Driving "will drive you almost anywhere with your active supervision, requiring minimal intervention," according to CNBC.

The investigation into FSD came about a month after U.S. Sens. Ed Markey and Richard Blumenthal requested that NHTSA take a closer look at the problematic technology, per CNBC. 

Why is it important?

Proponents of autonomous vehicles have long claimed that the technology has the potential to be safer than human drivers. However, the complaints logged to NHTSA, as well as other incidents, have clearly demonstrated that self-driving technology still has a long way to go to reach that standard. 

Indications are that failures are not rampant, and it's important to note that Tesla still labels FSD as requiring driver supervision. The complaints make the case, though, that FSD increases risk over operating without the software. When FSD does fail, the logic goes, it not only puts the drivers and passengers inside the vehicle at risk of serious injury or death but also poses a threat to those in other vehicles, as well as pedestrians, cyclists, and anyone else in the vicinity. 

Failures of autonomous-driving technology also raise the still-unanswered question of who is liable when the technology causes a serious incident.

As lawmakers and regulators have failed to keep pace with the technology's advancements and its rollout on public roadways, juries have begun to have their say as to who they deem accountable. 

In one high-profile case, a Florida jury ordered Tesla to pay $243 million in damages for an incident involving a Tesla operating on Autopilot — a precursor to FSD — that killed one person and left another seriously injured. 

What's being done about it?

With the NHTSA investigation having been initiated and the pressure from U.S. Senators and others increasing, there is hope that Tesla will be forced to rein in its Full Self-Driving technology and be held accountable for its failures. 

In the meantime, drivers need to be aware of the technology's shortcomings and remain vigilant when operating vehicles, even those equipped with so-called "autonomous" features. 


