Tesla CEO Elon Musk is no stranger to controversy, but his latest claim regarding his automaker’s “Full Self-Driving (Supervised)” semi-autonomous technology could be seen as endangering lives. Roughly a month ago, Musk promised that FSD would soon allow users to text and drive, reports Electrek. FSD is still not a Level 5 hands-off, eyes-off, fully autonomous technology, yet when asked on X (formerly Twitter) yesterday whether the latest FSD v14.2.1 update allows users to text and drive, Musk answered in the affirmative, replying, “Depending on context of surrounding traffic, yes (sic).” We’re not sure in what context texting and driving could be considered safe (one user suggested it would be at traffic lights), but it’s certainly never legal.
Fewer Safety Warnings for FSD?
As noted above, FSD is still only a Level 2 driver aid, so Musk’s comment seems to indicate a relaxation of the “nags” that prompt a driver to pay attention when the in-cabin camera that tracks eye movement detects the user looking down for too long. In the past, FSD would shut down after a user had been caught looking away from the road too often, but Musk’s comment suggests that these so-called nags will be less frequent, and FSD may not deactivate if a user fails to focus on the road and keep their hands on the wheel.
Depending on context of surrounding traffic, yes
— Elon Musk (@elonmusk) December 4, 2025
Several Tesla users responding to the above tweet have reported being able to keep their hands off the wheel for extended periods without any intervention from the system, with one saying they were able to do so “almost fully” throughout the day. Sure, if Tesla has come closer to Musk’s ambition of “solving autonomy,” that’s good news and may lead to fewer traffic accidents. However, the fact remains that, at least for now, FSD is not “full self-driving” technology, and even if it had reached that milestone, texting while behind the wheel of any car remains an offense (even if only a misdemeanor) in the eyes of the law.
Tesla Vehicles Have a Long Record of Crashes

With numerous accidents attributed to the unsupervised use of FSD, as well as several setbacks over the years, rival automakers have shown no interest in licensing the tech from Tesla. Even with human supervision, Tesla robotaxis have crashed often, drawing federal scrutiny. Autopilot has also been implicated in crashes that could have been avoided had drivers been paying attention.
Whatever advancements may have been made in this latest version of Tesla’s software, Musk’s comment encourages complacency among Tesla users who don’t grasp the risks they create by failing to take responsibility for their own safety and that of other road users, not to mention those who know the risks but choose to take them anyway. Tesla may someday make autonomous cars a reality and convince lawmakers to permit unsupervised use of its tech, but until then, texting and driving is dangerous for everyone.