“Mad Max mode” may sound like something out of a video game, but it is a real-life setting for cars currently plying America’s streets. And it poses genuine danger.
In an homage to the main character from George Miller’s dystopian 1979 film and its sequels, originally portrayed by current Trump supporter Mel Gibson, Tesla created Mad Max mode as an option for vehicles equipped with its “Full Self-Driving” (FSD) system. The Mad Max icon is a mustachioed smiley face wearing a cowboy hat, bearing less of a resemblance to the film’s titular vigilante than to Tesla CEO Elon Musk’s brother, Kimbal. (Warner Bros., which released the films, has not filed suit.)
Despite its name, FSD does not enable the car to drive itself. Rather, it is an advanced driver-assistance system (ADAS), capable of changing lanes, making turns, and adjusting speed as long as a human driver remains alert and ready to take over. Other automakers, such as Ford and GM, also offer ADAS.
Mad Max mode is starkly different from other FSD settings like “Sloth” and “Chill.” Teslas using it will roll through stop signs and blast past other vehicles on the road. One driver posted a YouTube video showing his Mad Max-enabled Tesla hitting 82 mph while whizzing by a 65 mph speed limit sign. A social media user wryly suggested that Mad Max “should just immediately write you a ticket when you turn it on.”
Tesla made Mad Max mode available briefly in 2018 and then reintroduced it in October. The National Highway Traffic Safety Administration quickly announced a safety investigation; the agency declined to give an update on its status.
Musk’s company is not the only one programming its vehicles to treat traffic laws as suggestions rather than requirements.
Waymo’s robotaxis (which, unlike ADAS such as Tesla FSD, do not require anyone in the front seat) have been spotted in San Francisco blocking bike lanes and edging into crosswalks where children are walking. In a recent Wall Street Journal story titled “Waymo’s Self-Driving Cars Are Suddenly Behaving Like New York Cabbies,” a Waymo senior director of product management confirmed that the company has programmed its cars to be more aggressive. He said that recent adjustments are making its robotaxis “confidently assertive.”
Welcome to our brave new computer-powered future, where companies will determine which road rules are obeyed and which are ignored. We might not like what they decide.
Mad Max, unleashed
Traffic laws occupy a curious niche in the U.S., where most drivers break them regularly and without consequences.
“There is this built-in acknowledgment that going 5 miles per hour over the limit is okay,” says Reilly Brennan, a partner at Trucks Venture Capital, a transportation-focused investment firm. “In other parts of our life, that wouldn’t be acceptable, like going 5% over in accounting or when a doctor performs some kind of task.”
Indeed, many otherwise law-abiding drivers occasionally change lanes without using a turn signal or double park while grabbing coffee, knowing that these behaviors are technically illegal, but believing they are unlikely to result in a crash or fine.
Driving more than 25 mph over the speed limit is a different story. Most people avoid doing so unless, say, rushing a child to the hospital, given the risk of getting into a crash or receiving a pricey ticket.
But unlike humans, robotaxis and ADAS can violate traffic laws regardless of situational context. “You’ve taken away the agency of the person to decide whether it’s reasonable to break the law at that time,” says Phil Koopman, professor emeritus of electrical and computer engineering at Carnegie Mellon, who has studied autonomous driving extensively.
Furthermore, companies like Tesla and Waymo may be shielded from the consequences of both minor and major traffic violations. The driver of a Tesla running FSD, for instance, is expected to remain alert and ready to take over, and the company claims that the driver—not Tesla—is liable for mishaps or collisions.
“You have a company deciding to break the law, but the driver is being held responsible and suffering the consequences,” Koopman says. Last August, a Florida jury rejected Tesla’s attempts to pin crash responsibility on drivers alone, awarding $243 million to the family of a person struck and killed by a Tesla running Autopilot, the company’s less advanced ADAS. Tesla is appealing.
Producers of fully autonomous software shoulder more responsibility for their vehicles’ actions than car companies offering ADAS. Still, accountability isn’t a given for them, either. State law in California and Georgia currently does not allow police to ticket vehicles without a driver, though California will close that loophole next year. (A Waymo spokesperson said the company supported California’s change.)
Everyone’s a road warrior now
Without liability for traffic law violations, companies may program their vehicles to take more risks. Tesla likely launched Mad Max mode to appeal to the company’s hardcore customers, says author and podcaster Edward Niedermeyer, who has written a book about the company’s history and is currently writing a follow-up.
“Tesla has a baseline incentive to release all kinds of weird, quirky, unique software updates that cost them almost nothing and fuel their online fan base,” he says. “Mad Max mode is an example of that, and it happens to also reflect the company’s casual attitude toward public safety.”
Waymo’s robotaxis do not behave nearly as aggressively as Teslas running Mad Max. But the company faces an incentive to turn its assertiveness dial up a bit, if only to match the expectations of its paying passengers, who have become accustomed to violating traffic laws when they themselves sit behind the wheel. Driving “like your grandmother”—as writer Malcolm Gladwell described his Waymo passenger experience in 2021—isn’t exactly a juicy marketing line.
“Consumers think that these systems should drive the way they drive,” Brennan says.
Some circumstances clearly call for rule-breaking, such as crossing a double yellow line to get around a moving van that is being unloaded. “What we’ve learned through more than a hundred million real-world miles is that appropriate assertiveness is crucial for safety and traffic flow,” says a Waymo spokesperson.
But other situations are trickier, such as dropping someone off in a crosswalk or bike lane when no parking spot is available. These behaviors may be common practice among human drivers, but they can endanger other road users and certainly inconvenience them. Last year, Waymo received 589 tickets for illegal parking in San Francisco.
But the public may have limited patience for computer-powered cars that bend traffic rules or cause collisions. Researchers have found that people are more tolerant of risk in activities they can control (like driving) than in those they cannot (like riding in a robotaxi). Case in point: A recent outcry erupted in San Francisco after Waymo vehicles ran over a cat and dog. Of course, countless American pets are killed by human drivers, including the estimated 100,000 dogs who die annually after being placed in truck beds.
These tensions will not dissipate anytime soon, given how furiously makers of ADAS and autonomous vehicles are working to win over customers. Brennan envisions a future where riders might choose from varying levels of robotaxi assertiveness. “Right now, there is just one Waymo setting,” he says. But in a few years, there may be “three or four settings, and one of them is almost exactly like the way that you want to drive.”
For that to happen, humans will have to grow accustomed to self-driving cars zooming past speed limits and playing chicken with pedestrians in crosswalks. Companies are designing their autonomous systems to reflect how humans drive, for better and for worse.