Earlier this month, the Supreme Court quietly refused to take up Doe v. Grindr, a case that could have been the first real test of Big Tech’s legal immunity under Section 230 of the Communications Decency Act.
This decision closed the door on justice for my client, who was raped at age 15 by four adult men he met through Grindr. By refusing the case, it also guaranteed injustice and harm for millions more.
This was not a random crime. It was the predictable result of Grindr’s reckless business model that prioritizes profit over child safety. Grindr deliberately targeted young audiences on Instagram and TikTok, running content on its own pages that featured teenagers in high school and junior high settings, including in PE class.
The app requires no photo, name or age verification and recommends young members to anyone nearby. That anonymity makes it easy for minors to blend in and for predators to find them. Researchers have found that half of all sexually active gay kids have their first sexual experience with an adult they met on Grindr.
Our lawsuit presented evidence that Grindr knew children were using its platform and that predators were targeting them. The company chose to ignore these dangers because underage users increase engagement and ad revenue.
There has been a national reckoning with other institutions that turned a blind eye to child exploitation. But what will it take to provide justice to more recent victims?
Three of the adult men who assaulted my client are now serving federal prison sentences. But Grindr, the company that connected them with a child, faces no consequences. The perpetrators are behind bars, while the platform that enabled their crimes remains untouchable.
Grindr’s defense rests on Section 230, a 1996 law originally intended to protect internet companies from being sued for defamatory words that users post. That narrow protection has since expanded into a near-blanket immunity that covers almost every aspect of how platforms operate.
In other words, what began as a law to nurture the early internet has become a legal shield for billion-dollar corporations.
At my firm, we have seen Section 230 used again and again to block survivors from pursuing justice. In the instances where judges refused to dismiss our cases against tech companies, we reached the discovery phase of litigation and saw the full extent of the corporate misfeasance, as in the case that shuttered the child-trafficking platform Omegle and in our lawsuit against Amazon for selling and delivering a suicide drug to children.
In many cases, though, judges have said their hands are tied. Once Section 230 applies, the company is untouchable.
With Doe v. Grindr, we were not asking the Supreme Court to erase Section 230 or restrict online speech. We were asking that the black-letter law be followed. Section 230 protects platforms for their editorial decisions about how they moderate content, but not for their boardroom decisions about how their product functions. The code and design choices behind an app are no different from the engineering decisions behind a physical product. When those choices put people in danger, product liability law ought to provide a path to justice.
Some members of the court have acknowledged that Section 230’s reach has gone too far. Justice Clarence Thomas has indicated that courts have expanded the law well beyond Congress’s intent. Yet by refusing to hear Doe v. Grindr, the court left victims like my client without recourse and further empowered tech companies to design products without concern for who gets hurt.
There’s little hope these companies will self-regulate. So unless the Supreme Court steps up, reform is a job for Congress.
The good news is that there is bipartisan agreement that Section 230 is overdue for reform. Last year, the House introduced a bill to sunset Section 230. The EARN IT Act, led by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), would hold platforms accountable when they facilitate child sexual exploitation. Another recent bill, the AI LEAD Act, would let victims sue AI companies by treating their systems as products.
Moreover, in 2020, the Department of Justice proposed a set of updates to Section 230, including carve-outs for platforms that knowingly facilitate child abuse, terrorism or cyberstalking. These changes would preserve the internet’s openness while ensuring companies cannot profit from negligence or exploitation.
The Supreme Court should seize the next opportunity to define the limits of Section 230. In the meantime, Congress should pass meaningful legislation to ensure that companies like Grindr can no longer profit from endangering children.
Both branches have the power to protect families and restore justice to those harmed by technology. They have a moral obligation to act before more children are harmed.
Carrie Goldberg is the founder of C.A. Goldberg, PLLC, a law firm that litigates against Big Tech companies and powerful predators. She is the author of “Nobody’s Victim” (2019), a New York Times Editor’s Choice.