Three Tennessee teenagers discovered AI-generated nude images of themselves circulating on Discord and Telegram. One found the material after an anonymous Instagram message linked to altered versions of her high school yearbook photo — clothed in the original, naked in the fake, depicted in sexually explicit acts she never performed.
They are suing Elon Musk and xAI.
The Lawsuit
The class action, filed March 17 in federal court in California, names xAI as a producer and distributor of child sexual abuse material. The three plaintiffs — identified as Jane Does 1, 2, and 3 — allege that a perpetrator used a third-party app powered by xAI’s Grok to generate the images. That perpetrator has been arrested. But the lawsuit’s target is the company that built and licensed the underlying technology.
The complaint is blunt about the mechanism. A system capable of stripping clothing from any photograph cannot reliably distinguish between adults and children. “Like a rag doll brought to life through dark arts, this child can be manipulated into any pose, however unlawful,” the filing states.
The images were not labeled as AI-generated. They looked real. They were traded on private servers alongside other child sexual abuse material involving at least 18 other minors.
The “Spicy” Choice
This is where the story shifts from a technology failure to a corporate decision. xAI launched Grok’s “Spicy Mode” in mid-2025 as a deliberate selling point — a premium feature marketed to paying subscribers, positioned as a competitive advantage over rivals with tighter content restrictions. The feature allowed users to remove clothing from photographs of real people without consent.
The scale was staggering. According to the Center for Countering Digital Hate, Grok generated approximately three million sexually explicit images over an 11-day period spanning late December 2025 and early January 2026. Roughly 23,000 of those appeared to depict children.
When criticism mounted in January, Musk suggested responsibility lay with users, not with the tool. He said he was unaware of instances involving minors. xAI’s response was not to disable the feature but to restrict it — image generation became available only to paying subscribers, effectively turning the capacity for non-consensual intimate imagery into a paywalled product.
xAI did not respond to requests for comment on the lawsuit.
The EU Closes the Loophole
Musk’s defense — blame the user, not the provider — may soon lose its legal footing in Europe. On March 18, the European Parliament’s Civil Liberties and Internal Market committees voted 101–9 to approve an amendment to the EU’s Artificial Intelligence Act that would ban any AI system generating non-consensual sexually explicit images of identifiable people.
The critical shift: liability falls on providers and developers, not end users. Companies distributing nudification tools must remove the functionality or face fines of up to seven percent of global annual turnover. For xAI, that means “Spicy Mode” either gets defanged or Grok gets pulled from the EU market entirely.
“This is a huge win, especially for women and children in Europe,” said Kim van Sparrentak of the Greens group. A full parliamentary vote is scheduled for March 26, and with member states already aligned on similar language, the ban is expected to pass.
What Comes Next
The plaintiffs’ attorney, Vanessa Baehr-Jones, framed the lawsuit’s goal as changing xAI’s business calculus: “We want to make it one that does not make any business sense anymore.” If the financial and legal costs of enabling this content come to outweigh the revenue from “Spicy Mode” subscribers, the argument goes, xAI will make a different calculation.
The EU amendment reinforces the same logic from the regulatory side. The question is no longer whether individual users misused a tool. It is whether a company that built and marketed that tool — knowing what it could produce — bears responsibility for the result.
For three teenagers in Tennessee, the answer is not abstract.
Sources
- Teens sue Musk’s xAI over Grok’s pornographic images of them — BBC News
- EU moves to ban nudify apps after Grok made them mainstream — Ars Technica
- Tennessee teens sue Elon Musk’s xAI over AI-generated child sexual abuse material — NPR
- Teens Sue Elon Musk’s Grok for Turning Their Photos into Pornographic Images — PetaPixel
- EU lawmakers back ban on sexualised AI deepfakes — Brussels Signal
- Teens Sue xAI, Allege Grok Was Used to Create Sexualized Deepfakes — SFist
- EU: Parliament calls for a ban on apps that strip people — EUNews