Federal Lawsuit Aims at Elon Musk’s xAI After Grok Reportedly Generated Explicit Images of Minors
A lawsuit filed Monday in the U.S. District Court for the Northern District of California is putting Elon Musk and his artificial intelligence company under sharp scrutiny. Three anonymous plaintiffs allege that xAI's Grok AI models were used to create sexually abusive images from real photos taken of them as children. The complaint claims the company failed to implement basic safety measures that other AI developers use to prevent their systems from generating explicit content depicting identifiable individuals and children.
The case, Jane Doe 1, Jane Doe 2, a minor, and Jane Doe 3, a minor, v. x.AI Corp. and x.AI LLC, was first reported by TechCrunch. The plaintiffs are seeking class-action status, which could extend the case to anyone whose childhood photos were allegedly turned into sexual imagery by Grok.
The filing notes that other AI image generators incorporate safeguards designed to prevent the creation of child sexual abuse material from real photographs; the lawsuit asserts that xAI failed to adopt those industry standards.
One plaintiff, identified as Jane Doe 1, alleges that Grok was used to alter images from her high school homecoming and yearbook to depict her nude. She learned of the images when someone contacted her on Instagram with a link to a Discord server where sexualized versions of her photos were being shared alongside images of other minors from her school.
The complaint describes two further incidents. Jane Doe 2 says criminal investigators informed her that altered sexual images of her had been created through a third-party mobile app built on Grok models. Jane Doe 3 says investigators found a pornographic image of her on the phone of a person they had taken into custody.
Even where third-party applications are involved, the plaintiffs' attorneys contend, the technology still relies on xAI's underlying code and servers.
The three plaintiffs say the spread of these images has caused severe emotional distress and anxiety about lasting harm to their reputations and social lives. They are asking the court to impose civil penalties under laws protecting children from exploitation and to address claims of corporate negligence.
xAI did not respond to a request for comment.