Elon Musk’s xAI faces lawsuit from minors alleging Grok created their explicit AI images

HIGHLIGHTS

Elon Musk’s xAI is facing a lawsuit after three anonymous plaintiffs accused its Grok AI models of generating sexually explicit images of them.

One of the plaintiffs says photos from her high school homecoming and yearbook were altered using Grok to show her unclothed.

Another plaintiff was informed by criminal investigators that altered explicit images of her had been created through a third-party mobile app that uses Grok models.

Elon Musk’s artificial intelligence company xAI is facing a lawsuit in the US after three anonymous plaintiffs accused its Grok AI models of generating sexually explicit images of them. The case was filed on Monday in a federal court in California. The plaintiffs, identified as Jane Doe 1, Jane Doe 2, and Jane Doe 3, have asked the court to allow the lawsuit to proceed as a class action. If approved, the case could represent anyone whose real childhood images were turned into sexual content using Grok.

According to the complaint, as reported by TechCrunch, xAI failed to implement basic safety measures that many other AI image generators use to prevent the creation of sexual images involving real people.


The complaint argues that when a system allows nude or erotic content to be generated from real photos, it becomes extremely difficult to stop users from producing sexual images of children. The lawsuit also cites Musk's public promotion of Grok's ability to create sexualised images and depict real people in revealing outfits.

One of the plaintiffs, Jane Doe 1, says photos from her high school homecoming and yearbook were altered using Grok to show her unclothed. She reportedly learned about the images after an anonymous person contacted her on Instagram and shared a link to a Discord server where sexualised images of her and other minors from her school were circulating.

Another plaintiff, Jane Doe 2, was informed by criminal investigators that altered explicit images of her had been created through a third-party mobile app that uses Grok models. Jane Doe 3 also learned about a manipulated pornographic image of her after investigators discovered it on the phone of a suspect they had arrested.


Lawyers for the plaintiffs argue that even when Grok models are used by third-party apps, they still rely on xAI's code and servers, which means the company should be held responsible.

Two of the plaintiffs are still minors. All three say the spread of these images has caused severe emotional distress and could damage their reputations and social lives. The lawsuit is seeking civil penalties under laws meant to protect children from exploitation and hold companies accountable for negligence.

Ayushi Jain

Ayushi works as Chief Copy Editor at Digit, covering everything from breaking tech news to in-depth smartphone reviews. Prior to Digit, she was part of the editorial team at IANS.