Woman sues OpenAI claiming ex-boyfriend is harassing her using ChatGPT

HIGHLIGHTS

A woman is suing OpenAI, alleging ChatGPT played a role in her harassment.

The lawsuit claims the chatbot framed the woman as unstable, which led to real-world stalking.

OpenAI allegedly ignored warnings and restored the person's account even after a safety flag.

Cybercrime has long caused serious harm around the world, and advances in AI, from deepfakes onward, have taken these crimes a step further. People have found new ways to harass others, and a new kind of case has emerged that has shocked many. As recently reported by TechCrunch, a woman from California has filed a lawsuit against OpenAI, alleging that the company’s ChatGPT platform played a direct role in driving her ex-boyfriend to stalk and harass her. She also claims that OpenAI repeatedly ignored warnings that the man posed a real-world threat. Let’s take a look at what happened.


How ChatGPT was used for harassment

The woman says she broke up with her ex-boyfriend in 2024, and that after the split he used ChatGPT to process the breakup. But rather than simply comforting him, the chatbot allegedly and repeatedly framed her as manipulative and unstable. She claims these AI-generated conclusions were then used to justify real-world harassment.

What makes this case particularly striking is that OpenAI’s own automated systems raised alarms. Last year, OpenAI’s automated safety system flagged the man for “Mass Casualty Weapons” activity and deactivated his account. But a member of the human safety team reviewed the account the next day and restored it.

Back in November last year, when the victim filed a complaint through OpenAI’s Notice of Abuse process, the company responded, acknowledging the report was “extremely serious and troubling” and saying it would carefully review the information. But the victim never heard back from the company.

What the victim is seeking

As of now, the victim has filed for a temporary restraining order asking the court to force OpenAI to block the user’s account and prevent him from creating new ones. She also wants the company to inform her if he attempts to access ChatGPT, and to preserve his complete chat logs for discovery. OpenAI has not publicly responded to the lawsuit; it has suspended the user’s account but has not agreed to the other demands.

The lead attorney on the case said, “In every case, OpenAI has chosen to hide critical safety information — from the public, from victims, from people its product is actively putting in danger.” She added, “We’re calling on them, for once, to do the right thing. Human lives must mean more than OpenAI’s race to an IPO.”


Madhav Banka

Madhav works as a consultant at Digit, covering news, branded and feature stories. He has been writing about tech and video games since 2020. While not busy working, you'll usually find him roaming around Delhi in hopes of getting good pictures, playing video games or watching films and F1 during weekends.
