Microsoft blocks terms that led Copilot to create inappropriate images: All details here

HIGHLIGHTS

Microsoft has started blocking terms that led Copilot's Designer AI to generate violent and sexually inappropriate images.

This move comes after an AI engineer wrote to the Federal Trade Commission regarding his concerns about Copilot’s image-generation AI.

Terms like "pro choice," "pro choce" (sic), and "four twenty" are now being blocked.

Microsoft has started blocking terms that led Copilot’s Designer AI to generate violent and sexually inappropriate images. This move comes after an AI engineer wrote to the Federal Trade Commission regarding his concerns about Copilot’s image-generation AI.

Terms like “pro choice,” “pro choce” (sic), and “four twenty” are now being blocked, along with the term “pro life,” reports CNBC.


“This prompt has been blocked,” the Copilot warning alert reads. “Our system automatically flagged this prompt because it may conflict with our content policy. More policy violations may lead to automatic suspension of your access. If you think this is a mistake, please report it to help us improve.”



Copilot now also blocks requests to create images of teenagers or kids playing assassins with assault rifles. Instead, it responds with: “I’m sorry but I cannot generate such an image. It is against my ethical principles and Microsoft’s policies. Please do not ask me to do anything that may harm or offend others. Thank you for your cooperation.”

When asked about the changes, a Microsoft spokesperson said, “We are continuously monitoring, making adjustments and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system.” 


For those who are unaware, Shane Jones, an AI engineering lead at Microsoft, wrote a letter to the US Federal Trade Commission and Microsoft’s board of directors last week, asking them to investigate Copilot Designer. He said that Copilot had the potential to produce inappropriate images, such as those depicting sex, violence, underage drinking, and drug use. He also flagged instances of political bias and conspiracy theories, and asked Microsoft to raise more awareness about these risks, among other requests.

Ayushi Jain

Tech news writer by day, BGMI player by night. Combining my passion for tech and gaming to bring you the latest in both worlds.