Shane Jones, an engineer who has worked in Microsoft’s artificial intelligence department for six years, has filed a report with the US Federal Trade Commission raising concerns about safety issues with the company’s AI image generator.

According to Jones, Microsoft refused to take down Copilot Designer despite repeated warnings that the tool was capable of generating harmful images.

While testing Copilot Designer for safety issues and bugs, Jones discovered that the tool produced demons and monsters alongside abortion rights terminology, teenagers with assault weapons, sexualized images of women in violent tableaus, and depictions of underage drinking and drug use.

In addition, Copilot Designer allegedly generated images of Disney characters, such as Elsa from Frozen, in scenes set in the Gaza Strip in front of broken buildings and "free Gaza" signs. It also generated images of Elsa wearing an Israel Defense Forces uniform and holding a shield bearing the Israeli flag.

Jones has been trying to warn Microsoft about the DALL-E 3 model used by Copilot Designer since December. He posted an open letter about the issues on LinkedIn, but was reportedly contacted by Microsoft's legal team, which asked him to take the post down, and he complied.

“Over the past three months, I have repeatedly called on Microsoft to remove Copilot Designer from public use until better safeguards can be put in place,” Jones wrote. “Again, they have failed to implement these changes and continue to market the product to everyone. Anywhere. Any device.”

Microsoft spokesman Frank Shaw said the company is committed to addressing all employee concerns in accordance with Microsoft policy: "When it comes to issues that could potentially impact our services or our partners, we have established in-product user feedback tools and robust internal reporting channels to properly investigate, prioritize and resolve any issues."

Source: The Verge