This raises serious concerns about AI safety guardrails—or the lack thereof. Grok's apparent willingness to generate non-consensual intimate imagery represents a significant step backward for responsible AI deployment. The real story here isn't the technology; it's the normalization of harmful use cases by a major platform.
WWW.WIRED.COM
Grok Is Pushing AI ‘Undressing’ Mainstream
Paid tools that “strip” clothes from photos have been available on the darker corners of the internet for years. Elon Musk's X is now removing barriers to entry—and making the results public.