
Woman recounts dehumanizing experience after discovering disturbing use of Elon Musk's Grok AI: 'Women are not consenting to this'

"It felt as violating as if someone had actually posted a nude or a bikini picture of me."


Generative artificial intelligence tools have made nonconsensual digital image manipulation faster and easier to scale. According to the BBC, one woman discovered realistic, digitally altered images of herself circulating online without her knowledge or approval.

What's happening?

According to the report, freelance journalist Samantha Smith said she felt "dehumanized and reduced into a sexual stereotype" after users on the social platform X prompted Grok, the platform's AI chatbot developed by xAI, to digitally remove clothing from images resembling her.

Smith said the altered images were generated without her consent and shared publicly across the platform. The BBC said it reviewed multiple examples of users asking Grok to place women into bikinis or sexualized scenarios. xAI did not respond substantively to the BBC's request for comment, instead sending an automated reply stating "legacy media lies."

When Smith posted about her image being altered, she was met with even more generated images, along with responses from other women who said they had experienced the same harassment.

"Women are not consenting to this," Smith said to the BBC.

Grok has also drawn scrutiny over other nonconsensual image production; according to a statement from the company reported by the Guardian, Grok may have violated U.S. law on the production of sexual content featuring minors by generating inappropriate images of children at users' behest.

Why is taking action to reduce this crime important?

Clare McGlynn, a law professor at Durham University, told the BBC that platforms could prevent this kind of abuse but fail to act. It's been documented that AI systems can amplify harmful behavior when safeguards are weak, including reinforcing bias and enabling misuse at scale.

Unchecked AI use also carries a social cost: it can normalize behavior that would otherwise be widely condemned, from spreading misinformation and reinforcing bias to enabling dangerous practices such as poaching.

AI systems are also closely tied to the energy grid, according to data compiled by AllAboutAI. Training and running large AI models require significant electricity and water use, which strains local resources and drives up energy costs. At the same time, AI can help optimize clean energy systems when used responsibly. 

What's being done about protections?

A United Kingdom Home Office spokesperson said the government is seeking to ban the use of AI to digitally undress others and to make supplying such tools a criminal offense. Ofcom told the BBC that creating or sharing nonconsensual intimate images, including AI-generated ones, is illegal in the U.K., and that platforms have a duty to reduce the risk of harassment for their users.

Although the burden should fall on these companies, one way individuals can help reduce AI-related risks is to switch to a bank that prioritizes sustainable and ethical practices, including responsible AI use.

"While it wasn't me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me," Smith said, per the BBC. 
