
"Spicy" Deepfakes of Stars Like Taylor Swift and Scarlett Johansson Are Elon Musk's Newest AI Frontier

by - August 15, 2025


Elon Musk’s journey from AI alarmist to AI evangelist has played out noisily on X/Twitter, but now the mercurial billionaire is touting the latest frontier for the technology: a rapid video generation tool that is being used to create deepfake nudes of celebrities.

Grok Imagine was rolled out this week to Apple users and has gone “hyperviral,” with more than 34M images generated in the space of 48 hours, according to Musk. Available through a $30 SuperGrok subscription, the tool lets users type a text prompt and generates a still image almost instantly. These images can then be turned into short videos using one of four presets: “Custom,” “Normal,” “Fun,” and “Spicy.”

Of the four, “Spicy” mode quickly garnered attention, turning seemingly innocuous image prompts into lewd snippets. Unlike other generative AI video tools that block celebrity text prompts, Grok Imagine has allowed users to imagine some of the most famous women on the planet in a state of undress. The Verge was among the first to point this out, revealing how Grok Imagine created nude videos of Taylor Swift from the text prompt “Taylor Swift celebrating Coachella with the boys.”

Several hours after The Verge published its report, Deadline tested whether Grok Imagine’s moderators had managed to erect any guardrails around NSFW videos of celebrities. Spoiler alert: they had not.

To test the app, we chose a neutral prompt placing stars on a red carpet: a well-trodden public arena where actors are in control of how they look. Some prompts involved subjects, both female and male, who have been vocal critics of generative AI. In theory, people who have spoken about the human cost of exploitative technology should give anyone deploying AI pause before using their likenesses.

Musk’s Grok feature also arrives amid a loud entertainment industry debate about AI that has followed the 2023 Hollywood strikes, and shortly after President Donald Trump signed the Take It Down Act, legislation that introduced criminal penalties for distributing non-consensual intimate imagery. Vocal advocates of the law include Melania Trump, who has not been immune to Grok’s deepfakes, as Gizmodo observed.
