Deepfake explicit images of Taylor Swift spread on social media. Her fans are fighting back


PTI, Jan 27, 2024, 9:00 AM IST

A scourge of pornographic deepfake images generated by artificial intelligence and sexualizing people without their consent has hit its most famous victim, singer Taylor Swift, drawing attention to a problem that tech platforms and anti-abuse groups have struggled to solve.

Sexually explicit and abusive fake images of Swift began circulating widely this week on the social media platform X. Her ardent fanbase of "Swifties" quickly mobilized, launching a counteroffensive on the platform formerly known as Twitter and a #ProtectTaylorSwift hashtag to flood it with more positive images of the pop star. Some said they were reporting accounts that were sharing the deepfakes.

The deepfake-detecting group Reality Defender said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X. Some images also made their way to Meta-owned Facebook and other social media platforms. “Unfortunately, they spread to millions and millions of users by the time that some of them were taken down,” said Mason Allen, Reality Defender’s head of growth. The researchers found at least a couple dozen unique AI-generated images. The most widely shared were football-related, showing a painted or bloodied Swift that objectified her and in some cases inflicted violent harm on her deepfake persona.

Researchers have said the number of explicit deepfakes has grown in the past few years, as the technology used to produce such images has become more accessible and easier to use. In 2019, a report released by the AI firm DeepTrace Labs showed these images were overwhelmingly weaponized against women. Most of the victims, it said, were Hollywood actors and South Korean K-pop singers.

When reached for comment on the fake images of Swift, X directed the AP to a post from its safety account that said the company strictly prohibits the sharing of non-consensual nude images on its platform.

“Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them,” the company wrote in the X post early Friday morning. “We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.” Meanwhile, Meta said in a statement that it strongly condemns “the content that has appeared across different internet services” and has worked to remove it. “We continue to monitor our platforms for this violating content and will take appropriate action as needed,” the company said. A representative for Swift didn’t immediately respond to a request for comment Friday.

Allen said the researchers are 90% confident that the images were created by diffusion models, which are a type of generative artificial intelligence model that can produce new and photorealistic images from written prompts. The most widely known are Stable Diffusion, Midjourney and OpenAI’s DALL-E. Allen’s group didn’t try to determine the provenance.

Federal lawmakers who have introduced bills to further restrict or criminalize deepfake porn indicated the incident shows why the U.S. needs to implement better protections. "For years, women have been victims of non-consensual deepfakes, so what happened to Taylor Swift is more common than most people realize," said U.S. Rep. Yvette D. Clarke, a Democrat from New York who has introduced legislation that would require creators to digitally watermark deepfake content.

"Generative AI is helping create better deepfakes at a fraction of the cost," Clarke said. U.S. Rep. Joe Morelle, another New York Democrat pushing a bill that would criminalize sharing deepfake porn online, said what happened to Swift was disturbing and has become more and more pervasive across the internet.

“The images may be fake, but their impacts are very real,” Morelle said in a statement. “Deepfakes are happening every day to women everywhere in our increasingly digital world, and it’s time to put a stop to them.”
