
Opinion: Taylor Swift deepfake photos on Twitter are a haunting reality we were already warned about

Taylor Swift falling victim to cruel, sexually explicit AI images on Twitter is just the tip of a dangerous iceberg threatening women's standing in society and their safety online.

The celebrity was subjected to vile doctored photos that stormed the social media platform last week. The singer, a woman, was reduced to an object for no reason other than being recognized around the world. But the message this Twitter meltdown screams is about something much larger.

Protect Taylor Swift – but what about the other victims?

After the artificial intelligence (AI) images of Taylor Swift appeared on Twitter, her loyal fans went to work. They were busy reporting them and even got the hashtag Protect Taylor Swift trending – and rightly so. But not every celebrity, let alone the average person on the street, has an army of devoted fans to answer the call when needed.

The only reason these images were made in the first place is that she's a household name: a hard-working, talented individual who rose to the top by working smarter and harder. She was named Time's Person of the Year in December and consistently storms the Billboard charts. Universities have even created degrees and courses where students can study the Taylor effect.

Her relationship with Travis Kelce has elevated her fame and dominated news headlines, but it doesn't mean she should be a target. The images revolved around her and the Kansas City Chiefs, as if her latest serious relationship somehow gave people permission to do this.

And if someone as famous and wealthy as Taylor Swift can be the victim of AI photos, what does that say for women just trying to get through their daily lives, without a legal team, resources or influence at their fingertips? By targeting someone at the top, the people who made, shared and commented on these vile photos showed exactly how they feel about every woman below her.

‘Twitter dropped the ball a long time ago’

The horrible photos last week caused a big stir, with the platform, which Elon Musk bought in 2022 for $44 billion, temporarily shutting down the ability to search her name. But the billionaire has fired more than 80% of the staff since his takeover, meaning there are fewer employees on the teams that crack down on this vile content. Perhaps if the workforce hadn't seen almost 6,500 people lose their jobs, the sheer volume of AI photos of Taylor Swift would never have reached the Twittersphere – but it did.

It's believed some of the mass firings included large teams responsible for removing dangerous and harmful content – teams Taylor Swift would have benefited from had they still been there when the AI deepfakes went viral.

The platform "will continue to be vigilant for any attempt to spread this content and will remove it if we find it," Joe Benarroch said in a statement earlier this week. So how did one vile image, which was seen 47 million times before the account was suspended, slip through the net?

Even the White House got involved and spoke about the incident, calling the fake photos "alarming" and stating that social media companies have a responsibility to prevent the spread of misinformation.

Not only that, Musk declares himself a champion of free speech – but how far does that go?

Last year, after his rant about free speech, it was feared hate speech and disinformation would go viral – and look what has happened. Since the platform introduced a paid plan for verified accounts, anyone can buy a blue tick, which used to represent official figures, governments and bona fide journalists.

Now, days after the online attack on the music star, Twitter (now called X) is allowing people to search for Taylor Swift again, having limited the option to stop the spread of the AI images. Do we just count down the days until it happens again?

Twitter’s AI images ‘want to put Taylor Swift back in her box’

Campaigner and author Laura Bates, who wrote the book Men Who Hate Women, has been subjected to this as well. She says men have sent her deepfake images edited with her face to make it look as if she were performing sex acts on them.

“There’s something really visceral about seeing an incredibly hyper-realistic image of yourself in somebody’s extreme misogynistic fantasy of you,” she told The Guardian. “There’s something really degrading about that, very humiliating. It stays with you.”

She also weighed in on Taylor Swift being another victim of AI on Twitter. She said it’s “just the new way of controlling women. You take somebody like Swift, who is extraordinarily successful and powerful, and it’s a way of putting her back in her box. It’s a way of saying to any woman: it doesn’t matter who you are, how powerful you are – we can reduce you to a sex object and there’s nothing you can do about it.”

Reports last week claimed the celebrity was considering 'legal action' against the site that published the photos. But neither Taylor herself nor her representatives have spoken out about it.

Celebrity Tidbit and GRV Media have approached Taylor Swift for comment.

New bill aims to stop AI images like those of Taylor Swift

Now, lawmakers in the US are proposing to allow victims to sue over fake AI images and deepfake photos, but no such law is yet in place for people like Taylor Swift.

The DEFIANCE Act (Disrupt Explicit Forged Images and Non-Consensual Edits) would add a civil right of action allowing victims to collect financial damages from anyone who "knowingly produced or possessed" the photo with intent to spread it.

Senate Majority Whip Dick Durbin introduced the bill, joined by Sens. Lindsey Graham, Amy Klobuchar and Josh Hawley. It follows on from the Violence Against Women Act Reauthorization Act of 2022, which covers real, non-faked images. The proposal even references Taylor Swift's AI case on Twitter, saying deepfakes can be "used to exploit and harass women – particularly public figures, politicians and celebrities".

The bill still faces an uphill battle to be passed, though: no bill covering AI-generated, non-consensual imagery has yet made it into law.

With technology and AI at the forefront of so many people's minds over the past year or two, why are we only taking action now? Lawmakers have had plenty of time to address these risks before it had to come to this.

Taylor Swift, and women around the globe, have already been let down.



Jenniffer Sheldon

Update: 2024-03-17