Can Taylor Swift Save Humanity From AI’s Dark Side?
The singer’s clout may finally tip the scales toward effective laws and spotlight how generative AI is doing more harm than good.
(Bloomberg Opinion) -- The recurring story of new technology is unintended consequences. Take AI-powered image generators. Their creators claim the tools enhance human imagination and make everyone an artist, but they rarely mention how much those same tools are helping to create illicit deepfake pornography. Lots of it. Over the weekend, X had to shut down searches for “Taylor Swift” because the site formerly known as Twitter had been flooded with so many faked porn images of the singer that it couldn’t weed them all out. One image alone was viewed more than 45 million times before being taken down. The Swift scandal points to a broader problem: around 96% of deepfakes on the web are pornographic. But it could also be the tipping point that finally prompts some genuine solutions.
Enough has happened in January alone to show that in the absence of proper regulation, the harms of generative AI are starting to outweigh the benefits. The technology is being used in more scams and bank frauds, it’s making Google search results worse and it’s duping voters with fake robocalls from President Joe Biden. But the attack on Swift shows where generative AI’s toxic effects are most insidious: it is creating whole new groups of victims and abusers in a marketplace for unauthorized, sexualized images. Such attacks point to the quieter but no less damaging way that generative AI has been undermining the dignity of women, churning out images that are sexualized by default, with the situation worse still for women of color.
Deepfakes epitomize the problem, and until Swift, that problem had been flying under the radar: Over the past year, high school students have used real photos of their female classmates to create deepfake porn. In one small town in Spain, a group of boys used AI tools to digitally “undress” social media images of more than 20 girls aged between 11 and 17, then distributed the results on WhatsApp and Telegram. Fake porn has been possible for more than two decades thanks to software like Photoshop, but only now has it become quick and easy for anyone to produce, with apps that can swap one person’s face onto another body, for instance.
But more people and authorities are taking notice now that the latest victim is Swift, Time magazine’s Person of the Year, who helped add 0.5 percentage points to US GDP, who went to war with streaming services from companies like Apple Inc. and won, and who boosted football’s female viewership. “We are alarmed by the reports of the circulation of false images,” White House Press Secretary Karine Jean-Pierre said Friday. “We are going to do what we can to deal with this issue.”
Lawmakers are up in arms. Swift’s legions of fans got the phrase “protect Taylor Swift” trending on X, and some resorted to vigilante justice, unmasking a Twitter user reportedly behind many of the illicit images. Swift herself is said to be considering legal action against a deepfake porn site that published some of them, according to The Daily Mail.
Swift isn’t one to do things in half measures, so we may see more than just a lawsuit. Perhaps she will put her weight behind some of the bills already making their way through Congress that tackle unauthorized deepfakes. One bill makes it illegal to create and share such images, while another proposes five years in prison for perpetrators, as well as legal recourse for victims.
It’s easy to shrug and argue that the cat is out of the bag. The tools have proliferated, some of them open source, and as social media networks like Twitter have cut back on their trust and safety teams, it’s easier than ever for the images to go viral. When Microsoft Chief Executive Officer Satya Nadella was recently asked about the Swift deepfakes, he jumped straight into platitudes about “guardrails” rather than make any specific policy recommendations; that may be because his firm is at the heart of today’s booming generative AI business. At least some of the Swift likenesses were created on an image generator from Microsoft, according to 404 Media.
There is hope for a solution. Some of the measures going through Congress are a start, and while the long-term rules are still being ironed out, authorities can get a handle on the situation for now by making examples of some of the worst perpetrators. Deterrents can work, even for people who think they can hide behind the cloak of online anonymity. A prime example is the online hacktivist group Anonymous, whose activities died down almost immediately after a handful of its best-known hackers were arrested and named about a decade ago. One Twitter user has already admitted to posting some of the first images that went viral, saying “Bro what have I done… They might pass new laws because of my Taylor Swift post,” before setting his account to private, according to Newsweek.
Well before Swift became a victim, many young women without her kind of influence were experiencing the psychological distress of being targeted. They’ve had to pick up the pieces after watching their reputations get humiliatingly tarnished online, and to live with the long-term consequences. Unauthorized deepfake porn is no joke: it’s a form of digital sexual violence that threatens to fuel a broader culture of misogyny and abuse online. Perhaps more than any other woman on Earth, Swift has the clout to help make it stop. Here’s hoping she makes the most of the opportunity.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is author of “We Are Anonymous.”
©2024 Bloomberg L.P.