‘Taylor Swift’ searches blocked on X after AI porn outrage
SAN FRANCISCO, United States—Some searches including the term “Taylor Swift” turned up no results on X on Monday, Jan. 29, after the company, formerly known as Twitter, apparently moved to prevent the spread of AI-generated pornographic images using the star’s likeness.
Certain queries attempted by AFP that included the megastar’s name, such as “Taylor Swift AI,” triggered messages saying “something went wrong.”
The Verge and other US media reported that X had put a temporary block on searches using Swift’s name after criticism from her fans, the White House and others over AI-generated pornographic images of her that were shared on the platform and elsewhere online.
X did not respond to a request for comment, but The Verge quoted head of business Joe Benarroch as saying the block on Swift searches was a temporary measure intended to “prioritize safety.”
One fake image of the US megastar was seen 47 million times on X before it was removed Thursday. According to US media, the post was live on the platform for around 17 hours.
“It is alarming,” said White House Press Secretary Karine Jean-Pierre, when asked about the images on Friday.
“Sadly we know that lack of enforcement (by the tech platforms) disproportionately impacts women and they also impact girls who are the overwhelming targets of online harassment,” Jean-Pierre added.
Deepfake porn images of celebrities are not new, but activists and regulators are worried that easy-to-use tools employing generative artificial intelligence (AI) will create an uncontrollable flood of toxic or harmful content.
The targeting of Swift, one of the world’s most-streamed artists, whose latest concert tour has propelled her to new heights of fame in the United States, could shine a fresh light on the phenomenon, with her legions of fans outraged at the development.
X is one of the biggest platforms for porn content in the world, analysts say, as its policies on nudity are looser than those of Meta-owned platforms Facebook and Instagram.
In a statement last week, X said that “posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content.”
The Elon Musk-owned platform said that it was “actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”