Fake pornographic images of Taylor Swift, created using generative artificial intelligence (AI), have circulated widely on social media in recent days, sparking outrage among the North American political class and the singer’s fans.
One of these images has already been viewed more than 47 million times on the social network X, according to AFP.
Fake pornographic images (known in English as deepfakes) depicting famous women, but also targeting many anonymous people, are nothing new.
But according to many activists and regulators, the development of generative artificial intelligence programs risks creating an uncontrollable flow of degrading content.
However, the fact that the images this time involve Taylor Swift, the world’s second most-streamed artist on Spotify, could help bring the issue to the attention of authorities, given the outrage among her millions of fans.
“The only ‘good thing’ about what happened to Taylor Swift is that she carries enough weight for a law to be passed to eliminate this,” noted Danisha Carter, an influencer with several hundred thousand followers on social media, in a post on X.
X is known for having less strict rules on nudity than Instagram or Facebook.
Apple and Google have the ability to control the content that circulates in apps through the rules they impose on their mobile operating systems, but so far they have tolerated this situation on X.
In a press release, X said it has a “zero tolerance” policy regarding the publication of nude images without consent.
The platform said it was “removing all identified images” of the singer and “taking necessary action against the accounts that posted them.”
Representatives of the American singer have not yet commented on this situation.
“What happened to Taylor Swift is not new: women have been the subject of fake images without their consent for many years,” recalls Democratic Congresswoman Yvette Clarke, who supported legislation to combat this phenomenon.
“Thanks to advances in artificial intelligence, creating such images is becoming easier and cheaper,” she added.
A 2019 study found that 96% of deepfake videos are pornographic in nature.
According to Wired magazine, 113,000 such videos were uploaded to major porn sites in the first nine months of 2023.
Author: Lusa
Source: CM Jornal
