X confirms it blocked Taylor Swift searches to ‘prioritize safety’

The move comes after nonconsensual pornographic deepfakes of the singer went viral last week.


X has confirmed it’s preventing users from searching Taylor Swift’s name after pornographic deepfakes of the artist began circulating on the platform this week. Visitors to the site started noticing on Saturday that some searches containing Swift’s name would only return an error message. In a statement to the Wall Street Journal on Saturday night, Joe Benarroch, X’s head of business operations, said, “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue.” This step comes days after the problem first became known.

X’s handling of the issue from the start has drawn criticism that it’s been slow to curb the spread of nonconsensual, sexually explicit images. After the images went viral on Wednesday, Swift’s fans took matters into their own hands to limit their visibility and get them removed, mass-reporting the accounts that shared the images and flooding the hashtags relating to the singer with positive content, NBC News reported earlier this week. Many of the offending accounts were later suspended, but not before they’d been seen in some cases millions of times. The Verge reported on Thursday that one post was viewed more than 45 million times.

In a statement posted on its platform later that day, X said, “Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We’re committed to maintaining a safe and respectful environment for all users.”

But it was still possible to find the images in the days after. 404 Media traced the likely origin of the images to a Telegram group known for creating nonconsensual AI-generated images of women using free tools, including Microsoft Designer. In an interview with NBC News’ Lester Holt on Friday, Microsoft CEO Satya Nadella said the issue highlights the company’s responsibility and “all of the guardrails that we need to place around the technology so that there’s more safe content that’s being produced.” He went on to say that “there’s a lot to be done there, and a lot being done there,” but also noted that the company needs to “move fast.”

Engadget is a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics
