Joe Benarroch, head of business operations at X (formerly Twitter), has announced a temporary measure to block searches for Taylor Swift on the platform. The decision comes in response to explicit AI-generated images of the singer that went viral earlier in the week, raising concern among US officials and Swift’s fans alike.
When users attempt to search for Taylor Swift on X, they now encounter a message stating, “Something went wrong. Try reloading.” The move aims to prioritize safety in light of the dissemination of fake graphic images earlier in the week.
Swift’s fans, alarmed by the spread of the explicit content, responded by flagging posts and accounts sharing the fake images and flooding the platform with genuine images and videos of the singer under the hashtag #ProtectTaylorSwift.
In an official statement released on Friday, X emphasized its zero-tolerance policy towards non-consensual nudity, stating that such content is strictly prohibited on the platform. The company said its teams were actively removing all identified images and taking appropriate action against the accounts responsible for posting them.
It remains unclear when X initiated the block on searches for Taylor Swift or whether similar measures have been implemented for other public figures or terms in the past.
The situation garnered the attention of the White House, with press secretary Karine Jean-Pierre expressing alarm over the spread of the AI-generated images. She highlighted the need for legislation to address the misuse of AI technology on social media and urged platforms to enforce their rules against such content.
US politicians have also called for new laws to criminalize the creation of deepfakes, which use AI to manipulate someone’s face or body in images or videos. A 2023 study reported a 550% increase in the creation of doctored images since 2019, driven by advancements in AI.
Currently, there are no federal laws in the US against the sharing or creation of deepfake images, although some states are taking steps to address the issue. In the UK, the sharing of deepfake pornography was made illegal under the Online Safety Act in 2023.