Users of X, formerly Twitter, searching for anything related to Taylor Swift are hitting an unexpected roadblock: their searches return no results. The block follows a flood of explicit AI-generated images, surfaced earlier this week, depicting the singer alongside Kansas City Chiefs fans.
Search terms like “Taylor Swift nude” or “Taylor Swift AI” also come up empty, signaling a comprehensive block on related content.
However, more innocuous phrases such as “Taylor Swift singer” still return some results. This sudden blockage follows the White House’s statement addressing the explicit AI images, urging Congress to intervene.
Press Secretary Karine Jean-Pierre expressed the administration’s concern over the dissemination of these false images, emphasizing the need for social media platforms to enforce their rules to curb the spread of misinformation and non-consensual intimate imagery.
The administration has taken proactive measures, including establishing a task force to combat online harassment and abuse and launching a national 24/7 helpline for survivors of image-based sexual abuse.
The circulated images depict Swift adorned in red body paint, posing suggestively alongside Kansas City Chiefs fans.
Responding to the controversy, SAG-AFTRA issued a statement condemning the images as “upsetting, harmful, and deeply concerning.”
The union called for legislative action, backing Congressman Joe Morelle’s Preventing Deepfakes of Intimate Images Act, which would criminalize the creation and dissemination of such fake images without consent.