Taylor Swift no longer searchable on X amid scandal over graphic AI photos

Look what they made X do.

Earlier in the week, graphic AI images of Taylor Swift went viral on the social media platform, and now X users are being met with an error message when searching her name.

After we discovered the issue, X told Page Six, “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue.”

Renderings featuring the 34-year-old billionaire flooded X this past week, showing the “Cruel Summer” songstress in various sexual scenarios at boyfriend Travis Kelce’s Kansas City Chiefs game.

As soon as the images started going viral, the singer’s diehard fans came to her defense, begging people not to share them.

Swift is reportedly “furious,” too, and she’s considering taking legal action, a source told the Daily Mail Thursday.

“Whether or not legal action will be taken is being decided, but there is one thing that is clear: These fake, AI-generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” the insider said.

“The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with,” the source added.

“Legislation needs to be passed to prevent this, and laws must be enacted.”

Swift has not yet publicly addressed the scandal, but her fans have been flooding X with positive messages about her in an attempt to push back against the images, known as "deepfakes."

“people sharing the ai pics are sick and disgusting. protect taylor swift at all costs,” one fan tweeted about the images that show the singer in provocative poses.

“using ai generated pornography of someone is awful and inexcusable. you guys need to be put in jail,” another fan wrote of the person or persons behind the offensive snaps.

“Protect Taylor Swift” trended on the platform Thursday morning.

The White House also spoke out after the scandal, calling for legislation to protect victims of online harassment.

White House press secretary Karine Jean-Pierre called the incident "alarming" and said the Biden administration is working to address the risks posed by AI.

“Of course Congress should take legislative action,” Jean-Pierre said, according to The Verge. “That’s how you deal with some of these issues.”

Finally, SAG-AFTRA released a statement on the situation, writing, “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late.”

The actors' union said the images are "upsetting, harmful, and deeply concerning."

Page Six reached out to Swift’s team, but we did not receive an immediate response. The X account that first shared the AI images has since been made private.