
Utah AG calls on search engines, payment platforms to do more to fight deepfake pornography
Utah Attorney General Derek Brown joined a bipartisan effort this week to combat a growing online threat — deepfake pornography.
Brown, along with attorneys general from 47 other states, penned two letters on Tuesday asking internet search engines and payment platforms to do more to stop what has become a pervasive and humiliating trend.
Also called “computer-generated deepfake nonconsensual intimate imagery,” deepfake pornographic images or videos often manipulate a person’s real picture, sometimes using artificial intelligence, to depict the person in an intimate setting.

It’s a growing problem, Brown’s office said, that can embarrass, intimidate and exploit people, the majority of them women. Celebrities and regular people alike have become victims, including teenagers in several high-profile cases, who say the fake images have devastating impacts on their mental health, job and college prospects and physical safety.
According to data cited by the Attorney General’s Office, 98 percent of deepfake videos online contain pornographic imagery. The 10 most popular websites dedicated to deepfake pornography have generated more than 300 million views, the office said.
“As this technology becomes more powerful and creates more potential for harm to the public, businesses that help people search for, create, and distribute this content need to be aware of their role in propagating this content and work to prevent its spread,” reads the letter, addressed to Google Search, Microsoft Bing and Yahoo! Search.
Just as search engines limit access to content related to suicide, terrorism and other criminal activity — such as searches for “how to build a bomb” — Brown’s office said they can block content related to deepfake pornography. That includes now-common searches like “how to make deepfake pornography,” “undress apps,” “nudify apps” or “deepfake porn.”
In a separate letter to Visa, Mastercard, American Express, PayPal, Google Pay and Apple Pay, the attorneys general urged the payment platforms to do more to identify and remove payment authorization for transactions related to deepfake pornography.
“The same principles that have led payment processors to withdraw their services from sellers who engaged in other harmful activities should also apply to sellers that are distributing deepfake (nonconsensual intimate imagery) tools and content,” the letter reads.
Utah lawmakers in 2024 passed SB66, a bill that added “generated” images to the definition of a counterfeit intimate image. Now, anyone who unlawfully distributes a counterfeit intimate image can be charged with a class A misdemeanor, punishable by up to one year in jail, or a third-degree felony if they are a repeat offender.