Google and Bing put nonconsensual deepfake porn at the top of some search results

Google and other search engines include nonconsensual deepfake porn in top image search results alongside tools that advertise the ability to create such material.

With just a click, you can access nonconsensual deepfake pornography on well-known search engines like Microsoft’s Bing and Google.

Deepfake pornography frequently involves inserting a person’s face into an actual pornographic scene. For instance, a famous woman’s face may be “swapped” with an adult star’s, giving the impression that the famous woman is performing a sexual act or is nude.

According to NBC News, the first images returned were deepfake pornographic pictures using female celebrities’ likenesses. Searches combining several women’s names with the word “deepfakes,” as well as broad phrases like “deepfake porn” or “fake nudes,” surfaced this material on Google and other popular search engines. Safe-search features were turned off during the searches.

NBC News searched Google and Bing for 36 well-known female celebrities, pairing each name with the term “deepfakes.” The top Google results for 34 of those searches, and the top Bing results for 35, contained links to deepfake videos and nonconsensual deepfake images. Links to one well-known deepfake website, or to a rival site, accounted for more than half of the top results. That well-known deepfake website has built a market for nonconsensual deepfake pornography of celebrities and private individuals.

The first six results of a Google search for “fake nudes” included links to several apps and programs for creating and viewing nonconsensual deepfake porn. These were followed by six articles about high school boys allegedly using the technology to create and share deepfake nude images of their female classmates. A search for “fake nudes” on Bing produced dozens of results featuring nonconsensual deepfake tools and websites before surfacing an article about the dangers of the phenomenon.

Copilot, Microsoft’s AI chatbot that appears as an option in Bing search results, tells users that it cannot display deepfake porn. “The use of deepfakes is unethical and can have serious consequences,” Copilot states. However, a single click on Bing itself returns dozens of links to, and examples of, nonconsensual deepfake porn.

The results highlight the increasing prevalence of nonconsensual deepfake pornography and raise questions about what tech companies are doing to curb its spread, even as they invest heavily in AI projects such as Google’s Colab and Bard and the products of OpenAI, in which Microsoft holds a significant stake. Experts say that because Google and other search engines do not actively monitor for this abuse, they are valuable resources for anyone looking to launch a deepfake harassment campaign.

Although Google stated that its search features, such as panels with curated information, do not permit manipulated media or sexually explicit content, its core web results have no policies regarding AI-generated content. The Google Play app store prohibits “apps determined to promote or perpetuate demonstrably misleading or deceptive imagery, videos, and/or text.” Nevertheless, an app that previously promoted the creation of pornographic deepfakes remains available in the Google Play Store.

Although deepfake victims can use a form to request that such content be removed from search results, Google does not actively look for or delist deepfakes. “We only review the URLs that you or your authorized representative submit in the form,” the takedown request page states.

“We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search,” a Google spokesperson said in a statement. “Google indexes content from the internet, just like any other search engine does. However, we deliberately design our ranking systems to prevent users from being startled by unexpectedly offensive or explicit content that they aren’t looking for.”

The statement went on, “As this space develops, we’re working to implement more comprehensive safeguards, with a special emphasis on eliminating the need for known victims to submit individual content removal requests.”

Although Google is the most popular search engine, others, such as Microsoft’s Bing and the independent search engine DuckDuckGo, also display nonconsensual deepfake images of women in their search results. Fake nude pictures of former teen Disney Channel actresses appear among the top Bing image search results. Some of the images use pictures of the actors’ faces that seem to have been taken before they turned 18. (Reverse image searches of some of the faces surfaced copies uploaded to the internet before the actors turned 18.)

A Microsoft representative provided a link to a form that allows victims of nonconsensual intimate imagery (NCII) to report instances of it showing up in Bing search results. Microsoft made clear in August 2023 that its NCII policy applies to sexually explicit deepfakes.

“The distribution of non-consensual intimate imagery (NCII) is a gross violation of personal privacy and dignity with devastating effects for victims,” a Microsoft spokesperson said in a statement. Microsoft forbids NCII on its platforms and services, including the solicitation of NCII and the promotion of the creation or dissemination of intimate images without the victim’s consent.
