Apple and Google Remove Deepfake Nude Apps


Apple and Google have removed apps from their respective app stores that promised to generate deepfake nudes, as reported by 404 Media.

Concerns Over Deepfake Nudes

Apps that use AI to generate deepfake nudes have raised serious concerns about privacy and exploitation. A recent 404 Media report highlighted how Instagram was hosting ads for such apps, including one featuring a picture of Kim Kardashian alongside the slogan, “Undress any girl for free. Try It.” These apps let users upload real photos of people, particularly women, and create fake nude images of them.

The issue gained renewed attention following Meta’s efforts to strengthen safety measures in Instagram Messenger to prevent the sharing of explicit images, particularly to protect teenage users from potential predators.

Controversy Surrounding AI-Generated Nudes

Deepfake nudes have become a significant challenge amid the rise of generative AI. AI-generated images of Taylor Swift, for instance, circulated widely earlier this year and drew massive numbers of views and reposts before being taken down. AI-generated fake nudes targeting teenage girls at schools in New Jersey, Washington, and California prompted investigations in late 2023.

Several U.S. states have responded by introducing legislation to combat the spread of sexually explicit AI-generated content. In one recent case, teenagers in Florida who used “undress” apps to create fake nude images of their classmates now face serious felony charges.

Legal Actions and Proposed Legislation

There is currently no federal law specifically addressing AI-generated nudes, but U.S. senators introduced a bill in January that would allow victims to take legal action against perpetrators. Senators Richard Durbin and Lindsey Graham emphasized the real-life consequences of distributing sexually explicit deepfakes, noting that victims can lose their jobs and suffer lasting emotional trauma.

The proposed legislation, known as the DEFIANCE Act, aims to provide recourse for individuals harmed by the dissemination of deepfake content, underscoring the need for comprehensive measures against this emerging threat.

About Post Author

Chris Jones

Hey there! 👋 I'm Chris, 34 yo from Toronto (CA), I'm a journalist with a PhD in journalism and mass communication. For 5 years, I worked for some local publications as an envoy and reporter. Today, I work as 'content publisher' for InformOverload. 📰🌐 Passionate about global news, I cover a wide range of topics including technology, business, healthcare, sports, finance, and more. If you want to know more or interact with me, visit my social channels, or send me a message.