
Biden administration urges action against AI-generated sexual abuse images

May 24, 2024 Hi-network.com

The Biden administration is urging the tech and financial industries to combat the proliferation of AI-generated sexual abuse images, Time reports. Generative AI tools have made it easy to create explicit deepfakes, often targeting women, children, and LGBTQ+ individuals, leaving victims with little recourse. The White House is calling for voluntary cooperation from companies to implement measures against these nonconsensual images, as no federal legislation currently addresses the issue.

Biden's chief science adviser, Arati Prabhakar, noted the rapid increase in such abusive content and the need for companies to take responsibility. A document shared with the Associated Press outlines actions for various stakeholders, including AI developers, financial institutions, cloud providers, and app store gatekeepers, to restrict the monetisation and distribution of explicit images, particularly those involving minors. The administration also stressed the importance of stronger enforcement of terms of service and better mechanisms for victims to remove nonconsensual images online.

Why does it matter?

Last summer, major tech companies committed to AI safeguards, followed by an executive order from Biden to ensure AI development prioritises public safety, including measures to detect AI-generated child abuse imagery. However, high-profile incidents, such as AI-generated deepfakes of Taylor Swift and the rise of such images in schools, reveal an urgent need for action and the potential insufficiency of voluntary commitments from companies. Recently, Forbes reported that AI-generated images of young girls in provocative outfits are spreading on TikTok and Instagram, drawing inappropriate comments from older men and raising concerns about potential exploitation.

Hot Tags: Artificial Intelligence, Content policy, Human rights
