
Apple will not scan iCloud photos for CSAM

Nov 13, 2023 | Hi-network.com

Apple has announced that it is abandoning its plan to scan photos stored in users' iCloud accounts for child sexual abuse material (CSAM). The company had already paused the rollout of the feature in September 2021, following criticism from civil society and expert communities. Apple will instead focus on its Communication Safety feature, announced in August 2021, which lets parents and caregivers opt into protections for children's accounts in Messages. The company is also developing a new capability to detect nudity in videos sent through Messages and plans to extend it to its other communication applications.

Hot Tags: Content policy
