Apple has announced that it is rolling out a system to detect Child Sexual Abuse Material (CSAM) on the iPhones of its US customers. The new versions of iOS and iPadOS, slated for release later this year, are expected to include "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".
Apple said that before an image is stored in iCloud Photos, the technology will search for matches against known CSAM in a database of images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations. If a match is found, a human reviewer will assess the case and report the user to law enforcement.
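As a rough illustration of the matching idea described above, the sketch below checks a file's hash against a set of known hashes before upload. This is a deliberately simplified, hypothetical example: Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic private set intersection rather than plain SHA-256 of file bytes, and the hash values here are invented placeholders.

```python
import hashlib

# Hypothetical placeholder set of hashes of known material. In Apple's
# described system these would be perceptual hashes supplied by NCMEC and
# matched privately on-device, not plain SHA-256 digests held in code.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_material(path: str) -> bool:
    """Check whether the file's hash appears in the known-hash set."""
    return file_hash(path) in KNOWN_HASHES
```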
However, privacy advocates have raised concerns that the technology could be used by authoritarian governments to spy on their citizens, or be expanded to scan phones for other prohibited content or political speech.