Apple’s controversial plan to scan user photos and conversations for child sexual abuse material (CSAM) faces renewed criticism after rights groups warned it would “censor protected speech”, threaten privacy and endanger children.

In a letter published on the Center for Democracy and Technology website, a coalition of more than 90 groups from around the world urged Apple CEO Tim Cook to drop plans to introduce the surveillance feature – known as CSAM hash matching – to detect child pornographic imagery stored in iCloud. The letter, published on Thursday, points to the use of “notoriously unreliable” machine learning algorithms to scan for …
This article, “Apple’s new ‘child-safety’ features face fresh challenge over censorship & privacy from over 90 rights groups”, was originally published by Russia Today (RT News). The editorial team at PressBee edited and verified it; it may have been modified, fully republished, or quoted, and updates can be followed at the original source.