Early in 2021, Apple announced a suite of features meant to protect children from being sexually exploited online. Some are on-device, like parental controls that prevent child accounts from seeing or sending sexual photos in Messages, but the most controversial measure was a system to scan photos as they were uploaded to iCloud.
The system was meant to protect privacy by only comparing unique hashes of images to see if they matched the unique hashes of known CSAM (Child Sexual Abuse Material). Still, it was roundly criticized by privacy advocates as a system that could be exploited, for example, by state actors forcing Apple to find images of dissidents. Some child safety experts also thought the system wasn't robust enough, as it could only match images from a known database and not newly created CSAM.
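To make the hash-matching idea concrete, here is a deliberately simplified Swift sketch. It is not Apple's actual implementation (which relied on a perceptual NeuralHash and on-device private set intersection against a blinded database); the `knownImageHashes` set and `matchesKnownDatabase` function are hypothetical placeholders, and SHA-256 stands in only for illustration.

```swift
import CryptoKit
import Foundation

// Hypothetical database of hashes of known abuse imagery (hex strings).
// Apple's real design used a blinded perceptual-hash (NeuralHash) database
// matched on-device via private set intersection, not a plain set like this.
let knownImageHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Returns true if the photo's hash appears in the known-hash set.
// A perceptual hash would still match lightly edited copies of an image;
// a cryptographic hash like SHA-256 only matches byte-identical files.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hexDigest = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hexDigest)
}
```

The sketch also hints at the two criticisms above: whoever supplies the database decides what gets flagged, and only images already in that database can ever be matched.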
Apple delayed that part of its child safety features, and then last December, confirmed that it had quietly killed the project. Instead, Apple said, the company would focus on safety features that run on-device and protect children from predators, rather than developing a system that scans iCloud images.
Now Apple finds itself defending that decision, reiterating its previous rationale.
A child safety group called Heat Initiative says that it is organizing a campaign to pressure Apple to "detect, report, and remove" child sexual abuse imagery from iCloud. Apple responded to this development in a statement to Wired. The company essentially made the same argument it did last December: CSAM is abhorrent and must be combated, but scanning online photos creates systems that can be abused to violate the privacy of all users.
Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit … It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.
In short, Apple is admitting (again) what the privacy advocate community said when the iCloud CSAM scanning feature was first announced: There's just no way to make it work without also creating systems that can jeopardize the safety and privacy of everyone.
This is just the latest wrinkle in the age-old encryption debate. The only way to fully protect users' privacy is to encrypt data in a way that nobody can "look into" it other than the user or their intended recipient. This protects the innocent and criminals alike, so it is naturally opposed by law enforcement groups, intelligence agencies, and other organizations that each have their own reasons for wanting to search through user data.
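As a rough illustration of what "nobody can look into it" means in practice, here is a minimal end-to-end encryption sketch using Apple's CryptoKit. It is not Apple's iMessage or iCloud protocol, just the general pattern: only the holder of the recipient's private key can recover the plaintext, so the service storing the data cannot scan it.

```swift
import CryptoKit
import Foundation

// The recipient generates a long-term key pair; only the public half is shared.
let recipientKey = Curve25519.KeyAgreement.PrivateKey()

// The sender derives a shared symmetric key and encrypts the message.
let senderKey = Curve25519.KeyAgreement.PrivateKey()
let sharedSecret = try! senderKey.sharedSecretFromKeyAgreement(with: recipientKey.publicKey)
let symmetricKey = sharedSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)
let sealedBox = try! ChaChaPoly.seal(Data("hello".utf8), using: symmetricKey)

// A server storing `sealedBox.combined` sees only ciphertext. Decryption
// requires the recipient's private key to re-derive the same symmetric key.
let recipientSecret = try! recipientKey.sharedSecretFromKeyAgreement(with: senderKey.publicKey)
let recipientSymKey = recipientSecret.hkdfDerivedSymmetricKey(
    using: SHA256.self, salt: Data(), sharedInfo: Data(), outputByteCount: 32)
let plaintext = try! ChaChaPoly.open(sealedBox, using: recipientSymKey)
print(String(decoding: plaintext, as: UTF8.self))  // "hello"
```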
Apple believes that preventing CSAM and other forms of child abuse is critically important, but it must be done in a way that does not give Apple (or other groups) any means to view user data. On-device detection and hiding of nude imagery is one such feature that Apple has been expanding with OS updates over the last couple of years.