Support for Apple’s plan
“It is utterly appalling to know that the sexual abuse of children is incited, organized, and celebrated online. Child abusers share photos and videos of their abhorrent crimes, as well as luring children they find online into sending indecent images of themselves,” Home Secretary Priti Patel said in a press release. “It is devastating for those it hurts and happens on a vast and growing scale. Last year, global technology companies identified and reported 21 million instances of child sexual abuse.”
She added that Apple says its child sexual abuse filtering technology has a false positive rate of one in a trillion, meaning the privacy of legitimate users is protected whilst those building huge collections of extreme child sexual abuse material are caught out. Patel wants Apple to continue the project. What’s more, the British government is offering to pay anyone who can find a way “to keep children safe in environments such as online messaging platforms with end-to-end encryption.”
Opposition to Apple’s plan
Apple has announced that it will delay its controversial CSAM detection system and child safety features. But there are still calls for the plan to be abandoned entirely.
In response to Apple’s plan to add surveillance features that would scan photos and messages, a group of civil and human rights organizations has delivered petitions with more than 59,796 signatures to the company today.
The petitions call on Apple to abandon its plan, “which goes against the company’s purported commitment to privacy and security, and its history of rejecting backdoors to access content on our phones.” Despite Apple’s announcement that it would postpone the rollout of the scanning features, the civil rights organizations say they will continue to oppose the plan until the company abandons it entirely, arguing that there is no safe way to conduct on-device content scanning.