On Monday, Apple defended its new system that will scan iCloud Photos for illegal child sexual abuse material, or CSAM, amid controversy over whether the system reduces user privacy and could be used by governments to surveil citizens.
Apple reiterated that its system is more private than those used by companies such as Google and Microsoft because it relies both on Apple's servers and on software installed on users' iPhones through an iOS update.
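The mechanism Apple describes, matching hashes of photos on the device against a list of known CSAM hashes before upload, can be sketched roughly as follows. This is an illustrative toy only: Apple's real system uses a perceptual hash (NeuralHash) and cryptographic techniques such as private set intersection, not a plain cryptographic hash, and all names and sample values here are hypothetical.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Hash the image content on the device (toy stand-in for a
    perceptual hash; identical bytes yield an identical hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hash list shipped to the device with the OS update
# (placeholder sample, not real data).
KNOWN_FLAGGED_HASHES = {image_hash(b"known flagged sample")}

def should_flag_upload(image_bytes: bytes) -> bool:
    """Flag an upload only if its hash appears in the known list;
    any other photo produces no match and is never flagged."""
    return image_hash(image_bytes) in KNOWN_FLAGGED_HASHES

print(should_flag_upload(b"known flagged sample"))     # True
print(should_flag_upload(b"harmless vacation photo"))  # False
```

Because only hashes already on the list can match, this design is what underlies Apple's claim that the system detects only known images rather than scanning photo content generally.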
Addressing fears that governments could force it to expand the system beyond child-safety images, Apple said in a document published Monday: “Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”
It continued: “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”
Apple CEO Tim Cook has previously said that the company follows laws in every country where it conducts business.
Companies in the U.S. are required to report CSAM to the National Center for Missing & Exploited Children and face fines of up to $300,000 if they discover illegal images and fail to report them. Even so, critics remained unconvinced.
“It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours,” technology commentator Ben Thompson wrote in a newsletter on Monday.