Apple delays launch of feature that would scan photos for child abuse
Apple is delaying a system that would have scanned customers’ photos for signs of child sex abuse after fierce criticism from privacy advocates, who feared it could set the stage for other forms of tracking.
The company had announced the feature in early August, along with other tools meant to protect children and root out illicit pornography, and quickly faced concerns that it would create a backdoor through the company’s highly prized privacy measures.
Apple scrambled to contain the controversy in the following weeks, saying it would tap an independent auditor to oversee the system, but the outcry persisted.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” Apple said in a statement Friday.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Growing scrutiny
The backlash has added to growing scrutiny of Apple in recent months. Earlier this week, the company agreed to change its App Store policies to address criticism that it’s anti-competitive. And employees have become increasingly vocal about problems within the company, including what they say is a lack of pay equity. The US National Labor Relations Board is currently looking into two complaints from workers that originated with concerns about workplace safety and a lack of pay transparency.
Apple had planned a trio of new tools designed to help fight child sex abuse material, or CSAM. They included using the Siri digital assistant for reporting child abuse and accessing resources related to CSAM, as well as a feature in Messages that would scan devices operated by children for incoming or outgoing explicit images.
The third feature was the most controversial: one that would analyse a user’s library in iCloud Photos for explicit images of children. If a customer was found to have such pictures in their library, Apple would be alerted, conduct a human review to verify the contents, and then report the user to law enforcement.
Privacy advocates such as the Electronic Frontier Foundation warned that the technology could be used to track things other than child pornography, opening the door to “broader abuses.” They weren’t assuaged by Apple’s plan to bring in the auditor and fine-tune the system, saying the approach itself can’t help but undermine the encryption that protects users’ privacy.
On Friday, the EFF said it was pleased that Apple was listening to the concerns, but it urged the company to drop the plan altogether.
“These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a turnkey mass surveillance system to spy on citizens,” Executive Director Cindy Cohn said in a statement.
“The enormous coalition that has spoken out will continue to demand that user phones -- both their messages and their photos -- be protected, and that the company maintain its promise to provide real privacy to its users.”
In its attempts to defend the new CSAM feature, Apple coached staff about how to field questions on the topic. It also said that the system would only flag cases where users had about 30 or more potentially illicit pictures.
Protection services
Apple also is far from alone in taking such steps. Facebook has long had algorithms to detect such images uploaded to its social networks, and Google’s YouTube analyses videos on its service for explicit or abusive content involving children. Adobe has similar protections for its online services.
Apple’s CSAM feature would work by assigning a so-called hash key to each of the user’s images and comparing the keys with ones assigned to images within a database of explicit material. Some users have been concerned that they may be implicated for simply storing images of, say, their baby in a bathtub. But a parent’s personal images of their children are unlikely to be in a database of known child pornography, which Apple would cross-reference as part of its system.
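As a rough sketch of that matching-plus-threshold logic (not Apple’s actual implementation), the snippet below hashes each photo and counts hits against a set of known hashes, flagging an account only once the count reaches the roughly 30-image figure Apple cited. The hash function, the placeholder database and the helper names are all illustrative stand-ins: Apple’s published design used a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, whereas an ordinary cryptographic hash matches only byte-identical files, and the comparison was to run on-device behind cryptographic blinding rather than as a plain set lookup.

    import hashlib

    def image_hash(image_bytes: bytes) -> str:
        # Simplified stand-in for a perceptual hash such as NeuralHash;
        # SHA-256 matches only byte-for-byte identical files.
        return hashlib.sha256(image_bytes).hexdigest()

    # Hypothetical database of hashes of known illicit images, as would be
    # supplied by the child-safety organisations mentioned in the article.
    KNOWN_HASHES = {image_hash(b"placeholder for a known image")}

    MATCH_THRESHOLD = 30  # roughly the figure Apple cited

    def count_matches(photo_library: list[bytes]) -> int:
        # Count how many photos in the library match the known-hash set.
        return sum(1 for photo in photo_library
                   if image_hash(photo) in KNOWN_HASHES)

    def should_flag_for_review(photo_library: list[bytes]) -> bool:
        # An account is flagged for human review only once the number of
        # matches crosses the threshold; below it, nothing is reported.
        return count_matches(photo_library) >= MATCH_THRESHOLD

On this simplified model, a parent’s own bathtub photos produce zero matches because they were never in the source database, and even a handful of matches would stay below the threshold that triggers human review.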
Apple also tried to tamp down concerns about governments spying on users or tracking photos that aren’t child pornography. It said its database would be made up of images sourced from multiple child-safety organizations, not just the National Center for Missing & Exploited Children, as was initially announced. The company also plans to use data from groups in regions under different governments, and said the independent auditor will verify the contents of its database.
The feature also affects only photos that customers upload to their iCloud accounts. Apple has said it would refuse any requests from governments to use its technology as a means to spy on customers.
The feature had been slated to go into effect before the end of the year, potentially overshadowing a flurry of Apple product announcements expected in the coming weeks. The company is rolling out updated iPhones, iPads, AirPods and Macs, as well as a new, larger Apple Watch, people familiar with the matter have said.
The EFF’s Cohn said Friday she was looking forward to working with the company “to find ways to fight the scourge of child sexual abuse material online that doesn’t require the sacrifice of real privacy and security for Apple’s customers.”