Apple's fight against child abuse prompts data privacy concerns

Apple is launching features to detect child abuse images in photos and messages. But data privacy experts have some concerns about how else the technology might be used.

Apple is implementing new technology to combat the spread of child abuse images, a move that has prompted data privacy concerns.

Apple plans to introduce new child safety features to limit the spread of child sexual abuse material, the company said in an Aug. 5 announcement.

First, Apple will enable its Messages app to use on-device machine learning to warn children, and the parents overseeing their iOS use, about sensitive content, although Apple said private communications will remain unreadable by the company. Second, iOS and iPadOS will use new applications of cryptography to detect collections of known child abuse images in iCloud Photos and alert law enforcement. Third, Siri and Search will "intervene" when a user tries to search for child abuse content.

The new features will arrive later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey, according to Apple. The company did not say whether users can easily opt out of the new features.

Data privacy at risk

Since Apple made the announcement, data privacy experts like Matthew Green, associate professor at the Johns Hopkins Information Security Institute, have sounded alarm bells.

On Twitter and in a New York Times essay, Green acknowledged that the tools are part of an effort to stop the spread of child abuse images but stressed that they could also open the door to greater surveillance of personal data.

Steven Murdoch, professor of security engineering at University College London, condemned the new features, tweeting that there is a "risk that Apple's child-abuse image detection system could be extended to other content, regardless of what Apple would like to happen."

Dan Clarke, president of products and solutions at technology and data privacy compliance company IntraEdge, also sees the tools as a double-edged sword, although his concerns are less about Apple's policing and more about how the tools could be manipulated by others.

He said he believes Apple has managed to strike a "pretty good balance between a noble cause and privacy."

"The way the feature works, as I understand it, is it would only look for known images, ones that had been tagged by a human being to say, 'That's abusive,'" Clarke said. "You don't want to have a million pictures of your grandchild taking a bath innocently that somehow triggers this. So they've tried to address it."

Clarke is also concerned about how the tools themselves might be abused.

"It looks to me like they've put all the right restrictions in place. But what about a bad actor? What about someone else who decides they want to use this technology?" Clarke said. "That's where I think you start to get real concerns that are raised."

On its website, Apple includes documents explaining the effort and the technology used to implement it, including the data privacy and security measures the company is taking. One method of protecting user data privacy is searching only for known child abuse images -- image information provided by child safety organizations such as the National Center for Missing and Exploited Children.
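To make the "known images only" idea concrete, here is a minimal sketch of matching photos against a curated list of fingerprints. It is an illustration, not Apple's implementation: Apple's documents describe a perceptual "NeuralHash" and on-device cryptographic matching, whereas this sketch uses plain SHA-256 digests, and the hash list, directory path and function names are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for known, human-verified abusive
# images. In Apple's described system these fingerprints come from child
# safety organizations such as NCMEC; the entry below is a placeholder.
KNOWN_IMAGE_HASHES = {
    "3f5a...placeholder...",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest of the image file's bytes (illustrative only)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def flag_known_images(photo_dir: Path) -> list[Path]:
    """Return photos whose fingerprints appear in the known-image list.

    New or unknown photos (e.g., innocent family pictures) never match,
    because a match requires an exact entry in the curated list.
    """
    return [
        path for path in photo_dir.glob("*.jpg")
        if fingerprint(path) in KNOWN_IMAGE_HASHES
    ]

if __name__ == "__main__":
    matches = flag_known_images(Path("./photos"))
    print(f"{len(matches)} photo(s) matched the known-image list")
```

The key property this sketch shares with the approach Clarke describes is that nothing is flagged unless it matches an image already identified by a human reviewer; it does not analyze or classify photo content on its own.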

Also this week

  • The Competition and Markets Authority, the U.K.'s competition regulator, provisionally found that Facebook's acquisition of Giphy for about $400 million in 2020 will harm competition and should be unwound. The CMA is now accepting responses from interested parties. Its final report is due Oct. 6.
  • U.S. Senators Richard Blumenthal (D-Conn.), Marsha Blackburn (R-Tenn.) and Amy Klobuchar (D-Minn.) introduced a bill to protect competition and enhance consumer protection in app stores. The Open App Markets Act specifically targets Google and Apple, companies that "have gatekeeper control" of their mobile operating systems and app stores, allowing them to dictate terms of use, according to a press release.
  • Two members of the House introduced a companion bill to the Open App Markets Act. The proposed legislation, presented by Ken Buck, R-Colo., and Hank Johnson, D-Ga., is aimed at changing how app stores owned by big tech companies like Google and Apple operate.
  • A sweeping $3.5 trillion budget resolution passed the Senate on Wednesday with the approval of its Democratic members. It will now go before the House. The budget resolution allots funding to enact President Joe Biden's "Build Back Better" agenda, which includes clean energy and technology investments.

Elsewhere

  • On Monday, India's Supreme Court ruled that Amazon.com Inc. and Walmart's Flipkart will face antitrust investigations ordered against them by the Competition Commission of India last year, according to Reuters. The investigations concern allegations that the companies promoted select sellers on their e-commerce platforms, stifling competition.
  • China plans to draft new laws on monopolies, technology innovation and national security, according to a document published Wednesday by Chinese leadership, Reuters said. China has increasingly cracked down on its own tech giants, including Alibaba and recently Didi, for alleged monopolistic behavior as well as data privacy concerns.

Makenzie Holland is a news writer covering big tech and federal regulation. Prior to joining TechTarget, she was a general reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.

Next Steps

Senators push for more online child privacy protections
