
Apple’s iPhones will include new tools to flag child sexual abuse

Apple said it had designed the new features in a way that protects the privacy of users.

In an undated image provided by Apple, a feature that parents can use to flag when their children send or receive nude photos in text messages. (Apple via The New York Times)


Apple recently unveiled changes to iPhones designed to catch cases of child sexual abuse, a move likely to please parents and the police but one that worries privacy watchdogs.

Later this year, iPhones will begin using image-hashing technology to spot images of child sexual abuse, commonly known as child pornography, that users upload to Apple’s iCloud storage service, the company said. Apple also said it would soon let parents turn on a feature that can flag when their children send or receive nude photos in a text message.

Apple said it had designed the new features in a way that protects the privacy of users. The scanning is done on the child’s device, and notifications are sent only to parents’ devices.


Matthew Green, a cryptography professor at Johns Hopkins University, said Apple’s new features set a dangerous precedent by creating surveillance technology that law enforcement or governments could exploit.

To spot child sexual abuse material, or CSAM, uploaded to iCloud, iPhones will use a technology called image hashing, Apple said. The software boils a photo down to a unique set of numbers, a sort of digital fingerprint.
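
The general idea is simple to demonstrate. The short Python sketch below reduces a photo to a 64-bit fingerprint using a basic “average hash”; Apple has said only that it uses image hashes, so this particular hash function is an illustration of the concept, not the company’s algorithm, which is built to recognize visually similar images rather than exact copies.

```python
# Illustrative only: a simple "average hash" that boils a photo down to
# a 64-bit fingerprint. Apple's production hash is far more
# sophisticated; this sketch just shows what an image hash is.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce a photo to a set of bits, a sort of digital fingerprint."""
    # Shrink to an 8x8 grayscale thumbnail so minor edits barely matter.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: is that pixel brighter than the average?
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > avg)
    return bits
```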

The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a user’s iCloud to see if there is a match.
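
Matching then comes down to comparing fingerprints. In the hypothetical sketch below, the database values are fabricated placeholders, and the tolerance of a few differing bits is an assumption; perceptual hashes are generally designed to match near-identical copies of an image, not just byte-for-byte duplicates.

```python
# Hypothetical matching step: compare one photo's fingerprint against a
# database of fingerprints of known abuse material. The hash values
# below are made-up placeholders, not real data.
KNOWN_HASHES = {0x9F3A6C01D2E45B78, 0x00FF00FF12345678}  # placeholders

def matches_known_image(photo_hash: int, max_distance: int = 4) -> bool:
    """Return True if photo_hash is within a few bits of any known hash."""
    for known in KNOWN_HASHES:
        # XOR leaves a 1 bit wherever the two fingerprints disagree,
        # so counting the 1s gives the Hamming distance between them.
        if bin(photo_hash ^ known).count("1") <= max_distance:
            return True
    return False
```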


Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked.
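
In code, that safeguard amounts to a threshold check before anything is escalated to a person. A minimal sketch follows; the threshold value is made up, since the article gives no figure, and the function is a stand-in for Apple’s internal process.

```python
# Hypothetical threshold logic: nothing is escalated until an account
# crosses some number of matches, and an employee must confirm the
# images before any report is filed.
MATCH_THRESHOLD = 30  # made-up value; the article does not give Apple's number

def next_step(match_count: int, reviewer_confirms: bool) -> str:
    """Decide what happens to an account based on its hash matches."""
    if match_count < MATCH_THRESHOLD:
        return "no action"      # below the threshold, nothing is revealed
    if not reviewer_confirms:
        return "no action"      # an Apple employee must verify the images
    return "report to NCMEC and lock iCloud account"
```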

Apple’s other feature, which scans photos in text messages, will be available only to families with joint Apple iCloud accounts. If parents turn it on, their child’s iPhone will analyze every photo received or sent in a text message to determine if it includes nudity. Nude photos sent to a child will be blurred, and the child will have to choose whether to view them. If children under 13 choose to view or send a nude photo, their parents will be notified.
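
The decision flow the article describes can be summarized in a few lines. In this sketch the function and its inputs are hypothetical stand-ins; the real analysis happens inside iOS, not in application code like this.

```python
# Hypothetical sketch of the on-device Messages flow described above.
def handle_message_photo(contains_nudity: bool, child_age: int,
                         child_chooses_to_view: bool) -> list[str]:
    """Return the actions taken for one photo in a child's text message."""
    actions: list[str] = []
    if not contains_nudity:
        return actions                        # ordinary photos pass through
    actions.append("blur photo")              # nude photos arrive blurred
    if child_chooses_to_view:
        actions.append("reveal photo")
        if child_age < 13:
            actions.append("notify parents")  # only for children under 13
    return actions
```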


This article originally appeared in The New York Times.
