Apple Defends Child Pornography Image Matching Tools


Friday, 13 August 2021 10:25 PM EDT

An Apple executive on Friday defended the company's new image matching system, which can identify known child sexual abuse material on Apple devices, against a backlash from critics who say it violates users' privacy.

''It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,'' Apple Senior Vice President of Software Engineering Craig Federighi said in a Wall Street Journal interview Friday. ''We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.''

The company announced the system as a way to recognize and help limit the spread of child sexual abuse material on its platforms, protecting children from online predators who use communication tools to recruit and exploit them.

Its new child safety features were developed ''in collaboration with child safety experts,'' according to the company.

The software gives parents more information to help their children navigate online communication safely, warning users about ''sensitive content'' while respecting their privacy.

It also uses ''applications of cryptography'' to limit the spread of child sexual abuse material online, notifying law enforcement when such items are detected.

According to the Journal story, the company has come under fire amid privacy concerns from users, who worry that the technology could be used by government entities or other bad actors looking to get around the company’s privacy protections.

Federighi said that would not be the case, and that the company can protect private information from such actors with the system’s ''multiple levels of auditability.''

''We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world,'' Federighi said during the interview.

He said an individual account would have to log 30 or more image matches before the system would take any action or report the account holder to authorities.

''If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images,'' Federighi said.

''This isn’t doing some analysis for, did you have a picture of your child in the bathtub? Or, for that matter, did you have a picture of some pornography of any other sort? This is literally only matching on the exact fingerprints of specific known child pornographic images.''
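Conceptually, the mechanism Federighi describes is a fingerprint lookup against a fixed database of known images, combined with a reporting threshold. The Swift sketch below illustrates only that threshold idea under simplifying assumptions; it is not Apple's actual NeuralHash or threshold secret-sharing protocol, and all names here (Fingerprint, MatchScanner, matchThreshold) are hypothetical.

```swift
import Foundation

// Hypothetical stand-in for a perceptual image hash ("fingerprint").
typealias Fingerprint = String

// Simplified sketch: count matches against a database of known fingerprints
// and disclose anything only once a threshold (Apple cited "on the order
// of 30") is crossed. Below the threshold, nothing is revealed.
struct MatchScanner {
    let knownFingerprints: Set<Fingerprint>  // hashes of known abuse images, supplied by child safety organizations
    let matchThreshold: Int                  // e.g. 30

    // Returns the matching fingerprints only if the threshold is met;
    // otherwise returns nil, revealing nothing about the account.
    func evaluate(userFingerprints: [Fingerprint]) -> [Fingerprint]? {
        let matches = userFingerprints.filter { knownFingerprints.contains($0) }
        return matches.count >= matchThreshold ? matches : nil
    }
}

// Usage: with a threshold of 30, a single match discloses nothing.
let scanner = MatchScanner(knownFingerprints: ["a1", "b2", "c3"],
                           matchThreshold: 30)
if let flagged = scanner.evaluate(userFingerprints: ["a1", "zz"]) {
    print("Threshold met; \(flagged.count) matching images flagged for review")
} else {
    print("Below threshold; no information disclosed")
}
```

Note that this toy version checks exact matches only, mirroring Federighi's point that the system flags fingerprints of specific known images rather than analyzing what any photo depicts; Apple's real design additionally keeps the match count cryptographically hidden from Apple itself until the threshold is crossed.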

According to the Journal, critics say the system could be repurposed to flag other material, such as political files, rather than child pornography. Federighi denied this, saying the recognition software compares images only against databases supplied by multiple child safety organizations, such as the National Center for Missing & Exploited Children, to ensure it is used for this one purpose.
