Apple Inc., the multinational technology company that makes the widely popular iPhone, has revealed that it systematically scans photos uploaded to iCloud from iPhones to detect and report images of child sexual abuse. This could have serious consequences for people whose photos do not actually depict child sexual abuse but are misinterpreted that way.
Speaking at CES 2020, the Consumer Electronics Show in Las Vegas, Apple’s Chief Privacy Officer Jane Horvath said the company uses image-matching technology to find suspected images of child exploitation.
New technology can detect child abuse images
The new technology works much like spam filters in email, using electronic signatures to detect child abuse images.
Apple hasn’t revealed its specific software, but according to the Daily Mail, it may be similar to PhotoDNA, a system developed by Microsoft.
PhotoDNA checks images against a database of previously identified images by using “hashing.” That means it doesn’t see the image itself but rather the data behind the image.
Hany Farid, who helped develop PhotoDNA, said in an opinion piece in Wired magazine that images in encrypted messages “can be checked against known harmful material without Facebook or anyone else being able to decrypt the image. This analysis provides no information about an image’s contents, preserving privacy, unless it is a known image of child sexual abuse.”
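To make the hashing idea more concrete, here is a minimal sketch of how matching a photo against a database of known fingerprints might work. It is only an illustration under simplifying assumptions: PhotoDNA uses a proprietary perceptual hash designed to survive edits such as resizing and re-compression, whereas this sketch substitutes an ordinary SHA-256 hash, and the names known_hashes and check_image are hypothetical, not anything Apple or Microsoft has published.

```python
# Hypothetical sketch of hash-based image matching. SHA-256 stands in for
# PhotoDNA's perceptual hash; names and values here are illustrative only.
import hashlib

# Illustrative database of fingerprints of previously identified images.
# In real systems the database is supplied by clearinghouses; these values are fake.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def check_image(image_bytes: bytes) -> bool:
    """Return True if the photo's fingerprint matches a known image.

    The scanner never inspects the picture itself; it only compares this
    derived fingerprint, so a non-matching photo reveals nothing about
    its contents.
    """
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in known_hashes

# Example: scanning a newly uploaded photo (fake bytes for demonstration).
uploaded_photo = b"\xff\xd8\xff\xe0 fake JPEG data"
if check_image(uploaded_photo):
    print("Match against known material: flag the account for review.")
else:
    print("No match: nothing is learned about this photo.")
```

The sketch shows why this approach is described as privacy-preserving: only the fingerprint is compared, not the image, and unless that fingerprint is already in the database the system learns nothing about the photo.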
Apple uses encryption — a process of converting information or data into a code to prevent unauthorized access — to protect users’ privacy and security. But Horvath said child abuse, like terrorism, was “abhorrent” and would be treated differently.
“Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled,” Horvath said. Of course, such images could also be reported to authorities.
Yes, Apple can invade your privacy to evaluate your phone’s images and can even provide law enforcement with such information under subpoena. That’s because iPhone users, in effect, agree to such a process by accepting the legal terms of iCloud, which state:
“Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may pre-screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.”
Misinterpretation can cause great harm
While this policy admirably aims to protect society from child sexual abuse, it also opens the door to misinterpretation, which can cause great harm to innocent iPhone users.
As with any evidence used by authorities to pinpoint child sex abuse and make arrests, caution must be applied to avoid snaring innocent people in the wide nets of overzealous law enforcement.
Seek an experienced child porn defense lawyer
If you face a charge or a possible charge of child sexual abuse based on inaccurate interpretations of the content on your phone or other electronic device, seek help from an experienced sex crime defense lawyer.
Possessing child pornography or images depicting child sexual abuse is a serious offense in Texas, punishable by steep fines and years in prison.
These punishments can apply in a variety of ways. In fact, when it comes to a minor under 17 years old, even a “selfie” can be child pornography.
Don’t let a mistaken interpretation of your iPhone’s or computer’s images lead to harsh results. Contact the Neal Davis Law Firm today for a legal review of your case in Houston, Harris County, Fort Bend County or Montgomery County.
Yes, children must be protected — but so must the rights of innocent persons who are unjustly accused of sex crimes.