
Why Apple's New Feature to Detect Child Abuse is Problematic

Apple said it is using a hashing tool that is designed to analyse an image and map it into a set of unique numbers.


Apple faced backlash from its users after the company on Friday, 6 August announced its decision to roll out a new feature for checking photos for child sexual abuse material (CSAM) on iPhones.

The US-based tech giant took the decision to curb the distribution of child sexual abuse imagery, days after tech firms reported the existence of as many as 45 million photos and videos that constituted such material.

While the move to detect child sexual abuse material is a step in the right direction, scanning users' iCloud storage would be a huge setback for personal privacy.

How Does it Work?

Apple said it is using a hashing tool, called NeuralHash, that is designed to analyse an image and map it to a set of unique numbers called 'hashes'.

The software then matches these hashes against a database of known child sexual abuse image hashes provided by the National Center for Missing and Exploited Children (NCMEC).

If a match is found, the image is sent for manual review; upon confirmation, the user's account will be blocked and NCMEC will be notified.
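
To make that flow concrete, here is a minimal Python sketch of hash-based image matching. NeuralHash itself is proprietary, so this uses a simple 'average hash' (a basic perceptual hash) as a stand-in; the function names and the sample hash database below are hypothetical, not Apple's API.

```python
# A minimal sketch of hash-based image matching, assuming a simple
# "average hash" as a stand-in for Apple's proprietary NeuralHash.
# All names and the sample database below are hypothetical.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a tiny grayscale image and hash each pixel
    against the mean, yielding a 64-bit perceptual fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

# Hypothetical set of known-image hashes; in Apple's system these come
# from NCMEC, not from anything the device operator can edit.
known_hashes = {0x0F0F0F0F0F0F0F0F}

def is_flagged(path: str) -> bool:
    """Return True if the image's hash matches a known hash."""
    return average_hash(path) in known_hashes

if is_flagged("photo.jpg"):
    print("Match found: queue for manual review")
```

A perceptual hash like this tolerates small edits such as resizing or recompression while still matching the same underlying image, which is why systems of this kind use it instead of an exact checksum.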

Why is it Alarming?

Apple's new software to detect CSAM could open the door to new forms of widespread surveillance and provide a way around encrypted communications.

Several Apple users on Twitter expressed concerns that their phones may contain personal pictures that the algorithm could wrongly flag as CSAM imagery.

Concerns have also been raised about whether the new feature will break end-to-end encryption for iMessage users.

The Center for Democracy and Technology, in a statement, called the new update an erosion of privacy: “The mechanism that will enable Apple to scan images in iMessages is not an alternative to a backdoor — it is a backdoor. Client-side scanning on one ‘end’ of the communication breaks the security of the transmission, and informing a third-party (the parent) about the content of the communication undermines its privacy."
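
A rough sketch can illustrate the point about where the scan sits. In a hypothetical end-to-end encrypted pipeline, client-side scanning inspects the plaintext before encryption, so the message stays encrypted in transit even though its content has already been examined on one 'end'. All names here are illustrative, not Apple's or any messenger's actual code.

```python
# Illustrative sketch only: why critics say client-side scanning breaks
# end-to-end encryption. The scanner and reporting step are hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in E2E messaging, only the endpoints hold this
cipher = Fernet(key)

def scan_for_flagged_content(plaintext: bytes) -> bool:
    """Hypothetical on-device scanner, standing in for hash matching."""
    return b"flagged" in plaintext   # toy check for illustration

def send_message(plaintext: bytes) -> bytes:
    # The inspection happens on the plaintext, before encryption: the
    # message is still encrypted in transit, but its content has already
    # been examined on the sender's "end" and could be reported elsewhere.
    if scan_for_flagged_content(plaintext):
        print("content flagged before encryption")  # hypothetical report step
    return cipher.encrypt(plaintext)

token = send_message(b"hello")   # ciphertext leaves the device as usual
```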

It is important to note that critics aren't against Apple's mission to fight CSAM, but are wary of the tools it is using to do so.

"Scanning capabilities similar to Apple’s tools could eventually be repurposed to make its algorithms hunt for other kinds of images or text—which would basically mean a workaround for encrypted communications, one designed to police private interactions and personal content."
Electronic Frontier Foundation (EFF), a digital rights group

Meanwhile, Apple confirmed to MacRumors on Friday that it plans to expand the features on a country-by-country basis.


'Wrong Approach': WhatsApp on Apple's Plan

WhatsApp head Will Cathcart explained why the company won't adopt Apple's system.

"I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world,” tweeted WhatsApp head Cathcart.

Cathcart also raised concerns about Apple's decision to scan all the private photos on a phone instead of adding features to report such content.

“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” Cathcart added.

What Facebook Thinks

Earlier, in April, Apple released its most controversial iPhone update, iOS 14.5, which requires apps to get users' permission before tracking their activity across other apps.

Responding to Apple's move, Facebook said in April that the new system would make it harder and more expensive for advertising networks to target customers, hurting small businesses that rely on targeted ad campaigns.

Interestingly, Facebook had also published full-page ads protesting against Apple's new privacy policy.

Therefore, it's not entirely surprising that Facebook-owned WhatsApp has come out so emphatically against Apple's new move.
