Apple to scan US iPhones for images of child abuse
Team Udayavani, Aug 6, 2021, 10:44 AM IST
Washington: Apple unveiled plans to scan US iPhones for images of child abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.
Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The tool Apple calls “neuralMatch” will detect known images of child sexual abuse without decrypting people’s messages. If it finds a match, the image will be reviewed by a human who can notify law enforcement if necessary.
But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.
Matthew Green of Johns Hopkins University, a top cryptography researcher, warned that the system could be used to frame innocent people by sending them harmless but maliciously crafted images designed to register as matches for child pornography, fooling Apple’s algorithm and alerting law enforcement.
“This is a thing that you can do,” said Green. “Researchers have been able to do this pretty easily.”
Tech companies including Microsoft, Google, Facebook and others have for years been sharing “hash lists” of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.
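To make the idea of a “hash list” concrete, here is a minimal illustrative sketch of checking files against a list of known digests. It is not Apple’s method: systems like the one described above rely on perceptual hashes (so visually similar copies still match), whereas this toy example uses exact SHA-256 digests purely to show the lookup step; the file paths and the entries in the hash list are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of known prohibited images.
# In practice such lists are supplied by clearinghouses, not hard-coded.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(paths):
    """Yield files whose digest appears in the known-hash list."""
    for p in paths:
        if image_digest(p) in known_hashes:
            yield p

# Hypothetical usage:
# for hit in flag_matches(Path("uploads").glob("*.jpg")):
#     print("match found:", hit)
```

A real deployment differs mainly in the hashing step: a perceptual hash tolerates resizing, recompression and small edits, which an exact cryptographic digest does not.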
The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data. Coming up with the security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.
Apple believes it pulled off that feat with technology that it developed in consultation with several prominent cryptographers, including Stanford University professor Dan Boneh, an award-winning researcher in the field.
Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressed for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.
“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing & Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”
Julia Cordua, the CEO of Thorn, said that Apple’s technology balances “the need for privacy with digital safety for children.”
Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.