
Apple’s New Child Abuse Detection System Produces Hash Collisions, and Apple Says It’s “Not a Concern”

Last updated Saturday, October 9, 2021 16:04 ET, Source: NewsService

New technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos

Dallas, TX, United States, 10/09/2021 / SubmitMyPR /

Apple has received backlash recently over its CSAM (Child Sexual Abuse Material) detection technology. Last month, Apple announced an update under which it would start scanning photos in iCloud to identify whether an account holds any images related to child abuse. But the company is now facing criticism for not coming up with accurate algorithms.

The company calls its scanning system NeuralHash. Apple was about to roll out the update officially, but before it could, it ran into pushback. More than a billion Apple users could have received this update. Multiple security experts and researchers claimed that the company's algorithms are flawed and produce defective results.
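To see why such flaws are possible at all, it helps to look at how perceptual hashing works in general. The sketch below is a deliberately simplified Python illustration using a basic "average hash"; it is not Apple's NeuralHash, and all image values are made up.

    # Toy illustration of perceptual hashing (not Apple's actual NeuralHash):
    # an "average hash" reduces an image to a short bit string, and two images
    # are treated as the same picture when their hashes are close in Hamming
    # distance. All names and values here are hypothetical.

    def average_hash(pixels):
        """Hash a grayscale image (list of rows of 0-255 ints) to a bit string."""
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        return "".join("1" if p > mean else "0" for p in flat)

    def hamming_distance(h1, h2):
        """Count the bit positions where two hashes differ."""
        return sum(a != b for a, b in zip(h1, h2))

    # Two slightly different 4x4 "images" produce identical hashes here ...
    img_a = [[10, 12, 200, 210], [11, 13, 205, 208],
             [9, 14, 198, 211], [12, 10, 202, 207]]
    img_b = [[12, 11, 199, 212], [10, 15, 204, 209],
             [11, 12, 197, 210], [13, 9, 203, 206]]
    print(hamming_distance(average_hash(img_a), average_hash(img_b)))  # 0

    # ... but because many different images are squeezed into the same short
    # hash space, two visually unrelated images can also share a hash: a collision.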

Apple stated about its new child abuse detection system:

"iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos."

Apple maintains that the technology does not give the company access to users' photo libraries and that the data involved remains encrypted. The company stated that the system does not hinder users' privacy and that it only scans for known CSAM (Child Sexual Abuse Material).
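In broad terms, the privacy argument rests on comparing photo hashes against a fixed list of known-CSAM hashes, so that only matches are ever surfaced. The sketch below illustrates that idea with hypothetical hash values; it is not Apple's implementation.

    # Minimal sketch of the matching model Apple describes (hypothetical values):
    # each photo's hash is compared against a list of known-CSAM hashes, and
    # only matching photos are ever flagged; everything else is left alone.

    KNOWN_CSAM_HASHES = {"a3f19bc0", "5d02e771"}  # stand-in for the NCMEC-derived list

    def scan_library(photo_hashes):
        """Return only the hashes that match the known list."""
        return [h for h in photo_hashes if h in KNOWN_CSAM_HASHES]

    user_photos = ["77de4a10", "a3f19bc0", "0c4e92bb"]
    print(scan_library(user_photos))  # ['a3f19bc0'] -- non-matching photos are never reported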

However, several security researchers used reverse engineering methods and claimed that the output Apple's CSAM (Child Sexual Abuse Material) system produces is defective and that the hashing algorithm generates incorrect matches.

On the contrary, Apple rejected these concerns and updated its users. Apple claimed that if NeuralHash flags an image as a problem, it is checked again by a secondary system. If a flagged match is an error, it is sorted out before it ever reaches human reviewers.

Apple stated that it takes issues such as collisions seriously and is prepared to sort them out. The company said that not only are its NeuralHash systems properly organized to locate and scan images for child abuse, but a secondary hashing algorithm is also applied as a server-side check.
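A rough sketch of what such a two-stage check could look like is given below; the function names and the idea of passing flagged items through an independent second hash are assumptions based on Apple's description, not its actual code.

    # Conceptual two-stage check (hypothetical code): images flagged by the
    # on-device perceptual match are re-checked with an independent server-side
    # hash, so a collision against the first hash alone is not enough to
    # surface an innocent image.

    def on_device_match(image_hash, known_hashes):
        """First stage: perceptual-hash lookup on the device."""
        return image_hash in known_hashes

    def server_side_recheck(image, secondary_hashes, secondary_hash_fn):
        """Second stage: an independent hash function applied server-side; an
        image crafted to collide with the first hash is unlikely to also collide here."""
        return secondary_hash_fn(image) in secondary_hashes

    def should_escalate(image, image_hash, known_hashes, secondary_hashes, secondary_hash_fn):
        if not on_device_match(image_hash, known_hashes):
            return False  # no match: the photo never leaves the normal flow
        return server_side_recheck(image, secondary_hashes, secondary_hash_fn)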

Apple stated that it is working in collaboration with the NCMEC (National Center for Missing and Exploited Children) and will report any detected material to the NCMEC for further action.

Apple stated the following about the NCMEC and CSAM on its official website:

"New technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and collaborates with law enforcement agencies across the United States."

Rejecting the security researchers' claim, Apple said that even if someone tried to generate a false alert, they would need access to the NCMEC hash databases. That is not easy, and even if someone managed to produce a collision, all they could cause is an alert that triggers review, without any further consequence.
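A minimal sketch of that argument is shown below, assuming a match threshold and a human-review step; the threshold value and names are illustrative, not confirmed Apple parameters.

    # Hedged sketch of why a forged collision has limited effect: matches only
    # escalate once a threshold is crossed, and a human reviewer checks the
    # flagged content before anything is reported. Threshold and names are
    # hypothetical.

    MATCH_THRESHOLD = 30  # illustrative value only

    def evaluate_account(match_count, passes_human_review):
        if match_count < MATCH_THRESHOLD:
            return "no action"
        if not passes_human_review:
            return "dismissed after review"  # e.g. synthetic collisions weeded out
        return "reported to NCMEC"

    print(evaluate_account(match_count=1, passes_human_review=False))   # no action
    print(evaluate_account(match_count=40, passes_human_review=False))  # dismissed after review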

It is important to remember that the Electronic Frontier Foundation recently filed a petition asking Apple to stop scanning users' data and images, citing the flawed results.


References
  • https://www.apple.com/child-safety/
