CSAM

Timothy R. Primrose, Mobile Forensic Analyst

Apple is working to fight child sexual abuse by scanning photos uploaded to iCloud for CSAM (child sexual abuse material). It is also trying to disrupt child grooming by analyzing incoming and outgoing pictures in the Messages application for sexually explicit imagery. Facebook, Google, and Microsoft already monitor their cloud-based platforms for this form of illegal content.

To scan these photos, Apple will use an encrypted database of known CSAM image hashes provided by participating child safety organizations. An algorithm called NeuralHash will compute a perceptual hash of each photo uploaded to iCloud from an Apple device and search the database for a match, and that match is designed to hold even if the photo has been altered, resized, or cropped.
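
NeuralHash itself is proprietary, but the core idea behind perceptual hashing can be shown with a much simpler scheme. The sketch below (plain Python, with illustrative names and an arbitrary match threshold; none of this is Apple's algorithm) implements a basic average-hash: it condenses an image to a small grid of brightness bits and compares two images by the Hamming distance between their bit strings, which is why a resized or lightly edited copy can still register as a match.

```python
# Simplified perceptual hash (average-hash style), for illustration only.
# NeuralHash is proprietary; this sketch just shows why a perceptual hash,
# unlike a cryptographic hash, can still match an edited copy of a photo.

from statistics import mean

def average_hash(pixels, grid=8):
    """Downscale a grayscale image (a 2D list of 0-255 ints) to a
    grid x grid summary, then emit one bit per cell: 1 if the cell is
    brighter than the mean cell brightness, else 0."""
    h, w = len(pixels), len(pixels[0])
    cell_means = []
    for gy in range(grid):
        for gx in range(grid):
            # Average the block of source pixels that maps to this cell.
            y0, y1 = gy * h // grid, max((gy + 1) * h // grid, gy * h // grid + 1)
            x0, x1 = gx * w // grid, max((gx + 1) * w // grid, gx * w // grid + 1)
            block = [pixels[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            cell_means.append(mean(block))
    avg = mean(cell_means)
    return [1 if c > avg else 0 for c in cell_means]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def is_match(h1, h2, threshold=5):
    """Treat two images as a match if their hashes differ in only a few
    bits. The threshold here is arbitrary; a real system tunes it."""
    return hamming(h1, h2) <= threshold

# Demo: a tiny gradient image and a brightened copy hash to nearly
# identical bit strings, so they still match.
img = [[x * 16 + y for x in range(16)] for y in range(16)]
brighter = [[min(p + 10, 255) for p in row] for row in img]
print(is_match(average_hash(img), average_hash(brighter)))  # -> True
```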

If a photo is a positive match, it is checked against a second, independent CSAM database. If the result is still positive, an Apple reviewer will manually examine the photo. Once the reviewer confirms that the photo is illegal, Apple will disable the account and contact a child safety organization and law enforcement. Apple does not have consent to scan photos stored only on a user's device, just photos uploaded to the cloud; however, parents can choose to allow Apple to analyze pictures sent and received on their child's cellphone.
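
Summarized as a pipeline, this escalation works like a chain of independent gates: both databases must agree before a human ever sees the photo, and only the reviewer's confirmation triggers account action and a report. The sketch below uses hypothetical function and database names to show that flow; set membership stands in for a perceptual-hash match, and it deliberately omits the cryptographic safety vouchers and match threshold that Apple's actual design applies before human review.

```python
# Hypothetical sketch of the escalation flow described above.
# All names here are placeholders, not Apple's actual API.

def review_uploaded_photo(photo_hash, primary_db, secondary_db,
                          human_review, report_account):
    """Escalate a photo only when two independent databases agree, and
    act only after a human reviewer confirms the match."""
    if photo_hash not in primary_db:     # step 1: first CSAM database
        return "no action"
    if photo_hash not in secondary_db:   # step 2: independent second check
        return "no action"               # disagreement stops escalation
    if not human_review(photo_hash):     # step 3: manual review by Apple
        return "no action"               # reviewer dismisses a false positive
    report_account()                     # step 4: disable account and report
    return "reported"

# Example wiring with in-memory sets and stub callbacks:
outcome = review_uploaded_photo(
    photo_hash="abc123",
    primary_db={"abc123"},
    secondary_db={"abc123"},
    human_review=lambda h: True,                       # reviewer confirms
    report_account=lambda: print("account disabled"),  # side-effect stub
)
print(outcome)  # -> reported
```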

The communication safety feature in the Messages application is available when Family Sharing is enabled for users under the age of 13. It warns a child who is about to view or send a sexually explicit picture. The child can still choose to view the picture; if they do, they receive a second warning stating that a parent will be notified and asking whether they still want to proceed.
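
Expressed as logic, this is a two-stage confirmation in which the parental notification is tied to the second stage. A minimal sketch, assuming made-up callback names for the child's choice and the parent alert:

```python
# Hypothetical sketch of the two-stage warning flow in communication
# safety for Messages. The prompt wording and function names are
# illustrative; the real feature runs on-device under Family Sharing.

def handle_explicit_image(child_confirms, notify_parent):
    """Blur the image and warn twice; notify a parent only if the child
    chooses to proceed past the second warning."""
    if not child_confirms("This photo may be sensitive. View it anyway?"):
        return "image stays blurred"
    # The second warning makes the parental notification explicit.
    if not child_confirms("If you view this photo, your parent will be "
                          "notified. Do you still want to proceed?"):
        return "image stays blurred"
    notify_parent()
    return "image shown; parent notified"
```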

If you learn that your child is a victim of grooming or sexual abuse, contact local law enforcement immediately. Preserving data as quickly as possible ensures that law enforcement and forensic investigators can identify and track down the offenders involved.

For a fuller understanding of the process, see the document Apple released outlining its child safety features: https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf
