Apple has announced that it will launch a variety of child safety features across Messages, Photos, and Siri later this year.
First, the Messages app will gain new notifications that warn children, and their parents, when they send or receive sensitive content. According to Apple, Messages will use on-device machine learning to analyse image attachments; if a photo is determined to be sexually explicit, it will be automatically blurred and the child will be warned.
The Messages app will also notify parents if their children attempt to view a sensitive image. Children will see warnings such as "It's not your fault, but sensitive photos and videos can be used to hurt you", according to a screenshot shared by Apple. Apple says it will not have access to the messages themselves. The feature will be available to family iCloud accounts.
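In broad strokes, what Apple describes amounts to running a classifier on the device and gating display of the image on its output. The sketch below only illustrates that pattern; the threshold and the classifier callable are hypothetical stand-ins, not Apple's model or API.

```python
from typing import Callable

BLUR_THRESHOLD = 0.9  # hypothetical confidence cutoff, not a published Apple value


def handle_incoming_attachment(
    image_bytes: bytes,
    explicit_score: Callable[[bytes], float],
) -> dict:
    """Decide, entirely on the device, whether to blur the image and warn the child.

    'explicit_score' stands in for an on-device ML classifier returning the
    probability that the image is sexually explicit; Apple has not published
    the model Messages will use.
    """
    score = explicit_score(image_bytes)
    if score >= BLUR_THRESHOLD:
        return {"blurred": True, "warn_child": True}
    return {"blurred": False, "warn_child": False}


# Example with a dummy classifier that flags nothing:
result = handle_incoming_attachment(b"...", lambda _: 0.0)
```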
In addition, Apple will introduce new technology in iOS and iPadOS to detect Child Sexual Abuse Material (CSAM), content showing children involved in sexually explicit acts, when it is uploaded to iCloud. Matches will be reported to the National Center for Missing and Exploited Children (NCMEC), a non-profit organisation that works in collaboration with U.S. law enforcement agencies. Rather than scanning photos after they are uploaded to the cloud, Apple says the system performs on-device matching against a database of known images provided by the NCMEC and other organisations. Each image in the database is converted into a hash, which acts as a digital fingerprint, and photos on the device are checked against those hashes.
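As a rough illustration of the general idea only (Apple's actual system uses a proprietary perceptual hashing scheme with additional cryptographic safeguards), matching works by fingerprinting each photo on the device and comparing that fingerprint against a list of known fingerprints. The hash set and file below are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprints of known images, distributed to the device.
# In Apple's system these are perceptual hashes derived from NCMEC-provided
# material, not plain file hashes as used in this sketch.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def fingerprint(photo_path: Path) -> str:
    """Compute a fingerprint for a photo.

    A real system would use a perceptual hash so that resized or re-encoded
    copies still match; a SHA-256 of the raw bytes is used here only to keep
    the illustration simple.
    """
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()


def matches_known_material(photo_path: Path) -> bool:
    """Return True if the photo's fingerprint appears in the known set."""
    return fingerprint(photo_path) in KNOWN_HASHES
```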
Lastly, Apple will bring child safety resources to Siri to help children and parents stay safe online. Apple also plans to update Siri to intervene when someone tries to conduct any CSAM-related searches.
The new features are expected to arrive in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.