iOS 15.2 Nudity Detection Will Be On-Device and Opt-In
Parents will be able to activate the feature on their kids’ iOS devices
Apple will be pushing ahead with its nudity-detecting child protection feature in the Messages app for iOS 15.2, but parents will have to turn it on.
When Apple first revealed its child protection features, they were met with a fairly critical response, resulting in a delay of the planned roll-out. The biggest privacy concern—Apple scanning iCloud photos for Child Sexual Abuse Material (CSAM)—is still on hold, but according to Bloomberg, the Messages update is slated for release with iOS 15.2. Apple says it won’t be on by default, however, and that image analysis will be happening on-device, so it won’t have access to potentially sensitive materials.
RF Pictures / Getty Images
According to Apple, once enabled, the feature will use on-device machine learning to detect whether photos sent or received in Messages contain explicit material. It will blur potentially explicit incoming images and warn the child, and it will likewise warn them if they're about to send something that might be explicit.
In both cases, the child will also have the option to contact a parent and tell them what's going on. In a list of Frequently Asked Questions, Apple states that for child accounts 12 and under, the child will be warned that a parent will be contacted if they view or send explicit material. For child accounts between the ages of 13 and 17, the child is warned of the potential risk, but parents will not be contacted.
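The age-based behavior described above amounts to a simple policy rule. The sketch below is a hypothetical illustration of that rule only; the function name, action labels, and structure are assumptions for clarity, not Apple's actual implementation.

```python
# Hypothetical sketch of the age-based warning policy Apple's FAQ describes.
# All names here are illustrative assumptions, not Apple's real API.

def message_safety_actions(child_age: int, content_flagged: bool) -> list[str]:
    """Return the actions taken when a child views or sends a flagged image."""
    if not content_flagged:
        return []  # ordinary images pass through untouched
    actions = ["blur_image", "warn_child"]
    if child_age <= 12:
        # Per the FAQ, parents are contacted only for accounts 12 and under
        actions.append("notify_parent")
    return actions

print(message_safety_actions(10, True))   # parent is notified
print(message_safety_actions(15, True))   # child warned, parents not contacted
```

For a 13-to-17 account, the flagged image is still blurred and the child still warned; the only difference is that no parent notification is added.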
In the same FAQ, Apple insists that none of the information will be shared with outside parties, including Apple itself, law enforcement, or the NCMEC (National Center for Missing & Exploited Children).
These new child safety options for Messages should be available in the upcoming iOS 15.2 update, which is expected to roll out sometime this month, according to Macworld.