News

iOS 15.2 Nudity Detection Will Be On-Device and Opt-In

Parents can enable this feature on their child’s iOS device.

Apple will be pushing ahead with its nudity-detecting child protection feature in the Messages app in iOS 15.2, but parents will have to turn it on.

When Apple first unveiled its child protection features, they met with a sharply critical response, delaying the planned launch. The biggest privacy concern, Apple scanning iCloud Photos for child sexual abuse material (CSAM), is still on hold, but according to Bloomberg, the Messages update is expected to roll out with iOS 15.2. Apple says the feature won't be turned on by default, and that image analysis happens on the device, so Apple itself never has access to potentially sensitive material.

RF Pictures / Getty Images

According to Apple, when enabled, the feature uses on-device machine learning to detect whether photos sent or received in Messages contain explicit material. Potentially explicit incoming images are blurred and the child is warned; the child is likewise warned before sending something that might be explicit.
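Apple has not published a developer API for this feature, so as an illustration only, here is a minimal Swift sketch of what an on-device check of this kind can look like, using Apple's Core ML and Vision frameworks. The ExplicitImageClassifier model, the "explicit" label, and the confidence threshold are hypothetical stand-ins, not Apple's actual implementation.

```swift
import CoreML
import Vision

// Illustrative sketch only. "ExplicitImageClassifier" is a hypothetical
// Core ML image classifier standing in for Apple's unpublished model.
func shouldBlur(_ image: CGImage, completion: @escaping (Bool) -> Void) {
    guard
        let classifier = try? ExplicitImageClassifier(configuration: MLModelConfiguration()),
        let model = try? VNCoreMLModel(for: classifier.model)
    else {
        completion(false)
        return
    }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Treat the image as sensitive only if the top label is
        // "explicit" and the model is reasonably confident.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier == "explicit" && (top?.confidence ?? 0) > 0.8)
    }

    // Everything runs locally; the image never leaves the device.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Because a check like this runs entirely on the device, the photo never needs to be uploaded anywhere, which is the property Apple is pointing to when it says it has no access to the analyzed images.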

In either case, the child also gets the option to contact a parent and explain what is happening. In a list of frequently asked questions, Apple states that for accounts of children 12 and under, the child will be warned that a parent will be contacted if they view or send explicit material. For accounts of children between the ages of 13 and 17, the child is warned of the potential risks, but parents will not be contacted.

Apple

In the same FAQ, Apple insists that none of this information will be shared with outside parties, including Apple, law enforcement, or the National Center for Missing & Exploited Children (NCMEC).

These new child safety options for Messages should be available in the iOS 15.2 update, which is expected to arrive sometime this month, according to Macworld.

