
By Gavin Boyle
As Apple continues its work to protect children from inappropriate content, the iPhone maker has introduced a setting that automatically pauses a FaceTime video when it detects nudity during a call.
“Detect nude photos and videos before they are viewed on your device, and receive guidance to help make a safe choice,” the description of the setting reads, per Engadget. “Apple does not have access to photos or videos.”
In iOS 26 FaceTime will pause the Video if you’re undressing while on a FaceTime call here’s the on screen prompt warning that you get asking if you would like to resume audio and video or End the call.👇 pic.twitter.com/fBs0aKUPCy
— iDeviceHelp (@iDeviceHelpus) July 2, 2025
The setting is on by default for child accounts and can be toggled on or off on any device through FaceTime settings under the "Sensitive Content Warning" tab. It is only available to users running iOS 26 or later.
“If your child receives or attempts to send photos or videos that might contain nudity, Communication Safety warns them, gives them options to stay safe, and provides helpful resources,” Apple explains on its website. “Communication Safety uses on-device learning to analyze photo and video attachments and determine if a photo or video appears to contain nudity. Because the photos and videos are analyzed on your child’s device, Apple doesn’t receive an indication that nudity was detected and doesn’t get access to the photos or videos as a result.”
This new setting comes as Apple becomes more focused on protecting children and sensitive users from inappropriate content. In April 2022, the company began blurring nudity found in kids' Messages apps. This new setting for FaceTime, and further expansions of it, are likely a response to the rise of self-generated child sexual abuse material, as predators have increasingly targeted children online to exploit them sexually.
Related: Apple Launches Safety Feature That Blurs iMessage Nudity
A report from 2020 found that the amount of child sexual abuse material shared by children aged 9-12 more than doubled during the pandemic. The National Center on Sexual Exploitation (NCOSE) placed Apple on its Dirty Dozen list in 2023 and 2024 because the company continued to enable this predatory behavior.
NCOSE maintains that Apple continues to enable sexual exploitation by not doing enough to protect minors and also allowing young users to download inappropriate apps that increase their risk of being abused.
Meanwhile, other companies have made their platforms safer for kids, such as Google, which introduced a nudity blur on its search engine in 2023. The feature is automatically applied to users under the age of 18 and cannot be changed without the approval of a parent or administrator.
While child sexual abuse material continues to be a major problem across today's tech platforms, it is encouraging that the companies in charge keep introducing new features to help stem the issue.
Read Next: Google Introduces Blur Feature That Blocks Explicit Images from Search Results
Questions or comments? Please write to us here.