Apple and Google will be asked to block nude photos unless user age is verified
The UK government is preparing to ask Apple and Google to block the taking, sharing, and viewing of nude images on iOS and Android unless the user’s age is verified as 18+. First reported by the Financial Times, the request would push the companies to build nudity-detection algorithms into their operating systems and restrict any nude content behind adult verification, potentially using biometrics or official ID. An announcement is expected in the coming days, and for now it appears to be a request rather than a legal mandate.
Context: A growing policy trend favors making app stores—not individual app developers—responsible for age verification. In the US, the App Store Accountability Act proposes a single, centralized verification process through Apple or Google, which would then age-gate apps and features. While Apple has reportedly lobbied against this approach, some argue it could be the most practical solution to protect minors while reducing user friction.
What’s being proposed in the UK:
- OS-level nudity detection to prevent capturing or sharing genital imagery unless the user is verified as an adult.
- System-wide screening to block any nude content from being displayed unless the user completes age verification (biometrics or official ID); a hypothetical sketch of this gating flow follows the list.
- A voluntary, nonbinding request at the outset, signaling the government’s preferred direction without immediate legal compulsion.
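To make the architecture concrete, here is a minimal and purely hypothetical sketch of the gate the proposal implies. None of these APIs exist: names like isAgeVerifiedAdult and detectNudity are placeholders for whatever verification service and on-device classifier an OS vendor might actually ship.

```swift
import Foundation

// Hypothetical sketch only. No OS currently exposes this gate; every name
// here is a placeholder for systems the UK proposal would require.
enum GateDecision {
    case allow                     // not explicit, or user is a verified adult
    case blockPendingVerification  // explicit content, user not yet verified
}

struct NudityGate {
    // Placeholder: would query an OS-level verification record
    // (biometric or official-ID based, per the proposal).
    var isAgeVerifiedAdult: () -> Bool

    // Placeholder: would call an on-device nudity classifier.
    var detectNudity: (Data) -> Bool

    // Would run before an image is captured, shared, or rendered.
    func evaluate(imageData: Data) -> GateDecision {
        guard detectNudity(imageData) else { return .allow }
        return isAgeVerifiedAdult() ? .allow : .blockPendingVerification
    }
}
```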
Current Apple safeguards: Apple’s Messages Communication Safety already blurs sexually explicit images for children in an iCloud Family group, warns them before viewing, and points them to help resources (the parental-notification step Apple originally announced was dropped before launch). The UK proposal would go far beyond Messages, extending protections to device-wide photo capture, sharing, and viewing.
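On-device detection of this kind already exists in limited form: Apple ships the SensitiveContentAnalysis framework (iOS 17+), which lets apps check images for nudity entirely on-device. A minimal sketch, assuming an app holding the com.apple.developer.sensitivecontentanalysis.client entitlement and a user who has enabled Sensitive Content Warning:

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch of Apple's existing on-device nudity detection (iOS 17+).
// Requires the com.apple.developer.sensitivecontentanalysis.client entitlement;
// analysis is unavailable if the user hasn't enabled Sensitive Content Warning.
func shouldBlurImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The policy reflects the user's own Sensitive Content Warning setting.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis happens entirely on-device; the image never leaves it.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        return false // fail open here; a real app would surface the error
    }
}
```

Notably, this analyzer is gated by the user's own settings, which illustrates both the technical feasibility of on-device screening and the consent question the UK proposal would reopen by making such checks mandatory.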
Why it matters: The plan aims to curb grooming, sextortion, and the spread of CSAM (child sexual abuse material) by making it harder for offenders to solicit or coerce minors into sharing explicit content. However, it also raises major privacy concerns. Monitoring photos and in-app content at the OS level—even if processed on-device—will be seen by many as a step too far, potentially normalizing intrusive screening that could erode user trust.
9to5Mac’s take: This is the most controversial age-verification idea to date. It highlights a real and urgent problem—teens being manipulated into sharing images and then blackmailed, with tragic outcomes—yet also surfaces profound questions about privacy, scope, and the feasibility of universal nudity detection. At minimum, the proposal could catalyze a broader debate on balanced, effective child-safety measures that respect user privacy.
What to watch next:
- Whether Apple and Google publicly support, modify, or reject the request.
- If the UK moves from voluntary guidelines to regulation.
- How any solution handles verification accuracy, inclusivity, and data security while minimizing false positives and preserving privacy.