- iOS and iPadOS users in the UK now have to verify their age
- Certain features may otherwise be disabled for under-18s
- Meta and Google fined over child safety failings
We seem to be reaching a tipping point when it comes to under-18 phone use: Apple is rolling out mandatory age verification for iPhone and iPad users in the UK, just a day after Meta and Google were hit with huge fines in a landmark social media trial.
Starting with Apple's age verification rollout: it's part of the new iOS 26.4 and iPadOS 26.4 updates for users in the United Kingdom. If you're a UK user, you'll be asked to register a credit card or scan an ID to prove you're 18 or over — unless Apple has previously verified your age.
Apple has the full details here, and says the verification process is “required by law in some countries and regions” to “download apps, change certain settings, or take other actions with your Apple Account”. If you need to verify your account, you will see a message appear in the Settings menu.
Although this kind of device-level age verification is not currently required by UK law, recent legislation does require it for adult websites (including pornographic sites). The onus has so far been on the sites themselves to carry out the verification, but there have been calls for checks to be made at the device level as well.
With the UK government weighing restrictions on social media for under-16s, a law similar to the one implemented in Australia now looks possible. Apple's move is likely to inform any such decision, and according to the BBC the company has been working closely with the regulator Ofcom on this new feature.
It is not clear what will happen if you are under 18 or cannot verify your age. According to Apple's support document, you may see certain features restricted or be asked to join a parent-run Family Sharing group, but the wording suggests it will vary on a case-by-case basis.
Another possible reason for Apple to take this step is the landmark social media case that recently concluded in Los Angeles: Meta and Google were ordered to pay $6m (about £4.5m / AU$8.65m) to a young woman who claimed that Facebook, WhatsApp, and YouTube had seriously harmed her mental health.
The woman’s lawyers described the apps developed by Meta and Google as “addiction machines”, saying that the technology companies had not done enough to prevent young children from accessing these platforms, or to protect them from the harm associated with too much screen time.
In a separate case in New Mexico that reached a verdict earlier this week, Meta was ordered to pay a $375m fine (about £281m / AU$541m) for misleading users about child safety protections in its apps. The judge ruled that Meta was aware of child predators on its platforms and did not do enough to stop them.
Meta and Google both intend to appeal: "The mental health of young people is very complex and cannot be linked to a single application," said a spokesperson for Meta. "We will continue to defend ourselves vigorously as every case is different, and we remain confident in our record of protecting young people online."
And although Apple's age checks have been welcomed by Ofcom and child protection groups, not everyone is happy: some see them as another step towards "mass surveillance", with even more user data being logged and tracked, while others argue that the responsibility for protection should lie with parents rather than device manufacturers.
The momentum seems to be one-way at the moment, however – and with AI bots another problem facing the internet, it's likely that more verification checks will start to appear in the future.