Some Discord users now need to scan their face or ID to view sensitive material
The company says the age verification feature is “an experiment”

Discord has added a new age verification system for some users that requires them to scan their face or ID to access sensitive material.
The new process, which Discord describes as “an experiment”, is currently active in the UK and Australia, in response to recent decisions by the governments of those countries to restrict children’s access to sensitive material online.
According to Discord, users may be asked to verify their age when encountering content flagged by its sensitive media filter, or when they try to change their settings to allow the viewing of sensitive content.
The user will be given two verification options. The first asks them to scan their face using their webcam or phone camera, following the on-screen instructions to submit their image.
Alternatively, they can use their mobile device to scan a QR code, which takes them to a page where they can take a photo of an ID document (such as their passport or driving licence) and submit this to Discord.
Users will then receive system messages from Discord informing them about the verification process, followed by a DM from the official Discord account telling them what their verified age group is. According to Discord, this automated process “typically takes just a few minutes”.
Discord says age verification is a one-time process, but that if users believe their age group is incorrect – for example, if they’re an adult but a face scan determines that they’re a child – they can try to verify their age again, either by using the same automated process or opting for a manual review.

If users are too young to use Discord in their country and age verification determines that their age is below the minimum requirement, they could be banned from Discord until they submit an appeal through its underage appeals process.
Age verification online has generally meant asking the user to enter their date of birth, but in recent years governments have recognised that this provides little to no protection for children, who can simply enter a fake date of birth to proceed.
Last year Australia became the first country to pass a law banning social media for children under 16. The ban, which is yet to come into effect, requires tech companies to take “reasonable steps” to prevent underage users from gaining access to such sites.
It could be the case, then, that when this law fully comes into effect, Discord will require age verification from Australian users for any access to the platform, not just sensitive media.
In the UK, regulator Ofcom has stated that all websites where pornographic material can be found, including social media platforms, need to add “robust” age-checking techniques such as photo ID or credit card checks by July.
Discord’s new age verification appears to be a direct reaction to the decisions made by Australia and the UK, so if other countries follow suit with similar restrictions, users in those countries can expect to face comparable checks.