What is the Online Safety Act?
The Online Safety Act is a new UK law that aims to make the internet safer, especially for children.
Since 25 July 2025, online platforms have been subject to strict rules enforced by the UK communications regulator, Ofcom (the Office of Communications).
These rules focus on keeping children away from harmful content like nudity and violence.
Under the Act, platforms must stop children from seeing harmful content, including:
- Suicide or self-harm material
- Eating disorder content
- Pornography
- Misogynistic, violent, or abusive posts
- Dangerous online challenges and stunts
This requires platforms to adjust their recommendation algorithms so that such content does not surface in children’s feeds.
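As a rough illustration of what this could mean in practice (a minimal sketch with hypothetical category labels and data structures, not any platform’s actual system), a feed pipeline might drop posts tagged with restricted categories for accounts known or estimated to be under 18:

```python
# Minimal sketch of age-based feed filtering (hypothetical labels and data model).
from dataclasses import dataclass

# Categories the Act requires platforms to keep out of children's feeds.
RESTRICTED_FOR_MINORS = {
    "suicide_self_harm",
    "eating_disorder",
    "pornography",
    "misogynistic_abuse",
    "dangerous_challenge",
}

@dataclass
class Post:
    post_id: str
    categories: set[str]  # labels assigned by the platform's content classifiers

@dataclass
class User:
    user_id: str
    is_under_18: bool  # from age verification or age estimation

def filter_feed(user: User, candidate_posts: list[Post]) -> list[Post]:
    """Remove restricted content from the feed of a user known or estimated to be under 18."""
    if not user.is_under_18:
        return candidate_posts
    return [
        post for post in candidate_posts
        if not (post.categories & RESTRICTED_FOR_MINORS)
    ]
```

In a real system the category labels would come from the platform’s own content classifiers and moderation tooling; the point here is simply that the feed ranking step has to take the viewer’s age status into account.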
Online platforms are also required to verify users’ ages to identify those under 18, and offer support when children are affected.
Failure to comply can result in fines of up to £18 million or 10% of global revenue, whichever is greater.
Platforms
The law affects technology companies operating in the UK, including social media apps such as Facebook and Instagram, video platforms like YouTube, and even games like Roblox that have in-game chat features.
Age verification
A key measure for enforcing the Act is age verification. Methods include uploading government-issued ID, answering age-related questions, and AI-powered age estimation.
Users may also be asked to submit short videos to verify their identity. These videos can involve specific actions or prompts, such as speaking a pre-determined phrase, showing their face clearly, or holding an ID card up to the camera.
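To show how these signals might be combined, here is a minimal sketch of an age-gate decision (hypothetical field names and threshold, not any provider’s actual API): a user is treated as 18+ only when a verified ID or a sufficiently high AI age estimate plus a passed video check supports it.

```python
# Minimal sketch of an age-gate decision combining verification signals
# (hypothetical thresholds and field names, not any provider's actual API).
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignals:
    id_document_dob_verified: bool                # government-issued ID checked
    estimated_age: Optional[float] = None         # AI age estimation from a photo or video
    video_liveness_passed: Optional[bool] = None  # short video / prompt check

def is_verified_adult(signals: AgeSignals, estimation_threshold: float = 25.0) -> bool:
    """Treat a user as 18+ only when at least one strong signal supports it.

    A conservative estimation threshold (e.g. 25) leaves a buffer for the
    error margin of age-estimation models.
    """
    if signals.id_document_dob_verified:
        return True
    if (
        signals.estimated_age is not None
        and signals.estimated_age >= estimation_threshold
        and signals.video_liveness_passed
    ):
        return True
    return False

# Example: a user whose only signal is an AI estimate of 22 would still be age-gated.
print(is_verified_adult(AgeSignals(False, estimated_age=22.0, video_liveness_passed=True)))  # False
```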