The UK's Online Safety Act introduces a comprehensive legal framework to improve online safety and reduce harm to both children and adults. Amid growing concerns over illegal content, harmful material, and misinformation, the Act imposes stricter compliance obligations on online service providers, including social media platforms, search engines, video-sharing services, and cloud storage providers.
Enforced by Ofcom, the Act carries severe penalties for non-compliance, including fines of up to £18 million or 10% of global revenue, whichever is higher. Senior executives of companies that fail to meet their obligations could also face criminal liability. With implementation deadlines fast approaching, service providers must act now to avoid regulatory action.
Key Protections Under the Act
The Act establishes several new duties for online platforms, including:
Protection for Children:
- Platforms must introduce age assurance mechanisms to prevent children from accessing harmful and age-inappropriate content.
- Services hosting pornographic content must implement age verification measures by January 2025.
- By April 2025, platforms must assess whether their services are likely to be accessed by children and take appropriate steps to ensure child safety.
Safety for Adults:
- Major platforms must be transparent about the types of potentially harmful content they allow.
- Users must be given more control over their content preferences, including the ability to filter out harmful material.
Illegal Content Removal:
- Online services must introduce systems to prevent illegal activities on their platforms and ensure swift removal of illegal content.
- As of 17 March 2025, platforms must demonstrate compliance with illegal content removal duties.
Who Must Comply?
The Act applies to a wide range of online services, including:
- Social media platforms
- Messaging apps
- Online forums
- Video-sharing services
- Cloud storage providers
Note: These obligations extend to companies based outside the UK if their services are accessible to UK users and pose a material risk of harm.
Enforcement and Oversight by Ofcom
Ofcom, the UK’s communications regulator, is responsible for:
- Developing codes of practice to guide compliance.
- Conducting risk assessments on online services.
- Enforcing compliance through penalties and investigations.
Non-compliance can result in:
- Fines up to £18 million or 10% of global revenue (whichever is higher).
- Criminal liability for senior executives if they fail to meet legal obligations.
Online Safety Act UK: Do’s and Don’ts for Users & Service Providers
For Online Providers
Do:
- Conduct comprehensive risk assessments – Identify potential risks related to illegal and harmful content and implement measures to mitigate them.
- Implement age verification measures – Ensure age assurance systems prevent underage access to harmful content.
- Be transparent – Clearly communicate content moderation policies and provide accessible user reporting tools.
- Cooperate with Ofcom – Respond promptly to requests for information and demonstrate compliance.
Don’t:
- Ignore illegal content – Failure to remove or prevent the spread of illegal material will lead to severe penalties.
- Overlook user safety controls – Platforms must provide tools allowing users to filter and control their online experience.
- Delay compliance efforts – With enforcement already underway, platforms that fail to act now will face regulatory action.
For Users
Do:
- Use reporting tools – If you encounter harmful or illegal content, report it to the platform.
- Engage parental controls – Parents should use available tools to limit children’s exposure to harmful material.
- Stay informed – Regularly check platform policies and keep your privacy settings up to date.
Don’t:
- Ignore age restrictions – Age guidelines exist to prevent exposure to inappropriate content.
- Share personal information carelessly – Be cautious about what you post online to avoid identity theft or harassment.
Key Implementation Dates
| Date | Milestone |
| --- | --- |
| 31 January 2024 | New criminal offences (e.g. cyberflashing, intimate image abuse, and sending false information with intent to cause harm) became enforceable. |
| January 2025 | Platforms hosting pornographic content must introduce age verification checks. |
| 17 March 2025 | Platforms must comply with illegal content removal duties. |
| April 2025 | All platforms must assess whether their services are likely to be accessed by children and implement safety measures. |
| Summer 2025 | Ofcom will publish a register of Category 1 services (large platforms subject to stricter transparency and accountability rules). |
Steps to Implement the Act
The Online Safety Act is a landmark piece of legislation that redefines digital safety in the UK. The strict new obligations on online platforms aim to reduce harm, protect children, and provide greater transparency for all users.
With enforcement already underway, organisations need to assess their compliance status and take necessary steps to meet regulatory obligations.
- For Service Providers: Ensure your organisation is fully compliant with age verification, illegal content removal, and user safety controls before the 2025 deadlines.
- For Businesses Using Online Platforms: Review agreements with third-party services to ensure they comply with the Act.
- For Individuals: Stay informed about platform safety features and make use of content control tools where available.
Written by
Lynsey Hanson | Global Data Protection Officer
lynsey.hanson@tenintel.com