Overview
The internet has become central to communication, commerce, and community. But alongside its benefits come significant risks, especially for children and vulnerable users. In response, the UK has taken a major legislative step with the Online Safety Act (OSA), one of the most comprehensive digital safety frameworks introduced to date.
Officially enacted in October 2023, the OSA is being phased in between 2024 and 2026. The law imposes legally binding obligations on online platforms, search engines, and content-sharing services to assess and reduce harm, particularly in relation to illegal content and access by minors.
The regulator Ofcom has been tasked with overseeing compliance and enforcement, with wide-ranging powers to investigate breaches, impose significant penalties, and even restrict access to non-compliant services.
Core Obligations of the Online Safety Act
The OSA establishes a statutory duty of care for online services, requiring proactive measures to minimise the risk of harm to users. These duties vary based on the size, nature, and reach of each platform, but all regulated services must adhere to essential obligations.
Illegal Content Regulation
All in-scope services must identify, assess, and mitigate the risk of illegal content appearing on their platforms. This includes:
- Terrorism-related content
- Child sexual exploitation and abuse (CSEA) material
- Hate speech and incitement to violence
- Fraudulent or scam content
From March 2025, these risk assessment and mitigation duties became enforceable. Platforms are expected to demonstrate how they detect and remove illegal content, and whether they have appropriate user reporting mechanisms in place.
Failure to act on illegal content may result in heavy penalties or enforcement action.
Protection of Children
Protecting children online is a key focus of the OSA. Platforms likely to be accessed by children must implement:
- Age assurance technologies to prevent underage access
- Content filtering systems to shield minors from harmful material
- Risk assessments focusing specifically on under-18 users
These child-specific protections must be in place by July 2025. The standard will be higher for platforms likely to attract large numbers of young users, such as video-sharing apps, games, and social media platforms.
Importantly, even smaller services with low direct engagement must comply if their content is accessible to UK-based children.
User-to-User and Search Services
The Act applies not only to social media but to a wide range of user-to-user services and search engines. This includes:
- Messaging apps and forums
- Live-streaming platforms
- Gaming environments with user interaction
- Search services that index and display user-generated material
These services must conduct regular safety risk assessments and update their governance processes accordingly. Those operating in the UK or accessible from the UK are within scope, regardless of where the service is based.

What measures should businesses take to be compliant with the Online Safety Act?
Enforcement Powers & Penalties
The enforcement of the Online Safety Act is led by Ofcom, which has been granted new investigatory and fining powers under the Act.
Key enforcement mechanisms include:
- Fines of up to £18 million or 10% of global turnover, whichever is greater (illustrated in the sketch after this list)
- Service restriction orders, which can block UK access to non-compliant platforms
- Information notices and audits, allowing Ofcom to inspect internal risk assessments and moderation systems
- Naming and shaming powers, enabling public announcements about serious breaches
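To make the "whichever is greater" fine cap concrete, here is a minimal illustrative sketch in Python. The turnover figures are hypothetical, and the Act's actual measure is qualifying worldwide revenue as assessed by Ofcom; this simply shows how the two limbs of the cap interact:

```python
def max_osa_fine(global_turnover_gbp: float) -> float:
    """Return the maximum fine Ofcom could impose, in GBP (illustrative only)."""
    FLAT_CAP = 18_000_000                       # £18 million statutory figure
    turnover_cap = 0.10 * global_turnover_gbp   # 10% of global turnover
    return max(FLAT_CAP, turnover_cap)          # whichever is greater applies

# A platform with £500m turnover: 10% (£50m) exceeds £18m, so £50m is the ceiling.
# A £50m-turnover firm: 10% is only £5m, so the £18m flat cap applies instead.
print(f"£{max_osa_fine(500_000_000):,.0f}")  # £50,000,000
print(f"£{max_osa_fine(50_000_000):,.0f}")   # £18,000,000
```

In other words, the flat £18 million figure acts as a floor on the ceiling: for large platforms, the turnover-based limb will almost always be the binding one.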
Ofcom is expected to take a risk-based and proportionate approach, offering guidance and phased enforcement. However, for larger platforms or services dealing with high-risk content, the threshold for compliance will be significantly higher.
Challenges & Criticisms
Burden on Smaller Firms
One of the most debated aspects of the OSA is the compliance burden it imposes, particularly on start-ups, niche platforms, and community forums. Risk assessments, user reporting tools, algorithm adjustments, and content moderation protocols all demand time, expertise, and money.
The government has acknowledged this, introducing a “categorisation” system that sets lighter requirements for smaller and lower-risk services. However, even these providers must still prove they have taken “reasonable and proportionate steps” to reduce online harm.
Legal advice will be crucial in helping such organisations strike the right balance between compliance and operational feasibility.
Encryption and Privacy Concerns
Another point of contention is the potential clash between safety and privacy, especially around end-to-end encryption. The Act includes provisions requiring services to use accredited technologies to detect and remove illegal content, even within encrypted environments.
Critics, including privacy campaigners and tech firms, argue that these obligations could undermine encryption and set dangerous precedents. The government maintains that detection mechanisms must be “technically feasible and proportionate,” and recent guidance suggests enforcement will be delayed until workable solutions are found.
Nonetheless, platforms relying on encryption must stay alert to shifting technical standards, and engage proactively with regulators to avoid non-compliance.
Practical Implications for Businesses & Creators
The Online Safety Act doesn’t just apply to major platforms—it impacts a broad ecosystem of digital service providers, content creators, influencers, and technology businesses.
For Platforms and Tech Companies
- Conduct formal risk assessments across services and user groups
- Implement robust content moderation tools and internal policies
- Build in age verification measures and filtering for younger audiences
- Prepare for Ofcom inspections, data requests, or audit-style reviews
- Document governance procedures and staff training on safety obligations
For Influencers and Content Creators
- Understand platform-specific content policies tied to OSA compliance
- Avoid publishing or linking to material that could trigger illegal content concerns
- Ensure your content is appropriately flagged for age-suitability
- Work with platforms that offer strong compliance tools and transparent reporting mechanisms
For Find Me A Solicitor Legal Advisers
Solicitors advising in this area will be central to helping clients:
- Interpret their legal obligations under the OSA
- Draft compliant terms of use, moderation policies, and user reporting systems
- Address issues around data protection, liability, and cross-border access
- Mitigate regulatory risk while preserving operational flexibility
The Online Safety Act is a landmark piece of digital regulation, and its first phases of implementation are already driving sweeping changes across the digital landscape.
From March 2025 onwards, platforms, content providers, and search engines operating in the UK face significant new obligations, backed by active regulatory enforcement and meaningful penalties. If your business operates in digital media, user engagement, search, or online content creation, now is the time to:
- Complete a compliance audit
- Review content governance and moderation practices
- Update contracts, platform policies, and risk documentation with help from our panel of legal experts
- Engage with Ofcom guidance and sector-specific expectations
Need Help? Find Me A Solicitor can assist
Whether you’re a platform operator, tech founder, influencer, or digital entrepreneur, staying ahead of the Online Safety Act is essential for safeguarding your reputation, operations, and legal standing.
Find Me A Solicitor can connect you with leading media and technology solicitors who understand the nuances of online regulation, platform liability, and content governance. Our network is ready to support you with:
- Risk assessments and compliance strategy
- Safe-by-design systems and policy drafting
- Regulatory engagement and Ofcom responses
- Contractual frameworks for content, users, and third parties
Stay safe, stay compliant—and let us help you do it right.