Britain's stringent crackdown on online pornography has resulted in a substantial financial penalty for one company, with Kick Online Entertainment SA fined £800,000 for lacking robust age verification measures. An investigation by the communications regulator Ofcom revealed that the company failed to comply with mandatory age-check requirements between July 25 and December 29 last year.
Regulatory Action and Additional Penalties
Ofcom noted that Kick has since introduced an age-verification method described as 'capable of being highly effective.' However, the regulator also imposed an additional fine of £30,000 on the company for failing to respond accurately, completely, and promptly to information requests. Furthermore, Ofcom will levy a daily penalty of £200 on Kick until it provides the required responses or for up to 60 days, whichever occurs first.
Enforcement Stance
Suzanne Cater, director of enforcement at Ofcom, emphasised the non-negotiable nature of effective age checks on adult websites to protect children from pornographic content. 'Any company that fails to meet this duty – or engage with us – can expect to face robust enforcement action, including significant fines,' she stated. Ofcom continues to investigate other sites under the UK's age verification rules and will take further action where necessary.
Online Safety Act Requirements
Since July 25, the Online Safety Act has mandated that online platform operators prevent children from accessing 'harmful content,' which includes explicit material like pornography, as well as content promoting self-harm, suicide, dangerous challenges, serious violence, or inciting hatred. Platforms found in breach of the act face potential punishments, including fines of up to £18 million or 10% of global turnover, whichever is greater, and in extreme cases, being blocked from operating in the UK.
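The two-part fine cap can be confusing at first glance; as an illustrative sketch (the turnover figures below are hypothetical, not drawn from any company named in this article), the ceiling is simply the greater of the fixed £18 million sum and 10% of global turnover:

```python
def max_osa_fine(global_turnover_gbp: float) -> float:
    """Statutory fine ceiling under the Online Safety Act:
    the greater of a flat GBP 18 million or 10% of global turnover."""
    return max(18_000_000, 0.10 * global_turnover_gbp)

# A firm turning over GBP 50m: 10% is GBP 5m, so the GBP 18m floor applies.
print(max_osa_fine(50_000_000))     # 18000000
# A large platform turning over GBP 1bn: 10% is GBP 100m, exceeding GBP 18m.
print(max_osa_fine(1_000_000_000))  # 100000000.0
```

In practice the cap scales with company size: for any operator with global turnover above £180 million, the percentage-based limit is the binding one.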
Age Verification Methods
Pornography providers have seven approved options to verify that visitors are over 18:
- Photo-ID matching
- Facial age estimation
- Mobile-network operator (MNO) age checks
- Credit card checks
- Email-based age estimation
- Digital identity services
- Open banking
Rising Concerns and Industry Response
The legislation was introduced in response to growing concerns about young children accessing disturbing or harmful content online. A study by the charity Internet Matters last year found that 70% of children aged nine to 13 reported exposure to harmful content online, including hate speech, misinformation, and violent material. Ofcom research also indicated that 8% of UK children aged eight to 14 visit a porn site at least monthly.
Following the regulatory crackdown, Pornhub, owned by Cyprus-based Aylo, implemented restrictions on new UK users this month: since February 2, the site has blocked new British users who have not previously verified their age. Aylo has criticised the Online Safety Act, arguing that it diverts traffic to 'darker, unregulated corners of the internet' and fails to achieve its goal of protecting minors. 'Despite the clear intent of the law to restrict minors' access to adult content... our experience strongly suggests that the OSA has failed to achieve that objective,' the company stated.
Overview of the Online Safety Act
The Online Safety Act 2023 is a comprehensive set of laws designed to protect children and adults online. It imposes new duties on social media companies and search services, making them more accountable for user safety. Key provisions include:
- Requiring providers to implement systems to reduce risks of illegal activity and remove illegal content.
- Mandating that platforms prevent children from accessing harmful and age-inappropriate content.
- Providing clear reporting mechanisms for problems online.
- Ensuring major platforms are transparent about allowed content and give users more control over what they see.
This regulatory framework aims to create a safer digital environment, with the strongest protections focused on safeguarding children from online harms.