The recent enforcement of the Online Safety Act, specifically the age/ID verification process to access certain content online, has been controversial, with varying results to date.
At Basis Yorkshire, we work with sex workers, as well as adults and young people (under 18) who have experienced sexual exploitation. We work to end stigma, create safety and promote empowerment for these groups of people.
What is it?
The Online Safety Act 2023 (OSA) aims to protect children and adults online, putting more responsibility on social media companies and search services through legal accountability. Part of the act concerns child safety: as of 25th July 2025, platforms are required to use age verification systems to prevent children from accessing pornography or content that encourages self-harm, suicide or eating disorders, as well as other harmful content, such as hateful material or content promoting the ingestion of dangerous substances.
Ofcom can impose hefty fines on companies that do not comply – up to £18m or 10% of global turnover – or even apply to the courts to block their services in the UK.
Effect on Sex Workers
The effects of the OSA on the sex work community have been immediate. Workers across all online-facilitated types of work have reported a drop in clients since the ban, and a corresponding loss of income. In the run-up to the implementation of age verification, one sex worker shared, “it’s pushing all my friends into overworking. They’re scared they’ll stop getting clients and all their income will vanish. People are seeing back-to-back clients, not screening properly. It’s unsafe but they feel like they have no choice.” Workers have also shared the impact of other platform policies on their work, including shadowbanning, account deletion and censorship, which have combined with the OSA to disastrous effect.
When sex workers can no longer earn the income they need from online work or using online platforms, they become more likely to turn to in-person and/or managed forms of work, placing them at greater risk of violence and exploitation. As Maedb Joy writes, “when sex workers are banned from advertising services online, we’re pushed into selling sex on premises – in brothels, strip clubs, or even on the street. But working under establishments often comes at a cost: exploitative bosses who’ve never done sex work themselves, profiting from our labour while offering little in return. In these spaces, where the ratio of workers to clients is high, we’re forced into competition. Clients haggle our rates, pit us against each other, and dangle money in front of us, knowing full well they have the upper hand.” Furthermore, the restriction on online platforms makes it more difficult for sex workers to access community, safety resources and information.
While the OSA does not ban online advertising outright in the way that legislation like FOSTA/SESTA has done, we are already seeing shadows of the same impacts. At the same time, there are increasing pushes by policymakers to ban adult services websites and online advertising entirely. The OSA is an omen of what the effects of this might be – increased poverty, greater desperation, higher levels of exploitation, and reduced autonomy and independence.
Sex workers are no strangers to having to hand over personal data in order to access support or the tools they require to work. Any sex worker who advertises online, creates online content, or works in the porn industry will have had to provide some form of ID to be able to work. Others have to provide it to access services such as medical care, or, in legalised systems, to the state in order to work within the law. Sex workers’ data has so often been treated callously by data holders, leading to severe consequences, from individuals being outed through employment checks to the systematic blocking of sex workers at borders – even when they have never engaged in work that breaks the law in either their home country or the one they are trying to enter. There is no guarantee that data handed over under the OSA is safe or secure, and there are legitimate concerns that sensitive data, such as which adult websites people access, may become a target for malicious actors. Sex workers have raised concerns about data privacy for years, but no one has listened.
Young People
At Basis Yorkshire, we support online safety for young people and are acutely aware of the internet's dangers, particularly when it comes to interacting with strangers online. However, we question the execution of these new verification systems and some of the content chosen for censorship.
There have been multiple reports of people bypassing certain age verification systems using video game avatars, which raises questions about how well tested these systems are. VPN (Virtual Private Network) subscriptions have seen a spike, and with many being free and embedded within internet browsers, young people are still able to access a variety of adult content. The fact that some content is now considered ‘forbidden’ may also encourage young people to explore riskier, less-regulated corners of the internet, especially as VPNs disguise their location online.
On the other hand, even before the OSA came into force, Ofcom reported in 2022 that only 17% of young people reported harmful content they encountered online. Now, with the OSA in place but content still accessible via VPNs, some young people may feel even less inclined to report, fearing they will ‘get in trouble’ for bypassing the OSA.
Most controversial is the blocking of non-sexual and non-harmful content, such as content around menstruation and female reproductive health, LGBTQ+ forums, and support groups for addiction. Many isolated young people (some isolated because of their identity) experience shame and turn to the internet for support and reassurance that they are not alone. Being unable to access basic information on these topics could contribute to poor mental health, exacerbated further if familial relationships are strained or they lack a support system in their lives.
Some young people may turn to online spaces in search of guidance and acceptance, or just generally be more willing to talk to people online who offer understanding of their situation and their struggles with sense of identity and self-worth. We know perpetrators seek out this vulnerability, leading to grooming and exploitation and all the violence and trauma this entails.
The OSA also does not guarantee a shutdown of all harmful content, as many young people access content via apps such as TikTok and Instagram. While the OSA now puts responsibility on companies to keep their users safe, the minimum age for such apps is around 13, no ID is required to create an account, and users can simply lie about their age. Companies are then left to monitor and take down unsafe or inappropriate content, which is a massive task given the sheer volume of content shared on these platforms.
With regards to pornography in particular, the reasons why young people seek out this content also fail to be taken into account. Young people rely on information from charities or PSHE lessons, which are often poorly resourced, focus more on judgement, and rarely present sex positively. While pleasure is a significant reason for seeking out content, young people also report using pornography to explore their sexual or gender identity, to fill gaps left by the sex education they received, to communicate, or to understand their own desires. It is vital that RSE guidance goes beyond discussing the harms of pornography: it should meet young people where they are, understand the gaps in their knowledge and the role of porn within their lives, and seek to address these in non-judgemental and non-stigmatising ways.
Social media platforms do ban certain search terms relating to harmful content, but these bans are evaded through purposeful misspellings or new terminology invented by younger generations. More open conversation and education around not only social media, but also the topics deemed harmful (such as grooming, exploitation, eating disorders and mental health), are an important part of guiding and protecting young people.
Conclusion
Sex work (and pornography in particular) has frequently been called the “canary in the coal mine” in relation to digital rights, online safety and censorship, and the OSA is just one of many examples throughout history. Unfortunately, the story is the same time and time again – a good-looking plaster is placed over deep-rooted issues which these policies fail to address. The end result is the same: young people fail to be adequately protected, and sex workers suffer unnecessarily.
We are supportive of moves that will improve the safety and wellbeing of children online, but are concerned that the OSA in its current form will not do this, and will instead cause greater harm. According to the Mental Health Foundation, 68% of young people have encountered harmful content online, and this is unacceptable. However, there needs to be more consideration of the effectiveness of age verification systems and of what content is regulated, alongside further education on online safety, media literacy and exploitation for both young people and the wider public.
It is vital that sex workers are involved in conversations about online safety and data privacy, and that parents and young people who have experienced online exploitation are provided with well-funded, evidence-based support services. Education on social media, sex, relationships, pornography, consent and exploitation must be improved, but should be based in evidence, not ideology. If age verification is not going away, there must be strict and stringent rules for verification providers around data privacy and security to ensure that everyone’s data is secure – including from the government itself.
