16 Days of Activism Against Gender-Based Violence 2025: Image-Based Abuse

Content warning: this blog discusses image-based abuse, sexual abuse and deepfake pornography.

The 10th of December marks International Human Rights Day and the end of the annual 16 Days of Activism Against Gender-Based Violence campaign. Since it was established in 1991, the campaign has grown significantly in reach and importance: 1 in 3 women will experience gender-based violence in their lifetime. Violence against women and girls is a global emergency that must be addressed.

This year’s campaign focus is online abuse, highlighting the escalating levels of harm taking place in digital spaces and the urgent need for safety, justice and accountability. Online spaces were originally seen as places that could empower women, girls and gender minorities. Yet digital platforms have instead become sites of abuse, harassment and surveillance. Between 16% and 58% of women and girls report experiencing digital violence, reflecting the deeply entrenched gender inequalities that exist within society.

Image-based abuse, including deepfake pornography, is growing rapidly and is discussed throughout this blog.

What is Image-Based Abuse?

Image-based abuse is an umbrella term for any act in which intimate or private images are created, shared or threatened to be shared without consent. It is a form of sexual violence.

It is often referred to as ‘revenge porn’, yet this term can lead to harmful misconceptions and narratives.

  • It suggests that the survivor has done something to deserve ‘revenge’
  • It frames sexual violence as a relationship dispute rather than a violation of consent
  • It is harmful rhetoric towards sex workers, as it conflates pornography with abuse. Whilst the two can coexist, they are not the same

Image-based abuse can occur through social media, messaging apps, pornography sites, subscription platforms and more. Perpetrators often use images to shame, control or extort victims. The impact can be devastating:

  • Fear for personal safety
  • Emotional trauma
  • Stigma
  • Professional consequences
  • Long-term threats to wellbeing

Image-based abuse is not a one-time occurrence. Every time the image is viewed or shared, the violation begins again.

Deepfake pornography

Deepfake pornography is a rapidly growing form of digital abuse, enabled by advances in AI. It involves using AI to generate fake sexual images or videos, typically placing someone’s face onto another person’s body. The people targeted have not consented, and the images produced are becoming alarmingly realistic.

Between 2019 and 2023, the number of deepfakes increased by 550%. Gender plays a significant role in who is targeted, with 99% of deepfake pornography depicting women.

Deepfake tools are built on unequal algorithms. Many are not designed to work on male bodies. This bias is not accidental but a by-product of the misogynistic culture shaping AI development. Women and girls are placed directly in the firing line of this abuse. 

Impact on sex workers

Sex workers experience some of the highest levels of image-based abuse, yet this sexual violence is often overlooked.

For online sex workers, sharing images online is often a necessary part of selling their services, yet it leaves them highly vulnerable to having those images shared without their consent.

The sharing of private images without consent removes people’s control over their privacy and undermines their autonomy. It also has negative consequences in their personal lives, putting them at risk of stigma and judgment, physical and mental harm and further professional consequences. The high levels of stigma and discrimination against sex workers can make the consequences of image-based abuse even more severe, ranging from threats to housing, relationships, safety and income.

Image-based abuse is a widespread problem, and whilst greater awareness is being drawn to the growing threat it poses, little attention is paid to the violence and harm that image-based abuse and surveillance inflict on sex workers.

Deepfake videos often use the bodies of sex workers without credit, payment or consent. This is a form of digital exploitation layered onto existing inequalities.

The law in the UK

Under current UK legislation, the Online Safety Act 2023 made it a criminal offence to share or threaten to share intimate images without consent. It also placed new duties on social media platforms to remove this material and prevent it from reappearing. These were important and long-overdue steps toward recognising image-based abuse as a serious form of sexual violence.

However, significant gaps remain, particularly around deepfakes. Despite the severity of the harm they cause, it is still not explicitly illegal to create sexually explicit deepfakes of adults, whether or not the person depicted has consented. This leaves survivors exposed to one of the fastest-growing forms of sexual abuse.

Enforcement of existing laws also remains inconsistent. Response times from platforms are slow, police action varies widely across the country, and many survivors struggle to get images removed or to access meaningful justice.

If you need support

Experiencing image-based abuse can feel overwhelming, isolating and frightening, but it is important to remember that you are not alone; support is available. Removing images, reporting perpetrators and accessing emotional support should not be a journey anyone has to undertake alone.

There are several organisations in the UK that can help navigate these challenges, two of which are listed below.

The Revenge Porn Helpline offers a free, confidential advice and specialist support service for anyone over the age of 18 who has experienced the non-consensual sharing of intimate images.

The Trusted Flagger scheme offers the immediate takedown of online content across all Aylo platforms (PornHub, YouPorn, RedTube, GayTube, Peeperz and PornMD).

What Basis Yorkshire can support with

At Basis Yorkshire, we offer support to sex workers who identify as women or non-binary, working to end stigma, promote empowerment and create safety.

We understand that image-based abuse is not only a digital issue but a form of sexual violence that affects all aspects of survivors’ lives. We offer non-judgemental, person-centred and confidential support, and we will help you pursue what justice looks like for you.

We recognise that online violence is not gender-specific, but women, girls and gender minorities are disproportionately targeted and less protected. Safety is not a privilege; it is a human right. We call for stronger legislation and greater education around consent and relationships to make the online world a safer space for all.

No one should face this alone, nor have their autonomy stolen in digital or physical spaces. Together, we can create an online world where consent, safety and dignity are respected.

If you or someone you know needs support, please contact us:

Call: 0113 243 0036

Email: info-basis@basisyorkshire.org.uk

Or you can make a Referral if the waiting list is open.
