It often seems that with every technological breakthrough humanity makes, there soon exists a way for it to be used to hurt people.

AI is the new tech buzzword on everyone’s screens, and it didn’t take long for people to start using it to create sexualised images in an attempt to humiliate, violate and hurt others, particularly women and children.

Image-based sexual abuse is sadly neither new nor rare; it has plagued our society since we first started carrying cameras everywhere we go. It refers to the distribution of sexual material without the consent of the person it depicts, or the threat of sharing such material.

Often when sexually explicit imagery is leaked, it is the victim who gets blamed for taking, keeping, or sending the images. In a paper published by the University of Wolverhampton in the Journal of Psychosocial Research on Cyberspace, the concept of victim blaming is examined in the context of the digital world. The study outlines that, due to the sexual nature of the crime, “This cognitive bias leads to the assumption that the crime has befallen a victim as a morally fair consequence of their own actions or vice versa and that bad things only happen to bad people.

“Thus, if someone falls victim to “revenge porn” because they either shared or stored SEM (Sexually Explicit Media), they are viewed to have brought those actions upon themselves and will ultimately be blamed therefore.”

Taylor Swift (Image: free)

Despite the fact that distributing sexual images of someone without their consent is illegal, the response often given to people who speak out about their experiences is that victims of image-based sexual abuse bring it on themselves. If only they hadn’t taken nude photos of themselves, there wouldn’t be anything to leak; if only they hadn’t trusted a partner, or a password, or a secure server, the crime wouldn’t have happened.

Going to extreme lengths to blame those who have had their privacy violated shifts the focus from perpetrator to victim, and ascribes responsibility to the wrong party.

With the ever-developing and poorly regulated world of AI, image manipulation for the creation of “deep fake porn” has become yet another way to abuse women and children online. The ability to use AI to create and distribute pornography is something which can and should be regulated.

After the recent incident in which people made and spread faked nude photos of Taylor Swift, a bipartisan group in Congress introduced a bill within days, labelled the Defiance Act, or Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, which would impose a penalty upon anyone who creates, possesses or distributes non-consensual explicit and sexual imagery.

Many people are calling for Taylor Swift and other high-profile female targets of AI-generated sexual images to pursue the culprits legally, even demanding that they act out of responsibility to other people affected by image-based sexual abuse who aren’t in a financial position to take legal action.

This reaction raises an interesting question about the way society views women with power and money who are targeted sexually: as vehicles for change, rather than solely as some of the many victims of a strange and horrible act designed to hurt and violate them.

Swift has already been named as a “silence breaker” as part of Time magazine’s Person of the Year edition for her role in a two-year sexual assault and battery court case. She asked for only $1 in damages and secured a judgement in her favour, successfully proving she had been sexually assaulted.


Unfortunately, Taylor Swift’s case is a rarity; she is in a unique position to effect change, possessing almost limitless resources, a great deal of visibility, and the unwavering support of millions of people. She recognised this in a statement following the verdict, saying, "I acknowledge the privilege that I benefit from in life, in society and in my ability to shoulder the enormous cost of defending myself in a trial like this."

It is becoming increasingly clear that people with power and privilege can and do effect rapid change within the legal system, and this can be a great way of making an impact quickly, as the swift and decisive action from Congress shows. But it should not take the targeting of a high-profile celebrity to make politicians and lawmakers act decisively and constructively.

It’s also clear that social media platforms have the capacity to limit or entirely prevent the sharing of such images. Within hours of the images being shared on X, the site had restricted searches for terms associated with the incident and removed many of the explicit images, bypassing the usual reporting process, which takes far longer and isn’t always successful.

Mia Janin (Image: free)

That it has taken such high-profile coverage to inspire lawmakers to act is frustrating, as the use of online tools to create sexualised imagery, even of children, has already proven to have potentially devastating consequences. Mia Janin, from London, was only 14 when pupils at her school reportedly created a Snapchat group with more than 60 members as part of an extended campaign of bullying.

Photos of Mia and other girls were allegedly combined with explicit imagery taken from pornography and distributed throughout the group, according to students interviewed about the bullying. Mia died by suicide in March 2021, and her father, Mariano, has called for police to set up a dedicated taskforce to target cyberbullying within schools.

The speed with which both politicians and social media platforms have responded to image-based sexual abuse and AI-generated explicit imagery when the victim is someone in the public eye is shocking, but not surprising. Deepfake pornography and image-based sexual abuse are neither new nor rare, and a great many people have been coming forward for years to call for a response at both a legal and societal level.

We must do more to ensure that, regardless of who the victim is or what power they have access to, they are afforded the same level of dignity and the same decisive action from legislators and those with the influence to improve and reform legal systems.