Cara Hunter, an SDLP MLA who was targeted by a deepfake video, describes the experience as “deeply humiliating”, but says her campaign is about “shifting the shame and the blame.”
Hunter says: “For me, I started the campaign due to the level of distress that I had experienced during my election, which was unfathomable. I have always said I have never been the same person since, and it really lowered my confidence. The idea that children are now experiencing the same thing.”
Hunter emphasises a deeply unsettling shift: years ago, predators would ask for sexual images of children; now they can create them.
“As an elected representative, I want to ensure that, most importantly, our children are safe. 98% of deepfakes are of women, I think that speaks to the gender-based tech-facilitated violence that we’re seeing.” She continues, “so for me as a legislator, this is the stuff that I really think is important.”
Police warn of rising threat from sexual deepfakes

A new police-commissioned survey, published 24 Nov 2025, looked at people’s attitudes towards deepfakes and, more specifically, those that are sexual or intimate in nature and that disproportionately target women and girls. This type of content is thought to have increased in prevalence by 1,780% between 2019 and 2024.
Three in five people said they are very or somewhat worried about becoming a victim of a deepfake.
UN Women, the United Nations agency for gender equality, says: “We already live in a world where at least one in three women experience physical or sexual violence. Enter a host of extremely powerful AI tools, trained on existing gender-biased data, now enabling that violence to spread further, faster and in more complex ways. It’s a perfect storm.”
Hunter says this is a unique situation. She wants to see robust laws and, most importantly, hopes to see ethics education on this technology brought into schools to prevent cyberbullying.
“For the first time, I am filled with hope that we can get this law over the line.”