{"id":35922,"date":"2025-12-04T06:08:37","date_gmt":"2025-12-04T06:08:37","guid":{"rendered":"https:\/\/naijaglobalnews.org\/?p=35922"},"modified":"2025-12-04T06:08:37","modified_gmt":"2025-12-04T06:08:37","slug":"the-rise-of-deepfake-pornography-in-schools-one-girl-was-so-horrified-she-vomited-deepfake","status":"publish","type":"post","link":"https:\/\/naijaglobalnews.org\/?p=35922","title":{"rendered":"The rise of deepfake pornography in schools: \u2018One girl was so horrified she vomited\u2019 | Deepfake"},"content":{"rendered":"<p>\n<\/p>\n<p class=\"dcr-130mj7b\">\u2018It worries me that it\u2019s so normalised. He obviously wasn\u2019t hiding it. He didn\u2019t feel this was something he shouldn\u2019t be doing. It was in the open and people saw it. That\u2019s what was quite shocking.\u201d<\/p>\n<p class=\"dcr-130mj7b\">A headteacher is describing how a teenage boy, sitting on a bus on his way home from school, casually pulled out his phone, selected a picture from social media of a girl at a neighbouring school and used a \u201cnudifying\u201d app to doctor her image.<\/p>\n<p class=\"dcr-130mj7b\">Ten years ago it was sexting and nudes causing havoc in classrooms. Today, advances in artificial intelligence (AI) have made it child\u2019s play to generate deepfake nude images or videos, featuring what appear to be your friends, your classmates, even your teachers. This may involve removing clothes, getting an image to move suggestively or pasting someone\u2019s head on to a pornographic image.<\/p>\n<p class=\"dcr-130mj7b\">The headteacher does not know why this particular girl \u2013 a student at her school \u2013 was selected, whether the boy knew her, or whether it was completely random. It only came to her attention because he was spotted by another of her pupils who realised what was happening and reported it to the school.<\/p>\n<p class=\"dcr-130mj7b\">The parents were contacted, the boy was traced and the police were called in. But such is the stigma and shame associated with image-based sexual abuse and the sharing of deepfakes that a decision was made that the girl who was the target should not be told.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThe girl doesn\u2019t actually even know,\u201d the head said. \u201cI talked to the parents and the parents didn\u2019t want her to know.\u201d<\/p>\n<p><span class=\"dcr-1inf02i\"><\/span> Composite: Guardian Design; Alistair Berg\/Getty Images<\/p>\n<p class=\"dcr-130mj7b\">The boy on the bus is just one example of how deepfakes and easily accessed nudifying technology are playing out among schoolchildren \u2013 often to devastating effect. In Spain last year, 15 boys in the south-western region of Extremadura were sentenced to a year\u2019s probation after being convicted of using AI to produce fake naked images of their female schoolmates, which they shared on WhatsApp groups. About 20 girls were affected, most of them aged 14, while the youngest was 11.<\/p>\n<p class=\"dcr-130mj7b\">In Australia, about 50 high school students at Bacchus Marsh grammar in Victoria reported that their images had been faked and distributed \u2013 the mother of one student said her daughter was so horrified by the sexually explicit images that she vomited. In the US, more than 30 female students at Westfield high school in New Jersey discovered that deepfake pornographic images of them had been shared among their male classmates on Snapchat.<\/p>\n<p class=\"dcr-130mj7b\">It\u2019s happening in the UK, too. 
It's happening in the UK, too. A new poll of 4,300 secondary school teachers in England, carried out by Teacher Tapp on behalf of the Guardian, found that about one in 10 were aware of students at their school creating "deepfake, sexually explicit videos" in the last academic year.

Three-quarters of these incidents involved children aged 14 or younger, while one in 10 involved 11-year-olds and 3% involved children younger still, illustrating just how easy the technology is to access and use. Among participating teachers, 7% said they were aware of a single incident and 1% said it had happened twice, while a similar proportion said it had happened three times or more in the last academic year.

Earlier this year, a Girlguiding survey found that one in four respondents aged 13 to 18 had seen a sexually explicit deepfake image of a celebrity, a friend, a teacher or themselves.

"A year ago I was using examples from the US and Spain to talk about these issues," says Margaret Mulholland, a special needs and inclusion specialist at the Association of School and College Leaders. "Now it's happening on our doorstep and it's really worrying."

Last year the Times reported that two private schools in the UK were at the centre of a police investigation into the alleged making and sharing of deepfake pornographic images. The newspaper said police were investigating claims that the deepfakes were created at a boys' school by someone manipulating images taken from the social media accounts of pupils at a girls' school.

The children's commissioner for England, Dame Rachel de Souza, has called for nudification apps such as ClothOff, which was investigated as part of the Guardian's Black Box podcast series about AI, to be banned. "Children have told me they are frightened by the very idea of this technology even being available, let alone used," she says.

It's not easy to find teachers willing to speak about deepfake incidents. Those who agreed to be interviewed by the Guardian insisted on strict anonymity. Other accounts were provided by academics researching deepfakes in schools, and by providers of sex education.

Tanya Horeck, a professor of film and feminist media studies at Anglia Ruskin University, has been talking to headteachers as part of a fact-finding mission to uncover the scale of the problem in schools. "All of them had incidents of deepfakes in their schools and they saw this as an emerging problem," she says. In one case, a 15-year-old girl who was new to a school was targeted by male students who created a pornographic deepfake video of her. She was so distressed she initially refused to go to school. "Almost all the examples they told me about were boys making deepfakes of girls," says Horeck.

"The other thing that I noticed is that there's this real tension around how they should handle these issues. So some teachers were saying, 'Yeah, we just get the police in right away and students are expelled' – that kind of approach," Horeck says.
"Then other teachers were saying, 'Well, that's not the way to handle it. We've got to have more of a restorative justice approach, where we're talking to these young people and finding out why they're doing these things.'

"So there seems to be some kind of inconsistency and uncertainty on how to deal with these cases – but I think it's really hard for teachers because they're not getting clear guidance."

Laura Bates, the founder of the Everyday Sexism Project, says there is something particularly shocking about deepfake images. In her book The New Age of Sexism: How the AI Revolution Is Reinventing Misogyny, she writes: "Of all the forms of abuse I receive they are the ones that hurt most deeply – the ones that stay with me. It's hard to describe why, except to say that it feels like you. It feels like someone has taken you and done something to you and there is nothing you can do about it. Watching a video of yourself being violated without your consent is an almost out-of-body experience."

Among school-age children, the impact can be huge. Girls and young women are left feeling violated and humiliated. School friendship groups are shattered, and there can be a deep sense of betrayal when one student discovers another has created a deepfake sexualised image of them and shared it around the school. Girls can't face lessons, while teachers with little training do their best to support and educate. Meanwhile, boys and young men are being drawn into criminal behaviour, often because they don't understand the consequences of their actions.

"We do see students who are very upset and feel betrayed and horrified by this kind of abuse," says Dolly Padalia, the CEO of the School of Sexuality Education, a charity providing sex education in schools and universities. "One example is where a school got in touch with us. A student had been found to have taken images of lots of students within the year group and was making deepfakes.

"These had then been leaked, and the fallout was quite significant. Students were really upset. They felt very betrayed and violated. It's a form of abuse. The police were involved. The student was removed from school and we were asked to come in and support. The school responded very, very quickly, but I would say that's not enough. In order for us to really be preventing sexual violence, we need to be more proactive."

It is estimated that 99% of sexually explicit deepfakes accessible online are of women and girls, but there are cases of boys being targeted. The charity Everyone's Invited (EI), which collects testimonies from survivors of sexual abuse, has encountered at least one such case: "One student shared with the EI education team that a boy in their year group, who was well liked and friends with many of the girls, was targeted when another boy created an AI-generated sexual image of him. That image was then circulated around the school, causing significant distress and trauma."
EI also flags how these tools are being trivialised and used in disturbing ways, such as the "changing your friend into your boyfriend" filter. "On social media platforms like TikTok and Snapchat, they are increasingly accessible and normalised. While this may seem playful or harmless to some, it reflects and reinforces a culture where consent and respect for personal boundaries are undermined."

Against a backdrop of widespread misogyny in schools, a growing number of teachers are also being targeted, EI and others report: "It is something that, as a society, we urgently need to confront. Education has to stay in front of technology, and adults must feel equipped to lead these conversations rather than shy away from them."

Seth James is a designated safeguarding lead – a senior member of staff with overall responsibility for child protection and safeguarding within a school – and the author of the DSL Blog. "For everyone working in schools, it feels like new sets of challenges and risks are constantly being thrown up by technological developments," he says. "AI generally – and particularly deepfakes and nudify apps – feels like the next train coming down the track.

"'More education' is appealing as a solution to these sorts of challenges – because it's intuitive and relatively easy – but on its own it's like trying to hold back a forest fire with a water pistol. And likewise, the police seem completely overwhelmed by the scale of these issues. As a society we need broader solutions, and better strategy."

He continues: "We should all try to imagine how we would have felt 20 years ago if someone had suggested inventing a handheld device which could be used to create realistic pornographic material that featured actual people that you know in real life. And then they'd suggested giving one of these devices to all of our children. Because that's basically where we are now. We're letting these things become 'normal' on our watch."

Jessica Ringrose, a professor of sociology of gender and education at University College London's Institute of Education, has worked in schools on issues including masculinity, gender inequality and sexual violence. She is also co-author of a book called Teens, Social Media, and Image Based Abuse, and is now researching tech-facilitated gender-based violence.

"The way that young people are using these technologies is not necessarily all bad," she says, "but what they need is better media literacy." She welcomes the government's updated relationships, sex and health education guidance, which "recognised that misogyny is a problem that needs to be tackled in the school system". However, she says: "They need to put the dots together. They need to join up a concern with gender and sexual-based violence with technology. You can't rely on Ofcom or the regulators to protect young people. We need proactive, preventive education."
Where is the government in all this? "Our new relationships, sex and health education guidance will make sure that all young people understand healthy relationships, sexual ethics and the dangers of online content such as pornography and deepfakes," a Department for Education spokesperson said. "As part of our Plan for Change mission to halve violence against women and girls, we are also providing schools with new funded resources to help teachers explain the law and harms relating to online content as part of their age-appropriate lessons."

Ringrose stresses the urgency. "These issues are happening – non-consensual creation and distribution of images is happening. These technologies are at people's fingertips. I mean, it's super-easy for any kid to access these things."

She is sceptical about efforts to ban smartphones in schools and worries they will make it harder for young people who may be targeted with abusive imagery to seek help. "Abstinence around things like technology doesn't work," she says. "You actually have to teach people to use it properly. We need to engage with this as a really important element of the curriculum."

Which takes us back to the boy on the bus, where this story began. He was stopped because a girl on the same bus had recently had a lesson in school about online safety as part of her PSHE (personal, social, health and economic) curriculum. She recognised what he was doing and told her teachers.

Education works.

In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.