Online child sexual abuse exists because it is allowed to exist. Social media companies must act – Professor Debi Fry

An Instagram prompt warned users they may be about to see child sexual abuse material, but then asked if they would like to ‘see the results anyway’

Who would deny that Taylor Swift is one of the most successful singer-songwriters of all time? Her latest album, 1989 (Taylor's Version), broke a record for vinyl sales in the 21st century, becoming her sixth to sell over a million first-week copies in the US alone.

But here’s an even bigger number to ponder: 47 million. It’s the number of times people reportedly viewed a sexually explicit deepfake image falsely purporting to be of the musician before it was finally removed last weekend by X, the social media platform formerly known as Twitter.


Sadly, Taylor’s ordeal isn’t an isolated case. There have been many other victims of the creation and sharing of such non-consensual deepfake sexual images, many of them children among her fanbase. And because many victims lack the means to act, these images can be left to languish online for months or even years, retraumatising them again and again.

This next statistic may shock you even more: 32 million. That is the number of reports filed last year of online sexual images of children subjected to sexual exploitation and abuse – reports made to the watchdog NCMEC by companies such as X, Facebook, Instagram, Google and WhatsApp, and by members of the public. Be in no doubt: we are in the grip of another global pandemic, one that needs to be tackled urgently as a public health emergency.

I hope the big tech giants can agree, because some of the evidence presented at last week’s US Senate judiciary committee hearing on child safety on the internet was almost as startling as these numbers. At one point, Ted Cruz, the Republican Senator, asked Meta’s Mark Zuckerberg “what the hell were you thinking?” about an Instagram prompt that warned users they might be about to see child sexual abuse material, but then asked if they would like to “see the results anyway”.

Mr Zuckerberg’s apology to families whose lives have been torn apart by the worst of social media is welcome. Tougher action would be infinitely more so, especially at a time when the decision to make end-to-end encryption the norm for images and videos on file-sharing apps like his risks putting the privacy rights of an abuser above the privacy and safety rights of a child.

Last year saw the launch of the Childlight Global Child Safety Institute, based at the University of Edinburgh and supported by the Human Dignity Foundation. Our mission is to safeguard children around the world through the illuminating power of data, insights and partnerships.

The data we are gathering helps better understand the nature of child sexual abuse and exploitation. Meanwhile, the world’s first global prevalence index, which we publish next month, will highlight the scale of the crisis – one that is very real in every part of the world and has grown exponentially in recent years as people spend more time online.

That understanding should better inform the decision-makers formulating responses to a problem that is transnational. An abuser in, say, Scotland can electronically transfer funds to another abuser to perpetrate an atrocity in a second country, with the files stored in a data centre in a third country and, within a matter of minutes, shared around networks in dozens of other countries.

This, in turn, increases demand for such content among new users and increases rates of abuse of children. And that may mean more perpetrators moving from online offending – like viewing child abuse material, engaging in sexually explicit webcam interactions with children or having sexual conversations online – to contact offending. Worryingly, many offenders are tech savvy and our research suggests many would do more than view child abuse material online if they thought they could get away with it.


We know that child sexual abuse is deeply harmful not only to children and young people but also to wider society. Evidence links it to poorer mental health, poorer physical health (including chronic disease and early death), high-risk behaviours, negative educational outcomes, and under-employment. So it is, of course, much more than simply a crisis in pounds and pence. But it is worth noting that the economic and social cost to victims of contact child sexual abuse in England and Wales has been estimated at just over £10 billion in the year to 31 March 2019.

New technology-facilitated child sexual abuse threats are also emerging, ranging from online extortion to abuse in virtual settings, with British police recently launching an investigation into allegations of sexual assault in the metaverse.

There are many steps we can take to help, including prevention and education programmes that target children, parents, caregivers, educators, and people concerned about their own behaviour. Raising awareness of the signs of abuse, and of the importance of reporting it, is also crucial.

The new Online Safety Act helps too, with its aim of making the UK the safest place in the world to be online, supporting preventative action and recognising that increased regulation is central to making online environments less conducive to the activities of offenders. And so, to the tech companies – which are increasingly turning off the lights, avoiding responsibility for safety on their platforms by embedding total privacy into their technology, practices and infrastructure. Concerningly, most online content-sharing companies do not even currently publish metrics showing the reach and views of child abuse content, or how quickly they take it down.

Let’s be clear: online child sexual abuse exists because it is allowed to exist. So, we need stronger tech company accountability now – strengthening regulatory frameworks, metrics and gold standards for detection and response. Like that best-selling singer-songwriter, children deserve better – and they can’t wait.

Debi Fry is director of data at Childlight Global Child Safety Institute and a professor of international child protection research at the University of Edinburgh