
Not Swift Enough: The battle to remove Taylor Swift deepfakes and live-streamed child sexual abuse


Late last month, sexually explicit deepfake images of superstar Taylor Swift exploded across the internet. IJM's John Tanagho explains why industry actions to address child sexual exploitation material, made with or without the use of AI, pale in comparison to that response.

February 8th 2024

John Tanagho, Executive Director, IJM Center to End Online Sexual Exploitation of Children

Late last month, sexually explicit deepfake images of Super Bowl-bound Taylor Swift exploded across the internet with tens of millions of views. One image was seen 47 million times before X removed it 17 hours later.

While clearly not fast enough, the response was strong. X said it was "actively removing" the images and taking "appropriate actions" against the accounts involved in spreading them. The name "Taylor Swift" was temporarily unsearchable on X, alongside "Taylor Swift AI" and "Taylor AI."

U.S. politicians called for “new laws to criminalise the creation of deepfake images.” Microsoft reported that it strengthened “existing safety systems to further prevent our services from being misused to help generate images like them” in the first place. Ensuing media coverage was intense.

Yet, broadly speaking, industry actions to address the rampant creation of images and videos, recorded or livestreamed, of real children being sexually abused, with or without the use of AI, pale in comparison.


For example, according to Australia's eSafety Commissioner, tech companies reported that they do not detect, prevent, or address child sexual abuse material (CSAM) created and distributed “live” in video calls and livestreams.

In fact, a 2020 study by the International Justice Mission (IJM) found that children in the Philippines were sexually abused and exploited online for at least two years, on average.

Another IJM study released in 2023 found that nearly half a million children in the Philippines were exploited in this way in 2022 alone. These children are abused in their homes while men around the world, including in the UK, pay to direct and consume the abuse live using the same popular social media and video chat apps we all use every day.

Read the 'Scale of Harm' study

[Image: John Tanagho at the Scale of Harm launch, 2023]

IJM has a robust programme that, to date, has supported Philippine authorities to bring to safety over 1,200 victims and at-risk individuals, while supporting the government and key stakeholders to holistically strengthen the response to these crimes.

And we will continue that critical work until Filipino children are protected from this violence.

In fact, since 2022, through Project Boost, IJM’s Center to End Online Sexual Exploitation of Children has partnered with the U.S. National Center for Missing & Exploited Children (NCMEC) and Meta to train law enforcement in other countries, including Nigeria and Kenya, to investigate cases of online sexual exploitation of children. We’re sharing our proven model from the Philippines to support other governments to protect their children too.

But to help reduce the massive scale of this harm, the private sector must do more to address these crimes happening on and through their apps and platforms.

That “more” includes building their platforms safe by design to prevent harm, while also supporting efforts by organizations like IJM to strengthen law enforcement capacity to successfully investigate priority reports, bring children to safety, and hold offenders accountable.

Why should this matter to people in the UK?

Because, according to 2023 research by the Philippine Anti-Money Laundering Council, payments flagged by the financial sector as “suspicious transactions” for child sexual exploitation in the Philippines chiefly originated from the UK, United States, Australia and Canada.

We might feel overwhelmed or confused about how to protect people – children and adults – from hands-on and AI-generated exploitation, but there's hope. Numerous common-sense bills are pending in the U.S. Congress, waiting for politicians to vote on how the U.S. will rise to the challenge of child protection online.

With this legislative backdrop, on Jan. 31 the U.S. Senate Judiciary Committee heard hours of testimony from five major tech CEOs, with senators demanding that they do more to protect children from abuse online and stem the tide of CSAM.

What the hearing made abundantly clear is that U.S.-based multinational tech platforms are not built safe by design at their core, and that the industry as a whole needs U.S. laws now in order to become safe.

Deepfake pornography that harms everyday citizens and celebrities like Taylor Swift, AI-generated CSAM, and livestreamed child sexual abuse are all prime examples of why Congress must require companies to embed safety into their products before rolling them out.

Because safety by design is not mainstream across the tech sector, popular video call and livestreaming apps are easily weaponized by sex offenders to livestream child sexual abuse – and much more could be done to identify them and hold them accountable.

In summary, all companies should use safety technology to prevent as much online exploitation as possible, while promptly and robustly reporting harm when it happens so law enforcement can do its job.

[Image: Taylor Swift. Credit: Rolling Stone Magazine]

Legislation to protect people – whether celebrities like Taylor Swift or ordinary citizens – from harmful deepfake images is simply a must.

Protecting all children and survivors from the ongoing trauma of sexual abuse and exploitation, including CSAM, is long overdue and urgently requires a strong industrywide response.

The time for change is now.

Learn how you can support IJM's work combating online sexual exploitation and abuse


