(Content warning: This story features descriptions of an ad that may be triggering or upsetting to some readers, as it is based on real accounts of child abuse survivors.)
The Canadian Centre for Child Protection (C3P) has developed a new video as part of a campaign to shock tech companies into being more proactive about removing the millions of images and videos of child sexual abuse material online.
The spot depicts a predator and a young victim in a family room, but the story continues from there, following the victim through her teen years and into adulthood to show how the presence of images of her abuse online has followed her throughout her life.

The ad is appearing on YouTube and is being shared through channels like Instagram and LinkedIn.
[iframe_youtube video="G0BkdY5OvNs"]
Last year, C3P and agency partner No Fixed Address used Twitter's 15th anniversary as an opportunity to call out the platform's lacklustre track record on preventing child abuse. According to Lianna McDonald, executive director of C3P, the campaign's scope has been expanded to the technology sector more broadly because of its "woefully inadequate" response to removing abusive images of children. The goal is to "demand more" of tech companies to protect children and future generations online.
Trent Thompson, VP and CD at NFA, says the sharing of child sexual abuse material (CSAM) online is a multifaceted problem, and this particular video focuses on a specific element as an education piece to spur legislative change. When compromising material is shared again online, survivors are re-victimized, and that's a story that's not often told, he says.
“When tech fails to remove the abusive images and videos, these crimes aren’t just memories – they can remain online for years,” Thompson says. “Haunted by the visual of survivors unable to escape their torment, we aimed to create a small glimpse into their experience for the viewers to have a better understanding and rally together to demand change.”
The spot directs viewers to a microsite demanding action from both tech companies and lawmakers. The campaign’s timing coincides with the five-year anniversary of Project Arachnid, a tool operated by the C3P that detects known images of child sexual abuse material and issues removal notices.
Over the last five years, six million images and videos of CSAM have been removed from the internet through Project Arachnid. But the rapid growth of online platforms that feature user-generated content has made the distribution of CSAM easier than ever, and the volume of material is growing at an exponential rate.
Before being released to the public, the film was aired at the G7 Summit in London as a call to action for world leaders to hold tech industry leaders accountable in the fight against CSAM. The international response to that showing prompted the decision to share the creative with the broader public, Thompson says.
“It’s a big initiative and a lot of favours were called in to make this, because everyone recognized how important a story it is,” Thompson says.