
Inside the Organisation That Rids the Internet of Child Sex Abuse

We visited the headquarters of the Internet Watch Foundation, whose analysts are tasked with searching out and identifying child sexual abuse images and videos.
All photos: Chris Bethell

Until you reach the hotline room, there are few clues as to what's actually going on at the Internet Watch Foundation's Cambridgeshire headquarters.

The organisation's office has all the hallmarks of any other modern, nondescript workplace: a handful of awards in a cabinet in the lobby; a table tennis table in the break room; cheap Christmas decorations hanging over a few of the desks. But to reach the hotline room – the epicentre of the office – you've got to pass through a series of doors, deposit all electronic and recording equipment (phones included) into secure lockers, ring a bell and wait to be invited in.


Since the 13 analysts on the other side of all that security have special legal dispensation to engage in illegal activity online, it makes sense that they do their work behind closed doors. Frankly, you wouldn't want to wander in accidentally and see what might be on their screens.


The Internet Watch Foundation (IWF) was founded on the 10th of December, 1996, says Chris – the manager of the hotline – as he shows me around the office and into a meeting room. Two decades might have passed since then, but its core mission remains the same: to remove child sexual abuse images and videos from the internet. According to their most recent annual report, the IWF identified close to 80,000 URLs containing child sexual abuse imagery in 2017, up 37 percent from the 57,335 identified the year before.

"We are the UK hotline for the identification, location and removal of child sexual abuse images," explains Chris, adding that this is one of 52 or 53 hotlines which do this sort of work worldwide. "The first thing we deal with are inbound reports. Members of the public refer websites and URLs to us. We then review the content and make a determination of whether or not we believe the content is within our remit and whether it's illegal."


The team don't just deal with public reports; they also proactively search for this kind of content. Following the brutal murders of two British schoolchildren – Tia Sharp and April Jones – then-Prime Minister David Cameron held a summit at Downing Street in April of 2014 to find out what more could be done to protect children. Since then, analysts at the IWF have been using their knowledge to search through some of the darkest corners of the internet in an effort to find and remove child sexual abuse themselves.


With a Memorandum of Understanding agreed under the Sexual Offences Act, Chris and his team of highly trained analysts have legal protection to view this illegal content. "Once we've determined something is both illegal and in our remit, we categorise the image or video," he explains.


Just as Chris is about to start explaining what these categories are, the door to our meeting room opens. Someone walks in with a tray of sandwiches. Chris pauses mid-sentence, waiting for her to leave. "Even when people work here, they can become quite distressed by some of the terminology," he tells me, once the door has closed behind her.

Without going into graphic details, Chris explains that images and videos of child sexual abuse are labelled either A, B or C. "Category A content really deals with rape and sexual torture of children," he says matter-of-factly. "Category B means non-penetrative sexual activity with a child, and/or solo masturbation." The category of the lowest severity is C: indecent images of children.

Once this grading is completed, the analysts use specialist software and techniques to locate the country in which the content is physically hosted. The team then work closely with law enforcement agencies internationally to ensure that what they've found is quickly investigated and taken down.

Chris has managed the work of the IWF hotline for six-and-a-half years, and in that time has seen an increase in the amount of child sex abuse online. "Firstly, over time, the internet has become more available across the globe," he says, "and more and more devices are able to capture images and videos, which adds to the numbers. You've also got the newish scenario of youngsters having mobile phones themselves – able to take images and videos of themselves and share them with what they believe is a trusted partner or friend, but those images end up becoming available online, too."


In 2017, IWF analysts assessed a webpage every four minutes; every seven minutes, that webpage showed a child being sexually abused. To ensure those on the front line aren't left to deal with the impact alone, there are a number of welfare provisions in place at the organisation.

The analysts' working day lasts just seven-and-a-half hours, while breaks can be taken whenever an analyst needs one – and a minimum number are mandatory every day. Each analyst must also attend a counselling session at least once a month.

"It takes its toll, it's a tough job," says Henri*, one of the analysts, when he joins us in the meeting room. "But while all us analysts are very different, the one thing we have in common is that we have resilience. We can tell [the content is] awful – but we know we can handle it and carry on." He goes on to compare their work to that of a firefighter or an A&E nurse or doctor: it's unpleasant and painful, but you learn to carry on.

With a background in criminal and forensic psychology, Henri joined the IWF 11 months ago. After spending time researching child sex abuse in an academic setting, he wanted to do something more practical to help tackle the issue head on.

"There are certain images you see a lot," Henri continues. "There are some images and videos which seem to be more popular to be shared by paedophiles […] Something that always sticks with me is that you can almost watch children grow up in front of your eyes. You might see the child one day at five years old, the next day at 13 or 14, then 16 or 17. It's awful to think they’ve been abused through their life for so long. It took me a while to get used to that – watching someone grow up through abuse."


Of course, Henri makes clear, you want to reach into the computer and save them. "Instead, the best we can do is investigate fully and get [the content] taken down, avoid them being re-victimised and hope one day they get saved." That said, when Henri and his colleagues do come across illegal content, they remain vigilant for any clues that might give away who either the child or the abuser is – and this can sometimes lead to a child being rescued.

"We had one incident recently where one of our analysts came across a video and did a report," says Henri. While a lot of what we see is recycled material, there are often new images and videos too." The video in question was of a 12-year-old girl, and it became apparent that she was British. Using their expertise, the analyst clocked some identifiable features. "He passed the report on to the National Crime Agency, as we do with all victims," says Henri, for the first time showing a hint of a smile. "They were able to intervene and save her from abuse just a few days later."

It's for exactly this reason that public reports made to the IWF are so important: a few clicks of a button from you or me could quite literally change a child's life. In 2017, the British public brought close to 60,000 suspected images or videos to the hotline's attention. While not all of the reports were within their remit (someone's age isn't always obvious to the untrained eye), some 35 percent were found to show illegal child sexual abuse.


Research conducted in 2017 found that 76 percent of 18 to 24-year-old men would report an indecent image of a child online. The biggest barrier preventing them from doing so is not knowing how or what to report. "It's very useful for people to report to us," says Henri. "It's important that if you come across these images, you report them – and it can be done anonymously. We don't require contact details, although you're welcome to leave them if you want feedback. That one report – even if you're unlucky enough to stumble across it – might see you save a child from sexual abuse."

That's where the IWF's reporting site comes in. As long as it's legal, what you get up to on your computer is of no concern to anyone. But the internet doesn't always do exactly what we ask of it. The IWF isn't asking us to be vigilantes, setting out to find this stuff ourselves – far from it; to do so would be illegal. But if you happen to accidentally come across what you suspect might be child sex abuse while browsing, make sure to report it before closing the tab. Who knows what it might lead to.

Find out more here.

*Name has been changed for security reasons.

@MikeSegalov / @CBethell_photo