David Cameron pointing.
In a directive announced yesterday, David Cameron said that he wants to enforce a blanket block on all online pornography in the UK. Only once you've had a conversation with your family or housemates about whether you really need a fibre-optic link directly into YouPorn's archives – and specifically told your internet service provider (ISP) that you don't want the block enacted in your home – will you be able to access everything you can now.
Besides the government's proposed erosion of our digital privacy, Cameron is also conflating the clearly abhorrent issue of child pornography with all the perfectly legal material on the web. Of course, every legitimate step should be taken to rid the internet of images of child abuse, but unfortunately – as Jim Killock from the digital rights organisation Open Rights Group pointed out to me – a blanket block isn't one of those steps. The paedophilia rhetoric is simply being used to garner public support, and it demonstrates just how little Cameron knows about what he's proposing.
I gave Jim a call to find out what the implications of Cameron's plans would be if he's successful in implementing them all, and how exactly he's going wrong in trying to cleanse the internet of child pornography.
Jim Killock, executive director of the Open Rights Group. (Photo courtesy of Jim Killock)
VICE: Hi Jim. So, Cameron’s announcement is about both child abuse images and filtering legal content. What gives?
Jim Killock: It’s really unhelpful to have to talk about how you deal with paedophiles at the same time as talking about how to deal with protecting children from legal but inappropriate content, because paedophiles are criminals. The material they’re dealing with is deeply unpleasant and needs to be removed from the internet entirely, not just filtered from homes.
So why is child porn being lumped in with all the legal material?
The reason they’ve been lumped together is because, as soon as people hear "child abuse" and "paedophilia", they're automatically looking for clamp-downs and ways to stop things. It’s about pushing people into very extreme measures where they’re not appropriate.
Would you describe the government's presentation of the issue as unfair?
It’s not just unfair. I don’t expect politics to be a fair game, but I do expect politicians to behave responsibly for the public good. I think if you lump these things together, you’ll get bad and dangerous solutions, and that’s where we’re going. We’re having an argument about whether or not search terms should be restricted or banned when we’ve yet to be presented with evidence that that's a major problem.
Would you say that putting pressure on Google and other search engines is Cameron’s easy solution to this problem?
They’re the easy targets, but it’s not really a solution.
Child abuse images aren’t typically accessed through Google anyway, are they?
As far as we know, most images are circulated by peer-to-peer networks – paedophiles working in secret networks. And it seems that’s far more commonly the way this material is circulated than by searches. Apparently there's a case being made that the first rung on the ladder [while looking for child porn], so to speak, is people searching on public search engines. Whether that step can be broken in this way is really a discussion to be had. Personally, I'm sceptical.
I see. So what’s the true solution to getting explicit child abuse imagery off the internet?
The main solutions are investments in policing, so that you can target the users and networks where this is secretly distributed out of sight of search engines. That's the first thing. The second is asking international governments to speed up the takedown of publicly available child abuse images on websites. The third is looking at money laundering, which is the vehicle by which child abuse images are turned into usable cash by criminal gangs. Those are the things that would most reduce the availability of these images and improve the convictions of the people circulating them.
That sounds much more difficult than just shouting at some internet companies.
Yeah, it is, but it's not impossible. Banks, for instance, remove phishing sites within hours; when you get a spam email inviting you to give your credit card details or bank password to a fake bank site, that site will have disappeared within a matter of hours. Sites selling paedophilic material take days to remove – that's how they continue to trade. And you have to ask why that is – why is it that banks can get the job done but governments find it much more difficult? It's about money.
How do you think these filters will affect access to the rest of the internet?
I think a really important thing to remember is that these filters won’t just be about pornography. They'll also be about extremist content and they'll likely include alcohol-related sites, like websites for pubs and bars. They're likely to include sites that promote tobacco and they'll probably include political content identified as potentially being extremist.
I wouldn't be surprised if the VICE site was affected by these filters. I know that the Open Rights Group site is already blocked in libraries – the reason being that we’ve mentioned porn and talk about those kinds of issues. However, our site does not feature pornography. So, with these kinds of problems, you want to limit them – you want to apply the filters where they’re needed, not broadly and indiscriminately. That’s the problem with encouraging people to switch them on – the result may be very, very poor.
Won’t proxies, VPNs and Tor networks get around the filters anyway?
I think the thing to remember is that evading blocks is easier when you have a motivation. Teenagers who want to see porn because they’re sexually curious are going to be extremely difficult to persuade not to access blocked material. It’s not a good way to think about this problem. It’s not to say that filtering shouldn't happen, but you have to realise that a large number of people – particularly the people who are regarded as having the most problematic internet activity – are going to be the hardest people to stop.
The likes of Google and the Internet Watch Foundation already help to identify illegal content, so what does this new directive actually add?
I don’t think it adds a lot. What is needed from Cameron is money and resources, and he doesn’t seem to be offering that. He seems to be offering to get other people to provide solutions. Again, if he was announcing major new money for new people to do investigations then it might seem a bit more reasonable. But focusing it on search engines seems extraordinarily narrow.
Would you say that, rather than consulting technologists, he’s making a political stance on this issue?
I think that’s right. He hasn’t even fully discussed this with the companies, it seems. So it’s unsurprising that this all seems a bit ill thought out.
Finally, do you think this action will definitely come into force?
The search engine stuff is going to be very problematic and take a lot of discussion. He may get some results – who knows? With the ISP filtering, some of that will happen, but hopefully not in the form he’s been suggesting.
Okay. Thanks, Jim.
Follow Sam on Twitter: @sambobclements