Wednesday, 19 November 2014

Government believes saying a thing makes it true, surprising nobody

“The UK's major internet service providers (ISPs) are to introduce new measures to tackle online extremism, Downing Street has said.”

The ISPs seem bemused because they didn’t agree to any such thing.

Campaigners called for transparency over what would be blocked.

Did we? I’m pretty sure we campaigned for there to be no filtering at all and no government interference with ISPs but since this is obviously going to happen I’d certainly prefer transparency, accountability and judicial oversight.  Since the government apparently hasn’t even told ISPs what they’ve supposedly already agreed to, this seems a forlorn hope.

Prime Minister David Cameron said technology companies had a "social responsibility" to deal with jihadists.

They have a social responsibility to resist governments telling them what people can and cannot see and do. Government agendas should not influence people’s access to information. We have laws for that sort of thing. Laws that are independent of any particular government. For the most part. In principle. Probably.

In a briefing note, No 10 said the ISPs had subsequently committed to filtering out extremist and terrorist material, and hosting a button that members of the public could use to report content.

I’ve no idea what that means. Every time I try to think about it, I picture the CEO of some ISP hitting a big red button on her desk causing lots of alarms to ring and everyone to run around in a blind panic but no terror attacks actually being averted.

Apparently:

It would work in a similar fashion to the reporting button that allows the public to flag instances of child sexual exploitation on the internet.

But that reporting button appears to belong to the police, not to the hundreds of ISPs in the UK. That’s because child abuse is a matter for the authorities, as is grooming and violence of other kinds. Why would anyone report stuff like this to their ISP? Who would even think of it? And if they did, it wouldn’t be very safe. I use Twitter to complain about idiots and talk about my cat. I wouldn’t use it to blow whistles. ISPs have no procedures to protect people reporting nasty practices and nor should they. It isn’t their job. And how would you complain if you thought your ISP was complicit? It’s the wrong solution in the wrong place and everyone knows it.

I don’t even know what threats the government is trying to address and neither do you. Neither does the government.  That might explain why the countermeasures are so blithering, ineffective even in principle and under nobody’s oversight.

Unsurprisingly, the ORG talks sense:

We need the government to be clear about what sites they are blocking, why they are blocking them and whether there will be redress for site owners who believe that their website has been blocked incorrectly.

Given the low uptake of filters, it is difficult to see how effective the government's approach will be when it comes to preventing young people from seeing material they have deemed inappropriate.

Anyone with an interest in extremist views can surely find ways of circumventing child-friendly filters.

Well quite. Governments shouldn’t get to weasel out of their responsibilities. ISPs aren’t like gas companies. Gas companies are responsible for people not being blown up unless they deliberately vent a load of gas into their house and strike a match. Actually, I’m not sure where I’m going with this analogy because it would involve gas companies deciding what people are allowed to cook or how they should heat their home. Actually, maybe it’s a decent analogy after all: if my gas company decided I was using too much gas to heat my house I’d probably light all the hobs on my oven to generate some extra heat. I’d probably do it just to piss them off.

To help deal with the problem, the Met Police set up a dedicated Counter Terrorism Internet Referral Unit (CTIRU), tasked with trying to remove terrorism-related material.

I have no problem with this in principle. It sounds like the sort of thing the police (not ISPs) ought to be doing.

Since its inception in 2010, CTIRU has removed more than 55,000 pieces of online content, including 34,000 pieces in the past year.

Kind of worried about the practice, though.

