Sweet, my day of Increasingly Erratic Privacy Blogging isn’t halfway through yet and I was starting to worry about having enough to say. Luckily, there’s this:
Do you expect the machine to solve the problems? In this wide-ranging interview with the Director of the Open Rights Group we discuss bulk collection, state bureaucracies, the pre-crime era and trust.
Right up my alley.
[Ken Macdonald QC, former Director of Public Prosecutions] stated that public trust in the organs of the state was going to be crucial, because from then on, “Finding out other people’s secrets is going to mean breaking everyday rules of morality.”
I’m not sure that’s quite right. I think those everyday rules of morality increasingly don’t apply. I don’t think we understand our own responsibilities when it comes to our data. We’re even less equipped to identify a culprit when something bad comes of the decisions we make in an ever more connected world. We badly and urgently need to change what we think of as everyday rules of morality to reflect how the world is now. This applies to us as citizens of the internet as well as to the bodies that administer our lives.
Now, what the paper completely fails to address is how that precondition, that essential public trust, could possibly survive a system under which the security services were empowered by law to routinely trawl through the private communications data of vast numbers of citizens suspected of no crime, simply in order, as Sir David Omand puts it, ‘to identify patterns of interest for further investigation’. How would the public regard their security services in that world?
If you live… well, at last count, anywhere… that question has largely been answered. Our trust has been funnelled elsewhere. We’re supposed to trust our governments to accurately strike a balance between some nebulous idea of security (against terrorists and criminals) and the giving up of freedom. We’re supposed to trust them when they minimise or ignore the potential costs of terrifying security measures. They are careful to make it difficult for us to understand the threats and, especially, the consequences of the claimed countermeasures.
Of course, such a world would change the relationship between the state and its citizens in the most fundamental and, I believe, dangerous ways. In all probability, it would tend to recast all of us as subservient and unworthy of autonomy. It would destroy accountability and it would destroy trust.
Well, let’s hope so; I don’t see any of that happening, yet. We’re not, as societies, challenging the decisions made by our governments on our behalf about the eradication of fundamental freedoms. Plainly, this is by design: our governments could choose to educate us and involve us in those decisions if they wanted to. That they invariably choose otherwise suggests they know that nobody in their right mind would agree to many of those decisions if they understood the consequences.
This is for one very simple reason: because to abolish the distinction between suspects and those suspected of nothing, to place them entirely in the same category in the eyes of the state, is a clear hallmark of authoritarianism.
Personally, I’d call it fascist. It’s one thing to investigate a suspect by looking at the footprint they leave on the world. It’s quite another to generate suspects that way.
Jim Killock responds:
It is much easier to oppose something when it hasn’t apparently happened: to anticipate the problems and say, “We don’t want this kind of power to exist”.
As soon as you’ve materialised that power, and that is what has happened under ‘bulk warrants, bulk collections’, it is much harder to say, “Well, actually, the billions of pounds that you have invested in this system, the integration with the NSA that you have done for strategic reasons – that must stop. I wish to oppose this, to dismantle it, and essentially wish you to turn your back on the investment you have put in.”
He says a number of other correct things, too. You’d be crazy not to read it all.