Wednesday, 27 May 2015

Scotland Yard wants to spy on us now

Closer vigilance of UK Muslims now needed

This is Scotland Yard commander Mak Chishty. He says that Islamist propaganda is influencing children as young as five and that this should be countered with intensified monitoring to detect the earliest signs of anti-Western sentiment.


The Guardian says so

Scotland Yard commander Mak Chishty said children aged five had voiced opposition to marking Christmas, branding it as “haram” – forbidden by Islam
So? I’m not particularly fond of Christmas myself.  I’m not sure I see a huge difference between five-year-olds being told that Christmas is haram and being told that it’s of magical importance.
He also warned that there was no end in sight to the parade of British Muslims, some 700 so far, being lured from their bedrooms to Syria by Islamic State (Isis) propaganda.
That might be the case, but I’m having difficulty in understanding why that’s the Met’s business. I’m not sure it’s anyone’s business, although I’d personally very much prefer it if people here and elsewhere didn’t go to fight for IS.
In an interview with the Guardian, Chishty said there was now a need for “a move into the private space” of Muslims to spot views that could show the beginning of radicalisation far earlier.
And here we have the part that’s nobody’s – and certainly not the Met’s – business.  ‘Moving into the private space’ sounds fairly innocuous, doesn’t it? An operational matter so routine that it has its own jargon. But what it means is spying:
He said [radicalisation] could be shown by subtle changes in behaviour, such as shunning certain shops, citing the example of Marks & Spencer, which could be because the store is sometimes mistakenly perceived to be Jewish-owned. Chishty said friends and family of youngsters should be intervening much earlier, watching out for subtle, unexplained changes, which could also include sudden negative attitudes towards alcohol, social occasions and western clothing.
So he doesn’t want to spy on people himself, he wants everyone to spy on each other. And then report behaviour he personally defines as suspicious to the police.  What does he imagine the police are going to do about it?  Arrest people for not wearing M&S pants?
But some will argue that his ideas walk a fine line between vigilance in the face of potent extremist propaganda and criminalising thought.
There is no “fine line” here. He’s saying that people with certain beliefs, or who do certain benign things, ought to be subject to automatic suspicion and unspecified police action, regardless of whether they’ve done anything wrong. I think it’s essential to prevent grooming of all kinds, but I doubt that spying on those groomed is the way to achieve it.
“We need to now be less precious about the private space. This is not about us invading private thoughts, but acknowledging that it is in these private spaces where this [extremism] first germinates. The purpose of private-space intervention is to engage, explore, explain, educate or eradicate. Hate and extremism is not acceptable in our society, and if people cannot be educated, then hate and harmful extremism must be eradicated through all lawful means.”
On the contrary.  We need to be more “precious” about private space, precisely because people like him wish to invade it.
Asked to define “private space”, Chishty said: “It’s anything from walking down the road, looking at a mobile, to someone in a bedroom surfing the net, to someone in a shisha cafe talking about things.”
Gaze is suspicious? Looking at our phones is something the police might need to know about?
Questions should be asked, he said, if someone stops shopping at Marks & Spencer or starts voicing criticism. He said it could be they were just fed up with the store, but alternatively they could have “hatred for that store”.
It’s perfectly fine to hate stores.  I hate the Disney store.  Hey, wait, Hollywood is often mistakenly perceived to be Jewish-owned so my dislike of the Disney store is obviously hate thought. And therefore my friends and family should report me to the police. What’s wrong with you, friends and family?
He said the community should “look out for each other”, that Isis was “un-Islamic”, as proven by its barbarity.
And that is not a police matter. He’s trying to police his version of Islam with actual police.  He doesn’t get to decide what behaviour is suspicious or what private spaces the police get to invade.

And thank fuck for that.

Tuesday, 26 May 2015

No surprise that hackers are targeting health insurers

They know lots about us. They know lots about us just by asking; who knows what other data about us they routinely buy?

The BBC reported on yet another successful attack on a US health insurer (CareFirst) in which 1.1 million customer records were stolen.  It’s peanuts compared to previous attacks on Blue Cross (probably not the UK pet rescue charity of the same name), which lost 11 million records, and Anthem, which lost 80 million.

The CareFirst database accessed included member names, birth dates, email addresses and identification numbers.

It did not include social security numbers, medical claims, employment, credit card or financial information, the company said.

But let’s be cynical.  Data about medical history is not mentioned. I feel fairly justified in assuming that stated medical conditions, medication and treatment taken, drinking and smoking habits, weight, occupation and so on might have been among the stolen data.  And “identification numbers” is terrifyingly vague. Identifying what?

"We deeply regret the concern this attack may cause," CareFirst chief executive Chet Burrell said.

They regret the concern, but not the actual harm?

"We are making sure those affected understand the extent of the attack - and what information was and was not affected."

And that still won’t help their customers understand what new threats they face.  It doesn’t tell them what – if anything – they can do to mitigate risk and minimise damage.

We have to force companies to be more open about and responsible for the data they harvest about us.

One more thing:

The breach took place in June last year but was only recently discovered.

Discovered by CareFirst?  Or discovered by someone else, forcing CareFirst to finally admit it?  Either way, their customers’ data has been out in the wild for a year without their knowing.

Vulnerabilities accrue over time

I was thinking a week or two ago about writing something about how security and privacy vulnerabilities can accumulate, sometimes gradually, sometimes in leaps and bounds, often in unexpected ways.  It’s part of the answer to the frustrating question:

Why should I care about privacy if I have nothing to hide?

which, as I’ve said before, is akin to the creationist idiocy:

If we came from monkeys, why are there still monkeys?

Although I’ll concede that the latter is the more idiotic by a considerable margin, they could be equally harmful in their different ways.

But I haven’t had time yet, so read this by Cory Doctorow instead as a good example.  It was one I too had in mind for my post (honest).

Cory talks about Logjam, which lets attackers intercept apparently secure communications by tricking browsers and servers into using weak crypto.  Many servers still support a weak crypto mode which looks – to the browser – as though strong crypto is being used.  As Cory explains, this is an artefact of Clinton-era legislation that classified strong crypto as a weapon and made exporting it illegal.  Weak crypto was a backdoor used across national boundaries so that US security agencies could intercept encrypted messages whenever they wanted.

Because of how the internet (and software development and distribution) works, there are still many servers out there supporting the weak crypto mode and we suddenly have a problem far worse than anyone thought at the time (and we thought it was pretty disastrous then):

But it's not the 1990s anymore. Crypto doesn't just protect the Web -- it secures your car's wireless interface to keep attackers out of your brakes and steering; it secures your pacemaker against wireless attacks that can kill you where you stand; it secures your phone against having the camera and mic remotely operated by "sextortionist" voyeurs who blackmail their victims into performing live sex acts on camera with the threat of disclosure of nude photos covertly snapped by their compromised networked cameras.
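An aside from me, not from Cory’s piece: one small practical defence on the client side is to refuse to offer export-grade and other feeble cipher suites in the first place, so a Logjam-style downgrade has nothing to downgrade to. (The real fix is on the server: drop the export suites and use strong, non-shared Diffie-Hellman parameters.) Here’s a minimal sketch in Kotlin using the standard Java TLS API; the host name is just a placeholder.

```kotlin
import javax.net.ssl.SSLSocket
import javax.net.ssl.SSLSocketFactory

fun main() {
    // Open a TLS connection (example.com is a placeholder host).
    val factory = SSLSocketFactory.getDefault() as SSLSocketFactory
    val socket = factory.createSocket("example.com", 443) as SSLSocket

    // Offer only suites that aren't export-grade, anonymous, null or RC4.
    val feeble = listOf("EXPORT", "NULL", "anon", "RC4")
    socket.enabledCipherSuites = socket.supportedCipherSuites
        .filter { suite -> feeble.none { suite.contains(it, ignoreCase = true) } }
        .toTypedArray()

    // If the server will only speak a weak suite, the handshake simply fails.
    socket.startHandshake()
    println("Negotiated cipher suite: ${socket.session.cipherSuite}")
    socket.close()
}
```

Failing the handshake outright is the behaviour you actually want here: better no connection than one silently settling for 512-bit keys.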

You might feel you have nothing to hide.  Maybe you really don’t (although I doubt it. It’s easier to believe that you just don’t have a very good imagination), but you certainly have something to lose.  Insecure or blabbed conversations about entirely innocent things can still be harmful.

And, of course, we didn’t evolve from monkeys.  We share an ancestor.

I’ll get around to finishing that more general post one of these days, hopefully.

Beware signing away image rights

Most of our personal images seem innocuous.  Unless we’re trying to make money with our photographs, there seems little harm in allowing others the right to use them.  Is anyone really, after all, likely to be interested in seeing us sunburned in front of a monument?  We make that assumption all the time when we use social media.  In many cases, the terms and conditions of social media services insist on the rights to the images we upload, even after we’ve deleted them. 

This couple made that assumption.  A photographer offered them a free photoshoot in exchange for their agreement that he could sell the images to stock photo sites.  They gladly agreed, thinking that in the unlikely event it was ever used, so what?  Unfortunately, the image turned up here:

an image used for the campaign against equal marriage in the recent Irish referendum.

The couple are keen to stress that the campaign obtained the image legally and that they weren’t tricked in any way by either the campaign or the photographer.  It’s just that they don’t agree with the message and are understandably uncomfortable that they unwittingly contributed in some way to the No campaign.

It’s a lesson to all of us, especially those who feel that privacy isn’t important if you’ve nothing to hide.  In this case, permission for the image to be used was pretty explicit.  In other cases – such as those involving social media sites insisting on owning image rights – not nearly so much.

For the record:

In a statement released via Human Rights campaign group Amnesty International in Ireland the pair laid out their own views on the gay marriage debate, although as non-Irish citizens they were not involved in the vote.

"This family believes that everyone has a right to marry the person they love regardless of their gender," they said.

"And this family would vote Yes [in favour of legalising same sex marriage]."

Also for the record: me too.

Wednesday, 20 May 2015

Evil Wednesday roundup of evil now with 19.4% more evil

  • An airline has launched a reward programme for finding bugs in its software.  However, “The bug bounty programme does not cover software used in the jets in United's fleet of aircraft.” It’s just its websites, which is a shame.  It’s a good thing for all that, though.  Who knows what kind of mischief someone could get up to if they compromised an airline’s network?  Give it a go and win yourself some airmiles.
  • Did that guy really hack that plane? I have my doubts.
  • Facebook tramples on European law.  Belgium isn’t happy with Facebook’s cavalier attitude to its customers’ and its non-customers’ privacy. “The body, which was working with its German, Dutch, French and Spanish counterparts, said that Facebook would not explain in detail how it used data it collected.” Yeah, it’ll do that.
  • What the BBC thinks net neutrality is. It’s kind of OK as far as it goes, but why was it a video instead of, like, a paragraph? All it manages to do is make a paragraph take two and a half minutes of your life.  I’m tempted to come out in favour of charging news organisations more to stream pointless videos at licence-payers’ expense.
  • Here’s one that actually makes sense as a video: https://www.youtube.com/watch?v=rmQtKc9MccY
  • A stupid article about stupid things, just to piss you off.  It’s a service I provide entirely for free.  And while we’re at it, let’s never make people predicting the end of email into a drinking game or half of civilisation will be wiped out on the spot. The article actually says that the asynchronous nature of email is bad.
  • Wireless comms and things that leak EM are banned here, for SCIENCE!  I haven’t decided what I think of that yet.
  • People standing up to Internet.org. These people are officially NOT EVIL. The standy-uppy people, that is. Not, you know, the evil people who support it. It turns out that surveillance isn’t such a great business model when people can’t afford to be sold things. Tough fucking shit, ISPs, Facebook etc. Creating a channel where only your stuff can be bought isn’t an act of fucking charity. And what in all of technicolour thundering fuck is the encryption ban all about? There’s profit in surveillance of poor people after all. Providing you can control the things they’re able to buy.
  • Is this a joke? Isn’t every single form of communication other than semaphore better than this? Rory Cellan-Jones is a known idiot, but wtf: “Chirp was launched in 2012. It's an application that allows you to transfer files between devices simply using an audio signal. It was instantly appealing”. If by “appealing” he means “stupid” then I guess he’s right.
  • Stop trying to ban crypto.
  • Americans against being spied on by America.
  • US officials leak information about the ISIS raid that’s more sensitive than anything Snowden ever leaked. “Read that carefully and pretend it was Snowden who leaked this information, instead of nameless Pentagon spokesmen.”

Rebooting the fridge

Our fridge stopped working last night and we had to reboot it.  Truly we live in an age of wonders.

Our last fridge had a slight design flaw, which caused some units to explode rather than keep your stuff cold.  A man fixed ours by plugging it into his laptop with an ethernet cable and upgrading the firmware.  That is pretty awesome.  And it obviously worked, since the fridge did not explode.

I’m not going anywhere with this, I just think it’s pretty cool.

Friday, 15 May 2015

Woman allegedly fired for not wanting to be tracked 24/7

A company called Intermax insisted that all its mobile staff members install tracking software on their phones.  One employee removed the app and was first told off, then fired. You can read about it here.

According to the lawsuit, Ms Arias's manager "admitted that employees would be monitored while off-duty and bragged that he knew how fast she was driving at specific moments ever since she had installed the app on her phone".

She was required to keep her phone on and with her at all times, in case a client called.  She says she didn’t mind being tracked during work hours but drew a line at being tracked while off duty.

She likened the app to a prisoner's ankle bracelet.

I don’t know why some companies are so obsessed with snooping on their staff.  Generally, the most productive and effective people I’ve ever worked with were given a very long leash (often by me) and I trusted them to do a good job unless there was evidence of a problem.  I never once regretted it, even though there were (very) occasional problems. 

Another way


I was recently talking with a friend who works at the server farm end of IT, with clients including most of the big UK telcos.  I was complaining about the fact that the principal business model for telco operators is surveillance and expressing concern at the government’s determination to force them to hand over our metadata without a warrant.  He didn’t see things my way, probably because providing this surveillance capability is how his unit makes its money.  He said something like this:

The problem is that everyone wants free texts and calls and data and they want their monthly bills to be low.  The only way telcos can do that is by making money in other ways.  Surveillance is the only way they can offer services at the price people want.

This is a superficially reasonable argument with only one slight flaw: it’s bollocks.  It presupposes that surveillance is the only way it can be done and that there aren’t enough people who value privacy enough to want to pay for it, either explicitly or in the form of a higher tariff. It’s a justification rather than a business case.  Maybe there aren’t enough people like me who would pay for privacy, but as far as I can see, the telcos aren’t asking.  Other business models are possible and might even be successful.

Here’s an example of another way.  It’s not specifically based on privacy (although the article mentions private browsing) but it shows that there are other ways of extracting money out of people:

FreedomPop will offer Sim cards that offer 200MB of data, 200 texts and 200 minutes of voice calls per month using the cellular network at no cost.

The company already offers a similar free mobile data plan in the US to more than half a million users.

The firm, which is backed by Skype founder Niklas Zennstom, says it will make money by selling extra services.

And, of course, charging big fees for exceeding the limits.

The article gives some examples of the paid-for services the firm might offer, including the ability to roll unused data, calls or texts over to the following month, anonymous browsing and the ability to add a second number in a different country so customers can call that country at local rates.

Anonymous browsing isn’t going to solve the telco privacy problem and I’ve no idea whether this – or any – firm will come up with a solution based around privacy. But it’s nice to know that there are other business models out there.

Wednesday, 13 May 2015

EFF on the illegality of the NSA phone dragnet

Is here and it’s good.

Tories plan new Snooper’s Charter, here’s what we can do

Cory Doctorow writes at Boing Boing:

Ed from the Open Rights Group writes, "The Conservatives have won an absolute majority in the General Election. The Home Secretary Theresa May has already said that she will use this majority to pass a new Snoopers' Charter."

You know the charter I mean, the one that forces our ISPs and telcos to spy on us by retaining huge amounts of personal information about all of us and handing it over to government and law enforcement without a warrant.  We’ve defeated the Snoopers’ Charter repeatedly in Parliament, but the Tories are certain to try to force it through if they possibly can; Theresa May has already announced it.  Last time, it was blocked by the Lib Dems, but the Tories’ new majority means that it’s likely to succeed when they try again.

The light at the end of the tunnel is that the Conservatives' majority is tiny. Their leadership will have to work incredibly hard to secure a majority for new laws. Every MP's vote will count and this presents a huge opportunity for campaigns like ORG's to influence what happens.

I’ve had little success in interesting my MP, Phil Wilson (Labour), in this issue. He toes his party line all the way, which is perhaps to be expected from the holder of one of the safest Labour seats in the country.  But even so, I suspect that part of the issue is that he doesn’t fully understand the implications of the Snoopers’ Charter.  Perhaps the same can be said for the other candidates for Wilson’s job at the last election; I contacted them a few weeks ago and got a very disappointing response.

So I think we need to educate our representatives.  Groups like ORG are going to be invaluable at doing this in the next few months.  Do consider joining.  Do consider looking here to see if there’s a local ORG group you can join, or think about starting your own.  There’s a North-East Local ORG Group, which I’ve recently become involved with.  There’s a lot of strong feeling in that group about the Snoopers’ Charter and a lot of expertise and enthusiasm that we intend to bring to bear on this problem.

The Tories plan several other terrifying things according to their manifesto:

  • Introduce "new communications data legislation" - also known as the Snoopers' Charter
  • Scrap the Human Rights Act
  • Require internet service providers to block sites
  • Enable employers to check whether an individual is an extremist
  • "Requir[e] age verification for access to all sites containing pornographic material" - which is very difficult to implement

And David Cameron has said that every message we send should be readable by the state - even when we've encrypted it.

Yeah, let’s not do any of those things.

If you’re in the North East and would like to join our Local ORG Group, get in touch.  If you’re a member of a different Local ORG Group and think it would be useful to coordinate some activity, also get in touch.

Snowden vindicated

From The Atlantic:

Edward Snowden’s most famous leak has just been vindicated. Since June 2013, when he revealed that the telephone calls of Americans are being logged en masse, his critics have charged that he took it upon himself to expose a lawful secret. They insisted that Congress authorized the phone dragnet when it passed the U.S.A. Patriot Act, citing Section 215, a part of the law that pertains to business records.

And now:

A panel of judges on the Second Circuit Court of Appeals ruled last week that the program Snowden exposed was never legal. The Patriot Act does not authorize it, contrary to the claims of George W. Bush, Barack Obama, Michael Hayden, Keith Alexander, and James Clapper. “Statutes to which the government points have never been interpreted to authorize anything approaching the breadth of the sweeping surveillance at issue here,” Judge Gerard E. Lynch declared. “The sheer volume of information sought is staggering.”

In other words, Snowden’s blowing of the whistle didn’t expose a legitimate state secret, so he’s vindicated.

Snowden undeniably violated his promise to keep the NSA’s secrets.

But doing so was the only way to fulfill his higher obligation to protect and defend the Constitution, which was being violated by an executive branch exceeding its rightful authority and usurping the lawmaking function that belongs to the legislature.

At least with respect to his exposure of the phone dragnet.

Smart Watches

I’m writing some phone and watch apps at the moment.  I have mixed feelings about smartwatches because many of them try to be a copy of your phone that happens to be on your wrist.  Much of the time, it’s better to just get your phone out and have done with it.  I think we need our smartwatches to do things our phones don’t.  The smartwatch I have is the Moto 360, which runs Android Wear.  Wear is unlike other smartwatch platforms, in a good way, but there are some serious privacy concerns, as I’ll explain.

Most smartwatches use the app metaphor.  You open up an app on your watch – as you would on your phone – and interact with it.  The appeal is obvious; it’s familiar.  We’re used to it from our phones and PCs.  But I don’t think it works well on a watch.  More often than not, I prefer to get my phone out when I need to interact in detail.

Wear is different.  It is centred around the idea of notifications and responses to notifications, along with other things like voice search, which I’ll get to in a moment.  Notifications take the form of ‘cards’.  A ‘card stream’ is a vertical list of notifications, in the order they are received.  You swipe up and down to scroll through the cards and left to remove a card from the stream.  Some cards have additional information and/or controls; to access these, swipe right.  For example, a card might show the current weather at your location.  Swiping right might show a longer-term forecast and swiping right again might reveal controls for opening the forecast in still more detail on your phone.  Another example is a person’s contact details; swiping right might provide controls for emailing, texting or calling that contact.
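Since I’m writing Wear apps at the moment, here’s roughly what putting a card into that stream looks like from the phone side. This is only a sketch, assuming the v4 support library as it was around Wear’s launch; ReplyActivity, the drawable resources and the "extra_voice_reply" key are hypothetical stand-ins rather than anything from a real app.

```kotlin
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.support.v4.app.NotificationCompat
import android.support.v4.app.NotificationManagerCompat
import android.support.v4.app.RemoteInput

// Post a notification that appears as a card in the Wear stream, with a
// voice-reply action that the watch handles by itself.
fun postReminderCard(context: Context) {
    // Where a spoken reply ends up (ReplyActivity is a hypothetical target).
    val replyIntent = PendingIntent.getActivity(
        context, 0, Intent(context, ReplyActivity::class.java), 0)

    // Lets the watch capture a voice transcription and attach it to the intent.
    val voiceInput = RemoteInput.Builder("extra_voice_reply")
        .setLabel("Reply")
        .build()

    val replyAction = NotificationCompat.Action.Builder(
        R.drawable.ic_reply, "Reply", replyIntent)
        .addRemoteInput(voiceInput)
        .build()

    val card = NotificationCompat.Builder(context)
        .setSmallIcon(R.drawable.ic_reminder)
        .setContentTitle("Feed the cat")
        .setContentText("Reminder for 5pm today")
        // Wearable-only extras: this action appears on the watch card.
        .extend(NotificationCompat.WearableExtender().addAction(replyAction))
        .build()

    NotificationManagerCompat.from(context).notify(1, card)
}
```

The nice part is that the watch side costs almost nothing: the same notification that lands in the phone’s shade becomes a swipeable card on the wrist, and the voice transcription is handled by Wear itself.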

And that’s pretty much the entirety of the interface.  There is no real concept of apps (or rather, that concept is seriously downplayed).  On the face of it, this sounds very limited, but this is where voice search comes in.  Tapping the watch face or saying “OK Google” will bring up a search interface.  The watch will recognise what you say.  If it’s a command it understands, such as “show me my reminders”, it will carry it out, showing the results as a set of cards.  If you instead say “remind me to feed the cat at 5pm today”, it will schedule a reminder for that time, with the transcribed text. If you say something it doesn’t understand, it will search for that information.  Wear is smart enough in many cases to understand what it’s searching for and will format the resulting cards appropriately.  For example, I recently needed to know whether a particular shop in a particular town was open that day.  I didn’t know the name of the shop, but I knew that it was a cookshop. So I asked my watch “is the cookshop in X open today?”. It came back with a card showing the name of the cookshop in that place (and additional cards for other stuff when I scrolled down).  Swiping right gave me a card, correctly formatted for the display, showing me the opening hours.

That’s exactly what I wanted.  I didn’t want to do a web search on my watch and squint through the answers until I found the one I wanted.  I wanted to get at the exact information without effort and in a few seconds, and that’s what Wear let me do.

In fact, Google’s guidelines for developing for Wear emphasise this quality: if your users can’t do what they want within five seconds, you’re probably designing it wrong.  And there’s lots of stuff going on behind the scenes to make this easy.  Wear depends on Google Now, which you might not be familiar with.  Now is a search platform that integrates Google’s web search and everything else it knows about you to generate cards that are contextually relevant.  This is considerably more advanced and ambitious than other digital assistants such as Siri and Microsoft’s Cortana.  It’s the sort of context-dependent stuff I’ve been saying for a while that we need.

For example, my phone has sensors that tell it whether I’m in a moving vehicle.  I’ve told Now where my home is and where I work, so if my phone detects I’m in a vehicle at the time I usually go to work, apparently travelling in that direction, it will show me any traffic incidents along the route.  If I’m driving somewhere else and I have a calendar entry saying I have a meeting, it will show me traffic incidents on the way to that meeting’s location.  Even if I didn’t put the meeting in my calendar, it will scan my gmail to see if I mentioned a meeting there and work out that that’s where I’m probably going.  I can say “give me directions to my next meeting” and it will fire up turn-by-turn navigation, using my phone’s speaker and illustrating the next turn on my wrist.

This is the kind of thing I want from my watch.  I want it to know where I’m going and what I’m doing and give me information relevant to that situation without too many false positives or false negatives.  Now – and therefore Wear – is pretty good at this.  But, needless to say, it comes at a great expense of privacy.  Here’s an example that might explain why.

I don’t use my gmail account for everyday mail.  But when I found out about Now’s ability to parse email, I forwarded the confirmation email from Expedia about my upcoming holiday to my gmail account and then – seconds later – asked my watch “when am I going on holiday?”  Wear responded with a correctly-formatted version of the itinerary.  It obviously scanned the email, recognised it as (among other things) an itinerary and formatted it as such.

But here’s the problem.  I never explicitly gave Google permission to do that.  The Now infrastructure has enormous power to search through all the things it knows about me in seconds, based on situation.  This is highly specific (and therefore highly valuable) information.  It could be used, for instance, to predict what I’d be likely to do in some circumstances.

I haven’t given Google explicit permission to do this and there’s no easy way to control it.  I could refrain from using my gmail account, but now that I’ve sent it my holiday itinerary, there’s no way I can remove that knowledge from Now.  Now has access to all my other Google-related data, too.  It knows my search history, my calendar, my Hangouts history, my browsing history, where I take my phone and what I do with it when I get there.

All this information makes Wear very powerful and – for the things I want my watch to do – very useful.  But I had no idea that Google had this capability and I have no real control over the data included in Now’s searches.  I think this is inexcusable.  I might be happy to spend some of my privacy to make my life a little more convenient, but I want to know what data I’m bleeding and what it buys me.  I want to be able to take away my permission for access to certain data and know how that will impact my Wear experience.

I think Wear has the right idea, but it locks us into more surveillance by Google.  I wish Google would give us the choice.

Friday, 8 May 2015

On the other hand

Despite what all those ex- and current FBI and CIA and who-knows-what directors say in the press, the greatest terrorist threat is from lone wolfs (lone wolves? I’m going with wolfs).  All things being equal, it’s harder to detect suspect activity when nobody is talking about it.  The bigger the group, the greater the chance of a weak link. A little bomb can do as much damage as a big one if it’s planted in the right place.  Attacks on aircraft rightly woke us up to a new kind of threat.  Bombs in pre-security queues at airports seem like a logical next step and I’m surprised this hasn’t happened yet.  It wouldn’t take the coordination of a 9/11. It would take individuals with a fairly basic understanding of chemistry.

But on the other hand, lone wolfs are more likely to be detected by mass surveillance.  Someone on a housing estate buying large amounts of fertilizer, for example. People searching “how to make bombs”.

That sounds like a good case for mass surveillance but it isn’t. We have to think about the trade-off between false positives and false negatives and the relative impact of each, and we’re not doing that.  I suspect that mass surveillance doesn’t help us crack large terrorist organisations to a significant degree. I suspect it might help us chase down individuals who might or might not be wannabe terrorists.  But I’d sure as shit turn up as a false positive just because of the sort of things I write about.
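To put a rough number on that false-positive worry, here’s a back-of-the-envelope Bayes calculation. The figures are entirely made up for illustration – nobody publishes the real ones – but the shape of the result holds for any rare threat.

```kotlin
fun main() {
    // Entirely illustrative numbers: 1 person in 100,000 is a genuine plotter,
    // the system flags 99% of real plotters and wrongly flags 0.1% of everyone else.
    val baseRate = 1.0 / 100_000      // P(plotter)
    val hitRate = 0.99                // P(flagged | plotter)
    val falseAlarmRate = 0.001        // P(flagged | not plotter)

    // Bayes' theorem: P(plotter | flagged)
    val pFlagged = hitRate * baseRate + falseAlarmRate * (1 - baseRate)
    val pPlotterGivenFlagged = hitRate * baseRate / pFlagged

    println("P(plotter | flagged) = %.4f".format(pPlotterGivenFlagged))
    // Prints roughly 0.0098: about 99 out of every 100 people flagged are innocent.
}
```

Even with a detector far better than anything plausible, roughly 99 in 100 of the people it flags are innocent, and every one of them has had their private life turned over for nothing.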

That’s the bewildering false dichotomy we tend to hear from former directors of intelligence agencies, isn’t it?  We give up [unspecified thing] for [unspecified reward] otherwise we’re [probably fucked for unspecified reason].

There are other options.

A gift to terrorists

The CIA’s former Deputy Director, Michael Morell, has blamed Ed Snowden for the “rise of ISIS”.

We knew this and he’s not the only senior member of the intelligence community to say so.  But he’s being talked about a lot right now because he’s written a book.

Morell, who makes the claims in a new book, says the most damaging revelation was the existence of a spying program that collects foreigners' e-mails as they move through equipment in America.

He said the jihadists subsequently switched their messaging systems to more 'secure' platforms, encrypted them or 'avoided electronic communications altogether'.

I’m caught between three reactions:

  1. Doubting that Morell knows whether terrorists really have changed their approaches or whether the causal link he suggests really exists.
  2. Wondering whether their supposed moving to more conventional means of information exchange is really such a bad thing for the intelligence services.
  3. Still feeling that mass surveillance seems too large a price to pay for unspecified benefits to anti-terrorism and that it’s the citizens of a nation who should make that decision, not the intelligence services acting in secret.

I don’t know enough to evaluate 1 and 2, but I’m fairly certain about 3.  I might be willing to give up some specified privacy in exchange for well-reasoned and well-described counter-measures against a tangible threat.  But I’m not prepared to give up – without being asked – any and all privacy in exchange for dubious, unspecified counter-measures against threats I’m not even sure exist.

“ISIS was one of those terrorist groups that learned from Snowden and it is clear that [Snowden’s] actions played a role in the rise of ISIS.”

I haven’t read the book yet, but I very much doubt that the situation is quite so clear as that.  There seem to be many reasons why ISIS has ‘risen’, and they’re mostly reasons of politics and opportunity.  ISIS isn’t a typical terrorist group operating as a loosely connected network of cells; it’s a coherent and very visible organisation that isn’t even trying to hide.  Better access to its communications would doubtless help the CIA in combating ISIS, but that isn’t the CIA’s job; it’s not trying to do that.  Better access might help the CIA counter terrorism (which is its job), but that has nothing to do with the rise of ISIS.

“At a time when the range of threats against the West has never been greater, with Yemen, Iraq, Syria, Somalia and Al Qaeda in Pakistan, it is astonishing the focus has been more on the shortcomings of our intelligence agencies and not the fact Snowden has helped terror suspects drop off the radar.”

The ‘shortcomings’ being that they are secretly collecting every piece of information about us all that they possibly can, often illegally.  And here’s the thing: encrypting communications isn’t difficult. Assuming that your communications are vulnerable to interception is where secretive organisations begin; it’s tradecraft.  And there are plenty of manuals available about how to do it – manuals not for terrorism but for a reasonable expectation of privacy, and every bit as valid post-Snowden.

There’s some worrying rhetoric, too:

“[Snowden] has caused severe damage to our ability to fight extremism.”

Telling, isn’t it? We’re not fighting ‘extremism’.  Many people hold ‘extreme’ views.  But many extremists aren’t dangerous and many dangerous people aren’t extremists.  We’re fighting people who threaten us and the holding of extreme views is not in itself a threat.  It’s the kind of throwaway comment that worries me.

MI5 director general Andrew Parker has called the traitor's actions a 'gift to terrorists'.

At least the Daily Mail isn’t afraid to say where it stands.  I wonder what the DM thinks Snowden is a traitor to, though.

Morell pointedly criticizes the National Security Agency, saying it was conducting highly sensitive surveillance of allied leaders without fully considering the appropriateness of its operations.

The NSA, he said, had 'largely been collecting information because it could, not necessarily in all cases because it should.'

And yet he bemoans the fact that this behaviour was exposed. But that kind of contradiction is bread and butter to Morell.  He simultaneously praises and decries torture, for example.

While Morell says he is personally troubled by the harshest technique the CIA used on detainees, water boarding, he makes a case that agency leaders had no choice but to use what many consider torture in the years after the 9/11 attacks.

I doubt he was as uncomfortable as the victims.

Wednesday, 6 May 2015

Local students schooled in cyber-security

I’m all for teaching students about security, although my methods differ.

STUDENTS have come face to face with the potential perils of social media in a series of hard-hitting workshops designed to keep them safe.

Cyber safety experts spent the day at Darlington School of Mathematics and Science (DSMS) highlighting the potentially negative physical, social and psychological consequences of using the internet.

It sounds good so far, but well-meaning education on safety can have profoundly harmful effects.  Consider, for example, sex education programmes that teach only abstinence: it is well established that such programmes are ineffective where it counts.  I’m by no means suggesting that this approach by Durham Police and Harbour Support Services is comparable to abstinence-only sex education.  In fact, Harbour Support Services have some good things to say on the issue and certainly seem to have the right idea.

But the article says some slightly worrying things too:

Durham Police neighbourhood policing team officer Kathryn Davies and beat officer David Gibson delivered the third workshop addressing inappropriate use of social media including sexting and how easy it was to fall foul of the law.

It’s important to know your rights, but I’m not convinced by terms like “inappropriate”.  Inappropriate to whom?  Aren’t we ultimately talking about safety here, rather than what someone else judges inappropriate?  And I’m not sure that the best people to teach kids about the legal implications of their online activities are the police.  But I’m judging without being there; they might well have done an excellent job and it’s the journalism that’s a bit suspect.

Mr Duckling explained that domestic violence could be physical, emotional, financial and sexual. It affected men, women and children and Harbour was there to support victims and work with perpetrators.

That sounds more like it.  It’s an angle that often goes unnoticed.  Stranger-danger headlines overwhelm those about domestic violence and bullying, even though the latter are in fact the greater concern.  The issue of how abuse sufferers can safely find and use the resources they need is of paramount importance.  So too is the freedom of young people to enquire and explore, to push the boundaries and define the parameters of their own safety.  The goal (at least, my goal) is to show how young people can push those boundaries responsibly.  For example, if a young person needs to hide activity from her parents, she should ensure someone she trusts knows what’s going on in case something bad happens.  If she feels she can’t trust anyone with that information, she probably shouldn’t be doing whatever it is in the first place.

PCSO Gibson stressed the importance of young people keeping their online profiles private.

Don’t get me wrong.  The stranger-danger stuff is important and should be reinforced.  But I think it’s a more subtle issue than many approaches allow.  Common-sense measures like this should definitely be observed, but it’s easy to mistake security counter-measures for security: taking care in some respects doesn’t necessarily make one safe.

Students were also shown poignant videos covering a variety of cyber safety issues and hate crime scenarios.

It would be interesting to see those videos.  We’re not very good at understanding the consequences of our future actions, or at connecting things that happen now to things we did in the past.

This Harbour outfit looks interesting, at first glance.  I’ll check them out.  Perhaps it will be a good way to inject some knowledge and expertise from ORG into places it’s needed.

Monday, 4 May 2015

Amazon pushes us toward more surveillance

Amazon has doubled the minimum spend required for its supersaver delivery service.  Customers now need to spend a minimum of £20 to qualify.  The article speculates that this is due to Amazon losing £36m last quarter but I tend to agree more closely with Mader from the Kantar Retail consultancy, who the BBC says is an expert:

There is always a pressure on Amazon from its investors to increase profitability, but I think the bigger factor is trying to shepherd people into Prime membership as well as improving the margins on each basket.

Prime is clearly the long-term strategy for Amazon and – which is always suspicious – you get quite a lot for your £79 a year.  The Prime free delivery service alone is attractive if you order a lot of stuff from Amazon, and of course we’re more likely to order preferentially from Amazon if delivery is free.  But Prime comes with extras, most notably its on-demand video service.  It has a lot of good content, much of it free to Prime users.  “Ow, right in the privacy” as a friend just said.