Wednesday, 17 June 2015

South Korea mandates spyware on young people’s phones

The government of South Korea has ruled that smartphones belonging to people under 19 must contain an app to monitor their internet activity, ostensibly on the grounds that their parents want to be able to see what they’re up to and to block access to ‘undesirable’ sites.  Phone manufacturers will somehow make sure that phones won’t work unless the spyware is there.  Which sounds rather easy to get around to me, but that’s beside the point.

The point is that there are exactly two reasons for the government to do something like this.  Either they want to make parents responsible for what their children do – a worrying thing indeed in an oppressive regime – or they have back doors into the spyware.

The government has developed its own monitoring app called Smart Sheriff, but there are more than a dozen alternatives on the market.

It seems likely that the only relatively reliable way to block phones that don’t contain spyware is to have an approved list of spyware.  Approved, that is, by the South Korean government.  It’s hardly unrealistic to assume they have backdoors into these approved ‘alternatives’.

The younger generation is always the one of most concern to oppressive governments.  They are the most likely to see injustice and intolerance for what it is.  They are the most likely to organise and protest.  They’re the most likely to feel like they have the least to lose.

So forgive me if I’m not entirely convinced by the South Korean government’s stated motives.

Privacy vs convenience: hidden privacy costs

The price of convenience is often loss of privacy, we know that.  When we register for an e-commerce site to avoid having to type in our details every time we buy something, we’re usually giving up valuable PII (personally identifiable information).  When we sign up to a number of social media services so we can maintain a common, persistent pseudonym across those services, we’re giving up some measure of anonymity for the convenience of being recognised as the same person in different but related spaces.

But it’s a bit more complicated than that.

First, we’re all notoriously bad at making those kinds of privacy bargains.  That’s largely (and obviously) because the companies that want our data don’t want us to know how valuable it is to them or how much we might regret the consequences.  Usually, they try to play those things down, to make it seem that we’re getting a good deal.  Don’t get me wrong; in some cases some deals are good ones for some people.  What’s wrong is that customers aren’t usually given enough information or sufficient choices to decide whether or not the convenience is worth the price.

But it’s more complicated even than that, because many companies charge us in privacy for convenience we don’t receive, haven’t signed up for, or don’t even necessarily know about.  Here’s an example: Facebook tracking non-users.  We’re losing privacy but not gaining anything in convenience.  At least, not from Facebook.  It could be argued that we’re gaining convenience in a general sense, because cookies are useful.  In this view, Facebook is just leveraging general convenience we get from the web for its own ends.  In fact, that’s Facebook’s excuse:

It also defended its actions when the Belgian commission released its report last month, saying that most websites used cookies, which it said has been an "industry standard for more than 15 years".

That’s disingenuous.  Companies like Google, Facebook and Amazon claim to take privacy – and their customers’ concerns – seriously, but don’t hesitate to exploit them without permission or notice.  These hidden privacy costs are the ones we should be complaining about most.

Monday, 15 June 2015

Internet Magna Carta top ten

Yesterday I wrote about the British Library’s Magna Carta for the Digital Age.  The top ten clauses of “the web we want will…” as selected by young people and voted for by the British public are so far:

  1. The Web we want will not let companies pay to control it, and not let governments restrict our right to information
  2. The Web we want will allow freedom of speech
  3. The Web we want will be free from government censors in all countries
  4. The Web we want will not allow any kind of government censorship
  5. The Web we want will be available for all those who wish to use it
  6. The Web we want will be free from censorship and mass surveillance
  7. The Web we want will allow equal access to knowledge, information and current news worldwide
  8. The Web we want will have freedom of speech
  9. The Web we want will not be censored by the government
  10. The Web we want will not sell our personal information and preferences for money, and will make it clearer if the company/Website intends to do so

I can’t fault the sentiment.  There’s a lot of overlap, ambiguity and naiveté, but that’s not a bad thing.  The aim of the project is to find out what people want and it’s doing that. 

I haven’t voted myself because I know the choices would enrage me.

Saturday, 13 June 2015

US Navy openly solicits zero day bugs to weaponise them


You’d think armed forces and other agencies of government ought to be in the business of protecting their citizens.  It seems like that ought to be the point.  They keep telling us that’s the point when they ask for more money and power.

That’s one of the reasons it’s so frustrating when security services try to sabotage encryption or actively distribute malware onto the machines of citizens, guilty or otherwise; it’s the exact opposite of protection.

The US Navy will pay for your zero day bug reports so it can exploit them as a potential weapon.  A zero day bug is one the software’s vendor doesn’t yet know about, so no fix exists.  They’re the most valuable kind of bug to an attacker.  An attacker such as the US Navy.

Not that the navy (necessarily) wants to attack American citizens.  It wants to attack people in other countries that use the same software.  But it wants to do this by deliberately keeping American citizens vulnerable to the same attacks by other countries, rather than making the internet more secure for everyone.

As Cory Doctorow puts it, here:

The Navy, therefore, is seeking to secure America by ensuring that the "widely used and relied upon commercial software" that Americans depend on remains unpatched and vulnerable, so that it can attack its enemies, who use the same software, and they're conveniently ignoring the fact that their enemies can use those same bugs the Navy wants to hoard to attack American individuals, governments and companies.

The EFF are on the case.  The Navy took down the solicitation after Dave Maass tweeted about it, but they (the EFF) saved it here.  They’re also suing the US government to disclose the process by which it decides whether to tell software vendors about the vulnerabilities it knows of.  It seems unlikely that the Navy would spend big money soliciting bugs if they were routinely reported to the vendors, so the decision process seems like something we all really need to know about.

The EFF’s article about it is here.

Internet Magna Carta

Somehow I missed this.

Thousands of young people have taken part in a UK debate about what should be included in a "Magna Carta" for the digital age.

In case you don’t know, the original Magna Carta (“Great Charter”) of 1215 was a document signed by King John that effectively curtailed his powers and granted protection to various people and organisations (mostly nobles and, needless to say, the church).  It was largely ignored and was repealed and reinstated many times in different forms, which sounds like an ideal analogy for any such agreement for the digital age.

But it’s an interesting and fun idea.

The public can now vote online for the clauses the thousands of young people suggested, in a project organised by the British Library.

I’m all for that as an exercise to find out what people want and how they think about things, even though it won’t have any effect on how laws develop.  Not everything has to be practical.

Their most popular priority was safety on the net, followed by protecting freedom of speech and privacy.

You see? That’s encouraging news.

According to analysts ComRes, 29% of people aged 10-18 opted specifically for safety while 17% chose freedom of speech as a clause they wanted to support.

I don’t know whether those votes overlap.  It would be strange if they could only choose one.  Freedom of speech and safety are hard to untangle anyway; freedom of speech is about some kinds of safety and requires safety. Safety requires, among other things, freedom of speech.

"Nearly half of the clauses talked about students wanting to feel safe and protected online," project manager Sarah Shaw told the BBC.

"We thought there would be more talk about freedom online, and not so much talk on more of a conservative manner."

Shaw seems almost disappointed that the young people said things she didn’t expect.  It doesn’t surprise me very much that they valued safety above freedom, especially since those categories overlap so much.  It would be interesting to see the demographics of the people taking part.  Are many of them likely to have come across serious curtailment of their freedom?  I’m not sure freedom was something I thought about until much later in life, but then I pretty much ran wild as a kid.  I knew about safety, though.  We had stranger-danger drilled into us at primary school, as well as (at least for us rural kids) the notorious and unintentionally hilarious public safety film which featured an entire group of kids meeting horrific ends on a farm.  Besides:

Seven of the existing top 10 clauses mention freedom.

That seems like a lot to me.

"Several clauses talked about wanting cyber-police," she said.

We tell young people that the police are their friends and are there to help them.  An extraordinary idea, to be sure, but we should hardly be surprised when they suggest an internet police force.  I’m sure they’re thinking of a group of people with special powers who are there to protect us from harm, which is what we told them the real police were in the first place, for some reason.

Anyway, an interesting, if limited, experiment. It’ll be interesting to find out what varieties of young people they asked.

A roundup

This week, Evil Wednesday is Evil Saturday.  Or is that the other way round?  Either way, here’s a roundup of some stuff that happened.

France seeks to extend Google right to be forgotten.  At the moment, the right to be forgotten, which is an EU directive, only covers searches made via Google’s European sites.  France wants to extend this to Google’s global operations, and has given Google 15 days to comply before considering sanctions.

Net neutrality win in the US… for now.

A sensible way forward on mass surveillance? I’ll write more about this later.

Ebay and Paypal have decided they can robocall you whenever they like.

In a rare moment of self-awareness, Reddit has shut down some of its forums for harassing individuals and recruiting people for harassing individuals. The BBC predictably pussy-foots around the problem by scare-quoting the word “harassing” in their headline.

Nominet built a tool for exposing cybercrime on .uk domains. I bet that’s not all it can expose.

Theresa May lays the groundwork for the Snooper’s Charter.

Stronger laws are needed on revenge porn. Yep, and we need to stop the slut-shaming too.

More fake mobile towers found in London, Met takes idiotic stance.

Facebook gets in on troubling beacon scheme.  Hmm.

Cory Doctorow: Anti-surveillance steamroller still rolling through congress. “The USA Freedom Act set the first legal limits on spying in a generation, and were immediately followed by 3 more surveillance-blocking amendments from the House, and now, a week later, there's 2 more bipartisan curbs on surveillance.”

14 fun facts about 1984.  Fun fact #15: It’s pretty clear that most people who compare something to 1984 haven’t read it.  Come on, people.  It’s not like Christians and the Bible: you can actually read it, and you won’t be disappointed when you do.

How to tell if a mirror is one-way glass


That’s all for now.  Normal service will resume sooner or later.  It’s probably later, isn’t it?

Friday, 12 June 2015

Privacy resource update

I've been working on some practical privacy resources, ranging from material for tinfoil-hat-wearing paranoids like me (and probably you, since you're reading this) to everyday things we can all do to understand our risks and make good decisions about how seriously we want to take them.  A few of these are nearly ready to be torn apart on the web.  Now that I have some power to my laptop, I'll try and get at least one out this week.

It's going to depend on how many more interesting things I find to do in Malta, though.

I'm out of town for the next few days. WAY out of town

I'm in Malta and you, probably, are not. Which is a shame because it's nice here.  Even nicer now my luggage (and laptop PSU) is here with me after travelling around the globe without me for a few days.

We spent today careering around the old city of Valletta in an electric cart.  It works like this:

The cart has a GPS tablet which guides you around the streets.  This is the (very) old part of Valletta, so it isn't too busy.  When you get to certain points of interest, a disembodied voice tells you about them in a series of odd non-sequiturs.  At any point you like, you can stop the car and jump out to have a closer look at things, have a drink or whatever.  It's kind of like being in Jurassic Park.  The company is tracking the cart so if you go off-piste, they can contact you in the car to get you back on track.

The only problem was that for some reason the GPS kept cutting out, which sent us either round in circles or - in one especially stressful case - up a blind alley with about 2 inches of clearance at either side for a 28-point turn.  In reverse, the throttle is binary: either 0kph or 25kph, so that manoeuvre was fairly fraught.  The GPS instructions were also the standard ones that came with whatever box they used, which were not very good.  They could have added some advice about the particular route in the (many) ambiguous parts.  Teething troubles; the company hasn't been going long.

Still, the tour itself was very impressive, especially the part through the narrow medieval streets of the old city (pictured).

I really like what the company is trying to do and wish it every success.  The staff were all very friendly and enthusiastic (as was their dog) and despite the few hiccups, which were admittedly quite stressful, it was a good trip.

Now I have sweet, glorious power to my laptop, I might get around to posting something about privacy tomorrow.

Tuesday, 2 June 2015

I assume this is what I sound like when I talk about privacy

From a comment by PrivateSi on the Daily Mail article here:

I've been communicating to 'The Police' of all types for years.... If you support this much power to PC Plod you're a THREAT TO HUMAN INDEPENDENCE - who's removal is a threat to HUMANITY ITSELF......... The things that make us HUMAN, Brainbashed Pavlovian Animal Zombie Sheeple........ Do you really think, if SAY for instance PrivateSi The Guardian's Real Direct Democracy became popular that an UBER STATE would not use everything in its power to DEMONISE it....... Do you really, truly trust the Uber State NOT to FALSIFY EVIDENCE with legal ease.............. ?? If so you're a wrong un' in my book.

Trouble is, he’s right.  Even more troublingly, I’m well aware that’s how I sound to most people.

UK police make a personal metadata request every 2 minutes

Big Brother Watch reports here the results of some Freedom of Information requests that show how often UK police request access to someone’s personal metadata.  The answer is lots.

Between 2012 and 2014, 733,237 requests were made.  That’s about one every two minutes.  The report says that 93% of these requests were granted and most forces made requests more frequently over time.
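That headline rate is easy to sanity-check with back-of-envelope arithmetic (assuming the 733,237 requests are spread evenly over the three calendar years 2012–2014):

```python
# Rough sanity check of the "one request every two minutes" claim.
# Assumption: the 733,237 requests span the three calendar years 2012-2014.
requests = 733_237
minutes = 3 * 365 * 24 * 60      # ~1,576,800 minutes in three years
print(f"one request every {minutes / requests:.2f} minutes")
# → one request every 2.15 minutes
```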

It’s worth noting that this is all before the current government’s proposed Snooper’s Charter, which will make everything worse.

"You might as well skywrite it as encrypt it with pre-broken, sabotaged encryption"

Cory Doctorow again, this time from a 1st May article in The Guardian about David Cameron’s intention to break encryption, which I didn’t get around to commenting on at the time:
It’s impossible to overstate how bonkers the idea of sabotaging cryptography is to people who understand information security. If you want to secure your sensitive data either at rest – on your hard drive, in the cloud, on that phone you left on the train last week and never saw again – or on the wire, when you’re sending it to your doctor or your bank or to your work colleagues, you have to use good cryptography. Use deliberately compromised cryptography, that has a back door that only the “good guys” are supposed to have the keys to, and you have effectively no security. You might as well skywrite it as encrypt it with pre-broken, sabotaged encryption.
He goes on to explain why, but you know the drill.  There are technical arguments and practical ones.  Doctorow compares encryption back doors to the TSA’s requirement that all luggage flying through or within the US use Travelsentry locks, which can all be opened with an easy-to-get-hold-of master key.
What happened after Travelsentry went into effect? Stuff started going missing from bags. Lots and lots of stuff. A CNN investigation into thefts from bags checked in US airports found thousands of incidents of theft committed by TSA workers and baggage handlers.
They’re even managing to smuggle the swag out of airports where all staff are searched on leaving.
Making it possible for the state to open your locks in secret means that anyone who works for the state, or anyone who can bribe or coerce anyone who works for the state, can have the run of your life. Cryptographic locks don’t just protect our mundane communications: cryptography is the reason why thieves can’t impersonate your fob to your car’s keyless ignition system; it’s the reason you can bank online; and it’s the basis for all trust and security in the 21st century.
We can’t pretend to be surprised if the same thing happens to our personal data, should Cameron get his way.
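The Travelsentry analogy maps neatly onto key escrow.  Here’s a minimal toy model (the names and keys are made up for illustration, and this is emphatically not real cryptography) of why one shared “good guys only” key is a single point of failure:

```python
# Toy model of key escrow: every lock accepts its owner's key OR one
# shared master key (as with Travelsentry luggage locks). Illustrative only.
locks = {
    "alice": {"owner_key": "a-key"},
    "bob":   {"owner_key": "b-key"},
}
MASTER_KEY = "tsa-master"  # the back door meant for the "good guys" alone

def opens(lock_name: str, key: str) -> bool:
    """A lock opens for its own key or for the shared master key."""
    return key == locks[lock_name]["owner_key"] or key == MASTER_KEY

# A thief who steals one owner key opens exactly one lock...
assert opens("alice", "a-key") and not opens("bob", "a-key")
# ...but anyone who obtains (or leaks, or is bribed for) the master key
# opens every lock at once.
assert all(opens(name, MASTER_KEY) for name in locks)
```

The point of the sketch is that the master key's value grows with the number of locks it opens, so it is exactly the thing an attacker will target.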

TSA airport checkpoints miss 95% of weapons in test

Cory Doctorow is not surprised:

In part, this is because the TSA is staffed by clods and jobsworths who are capable of maintaining the solemn pretense that breast milk and nail files are existential threats to the aviation system.

They always look bored, hostile or both, to me.

Much more important is the fact that it is neurologically impossible to remain vigilant for things that never happen. If you ask yourself to maintain vigilance for incidences of common water bottles (which your adversaries persistently try to smuggle past you, both deliberately and accidentally) and incredibly rare, nearly unheard-of, statistical outliers of weapons and bombs and such, your brain will get very good at recognizing water bottles, largely by de-tasking and commandeering the neural stuff that's meant to be looking for AKs and plastique.

Monday, 1 June 2015

I'm not convinced this is a good idea

Iran launches state-run internet dating site.

The idea of any government running a dating site makes me shudder.

UN says encryption is essential for free speech

The BBC says so here.  For anyone reading this, it probably seems like a blinding flash of the obvious, but it’s not obvious to everyone.  Or it’s obvious to everyone but lots of people think free speech is less important than catching criminals, which is to say that they think free speech is not important at all.
The UN is releasing a report in June which says that the sort of thing governments are all the time trying to do to encryption – force it to have backdoors or be otherwise weak – will prevent people saying what they want – or need – to say to others without censure. There’s an advance copy here.
"Encryption and anonymity, separately or together, create a zone of privacy to protect opinion and belief," says the report written by David Kaye, a special rapporteur in the UN's office of the high commissioner for human rights.
Kaye gets it:
The tools to bestow such protection are essential, it says, given the "unprecedented capacity" governments, companies, thieves and pranksters now have to interfere with people's ability to express themselves.
Lacking such tools, it adds, many people will be unable to fully explore "basic aspects of their identity" such as their gender, religion, ethnicity, origins or sexuality.
Unfortunately, and pretty much by definition, the people in charge are rarely the people who need to act in the face of oppression, although they might be among those people with the most to lose if their communications are intercepted.  And history has shown that they are often among those with the most to hide.  Perhaps this is why governments are always so keen to ban encryption; their members know what they are getting away with and they assume everyone else wants to use encryption for the same sorts of reason.
The software acts as a "shield" for opinions against external scrutiny - a fact that is "particularly important in hostile political, social, religious and legal environments", says the report.
Very much that.  But it’s also a tool for preventing – or at least slowing – the spread of unwarranted mass surveillance by even benign governments.
The report acknowledges the need for police forces and other agencies to get at encrypted messages and other communications - but says this should be done on a "case-by-case" basis and should not be applied to a "mass of people".

That’s right.  Surveillance is often warranted and we’d certainly be a lot less safe if our police and security services couldn’t spy on people when there was a genuine and good reason.  The problem is when we’re all under surveillance or can be placed under surveillance without a proven case or adequate oversight.  While Cameron isn’t proposing that just yet, breaking encryption now will make it a lot easier to take that step in the future.  And who knows what GCHQ is doing already in this post-Snowden world?