Monday, 27 March 2017

Wait, a politician was WRONG?

Surely not.  But Amber Rudd has a history of being wrong both on and off the internet and she's in no mood to break that streak.  Of wrongness.  I need to get better at metaphors.

Anyway, the UK Home Secretary, Amber Rudd, says that the ability of people to use encrypted communications is "unacceptable" because terrorism.

Let's not focus on the fact that the recent Westminster terrorist was acting alone, or that law enforcement intercepting his WhatsApp messages presumably wouldn't have prevented him from killing and injuring people. Instead, let's focus on the fact that he - and millions of other people - used a messaging service that happens to use encryption.  Plainly it is encryption that's at fault here, not bad people doing awful things.

Rudd either doesn't understand or pretends not to understand that tapping someone's phone is not at all the same as intercepting their encrypted messages.
"It used to be that people would steam-open envelopes or just listen in on phones when they wanted to find out what people were doing, legally, through warrantry," she said.
And that's fine.  Surveillance is sometimes necessary and by definition invades the privacy of the person being surveilled. Traditional surveillance such as this results in collateral damage, too, which is regrettable: innocent people who happen to call a terrorist are targets for further investigation, for example. But even I agree that some degree of surveillance is needed for general safety. People need to be followed. Rooms need to be bugged. Phones need to be tapped.  Knock yourself out.

What Rudd either doesn't know or pretends not to know is that tapping someone's phone, bugging their rooms or following them about are fundamentally different to intercepting their encrypted
messages.

Here's why: tapping a suspect's phone doesn't automatically tap everyone else's phone.  Decrypting a suspect's messages pretty much does.

There's no way to provide a back door to encrypted messages that only law enforcement can use.  Criminals will almost immediately gain access to any such back door through either hacking or good old-fashioned extortion or bribery. If there are back doors, criminals will have the keys within a few hours at most.
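To see why, here's a toy sketch (deliberately NOT real cryptography - the cipher, the escrow scheme and every name in it are invented for illustration) of what a mandated back door amounts to: every per-conversation key becomes derivable from one escrow key, so stealing that single key decrypts everyone's traffic, not just one suspect's.

```python
# Toy illustration only - do NOT use this as real cryptography.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream (toy stream cipher)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

ESCROW_KEY = secrets.token_bytes(32)  # the mandated back door

def conversation_key(user_a: str, user_b: str) -> bytes:
    # With a back door, every per-conversation key is derivable from
    # the single escrow key.
    return hashlib.sha256(ESCROW_KEY + user_a.encode() + user_b.encode()).digest()

msg = b"meet at noon"
ct = keystream_xor(conversation_key("alice", "bob"), msg)

# Whoever steals the one escrow key can now read *all* conversations,
# not just a particular suspect's:
stolen = ESCROW_KEY
key = hashlib.sha256(stolen + b"alice" + b"bob").digest()
assert keystream_xor(key, ct) == msg
```

Compare that with tapping one phone, which yields exactly one line of traffic: here, compromising one secret compromises every user of the system at once.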

But that's not even the greatest threat.  Missions will creep.  Introduce cryptographic back doors and governments will very soon be decrypting people's messages to assist in enforcing parking fines.  That's a small step from the mass use of communication data to mine for suspects.

We don't need to rely on literature to tell us how bad an idea that is; real-world examples abound. To pick one of the (relatively) less horrifying examples, the United States of the late 1940s to mid-1950s suffered from almost literal witch hunts targeting people deemed disloyal or subversive.  Needless to say, the definitions of those charges changed on the basis of convenience.  All it took for the government to take action against someone it didn't like was a noisy accusation from a member of the public that they were a communist.

The only possible aim of mass surveillance is to associate people with labels based on arbitrary criteria. History - in this case and many others - tells us why this is a bad idea.  Data mining can cheerfully construct any label about any person; it's a means of constructing evidence for an already decided conclusion.  We only have to look in detail at, say, the arrest and conviction rates of white and black people across the US to understand that this is a really terrible idea.

Surveillance is OK as a general concept but already widely abused.  Mass surveillance is never OK.  It's about fitting people to crimes rather than crimes to people.  It's about removing freedom in the name of freedom.

Make no mistake: Amber Rudd's call for law enforcement agencies to have access to encrypted communications is a significant step toward mass surveillance.  Remember also that surveillance infrastructure and culture is not something you can easily take back.  The opposition party tutting at laws that increase surveillance will certainly use and expand them once they are in power.

The good news is that governments can't prevent people from encrypting things.  The bad news is that they can criminalise people who do.

Fortunately, we're getting better at hiding the fact that we're using encryption... Suddenly all those years of wearing a tinfoil hat are looking pretty fucking well-spent.

Saturday, 25 March 2017

Judges tell woman she can't have a divorce

A UK court has told a woman she can't have a divorce because, apparently, she should expect unhappiness.  In Britain the grounds for divorce are: adultery, desertion, unreasonable behaviour, five years of separation or the agreement of both parties.

This makes absolutely no sense.  If someone has decided that their marriage is over then surely it is and there's no sense at all in a state forcing it to continue.

What's worse is that the partner seeking the divorce might need to rely on the assets of the marriage in order to live.  Ruling, therefore, that unhappiness is not grounds for divorce does nothing but condemn the unhappy partner to continued misery and/or financial difficulty.

It's a fact, of course, that the law is the law and courts don't - and shouldn't - have the power to change laws in the course of a particular case.  But then, look at the ruling:
It is plain from his judgement that Judge Tolson was unimpressed by the wife's petition. He variously described it as "hopeless", "anodyne", and "scraping the barrel". He said it "lacked beef because there was none". He said the allegations are "at best flimsy".
He said "In reality I find that the allegations of alleged unreasonable behaviour in this petition - all of them - are at best flimsy. I would not have found unreasonable behaviour on the wife's pleaded case. As it is, having heard both parties give evidence, I am satisfied that the wife has exaggerated the context and seriousness of the allegations to a significant extent.  They are all at most minor altercations of a kind to be expected in a marriage. Some are not even that."
Wow.  Bitch should know her place, I guess.  Tolson isn't done, though:
I will not overburden this judgement by setting out the pleaded allegations in full. This, the wife's best case, skillfully argued by leading counsel, proceeds by emphasising what he submits is her increased sensitivity to the husband's old-school controlling behaviour. [...] Having seen him, I hope the husband will forgive me for describing him as somewhat old-school. I can also find the wife to be more sensitive than most wives. It matters not.
He... wants the husband's forgiveness for calling him old-school, but not the wife's forgiveness for condemning her to a life of misery?  He's arbitrarily decided that she's "more sensitive than most wives"?  And what if she were? Wouldn't that mean she'd be even more miserable?  Wouldn't the husband's behaviour be even more unreasonable if the wife were especially 'sensitive', whatever that is supposed to mean?

The law is entirely arse-about-tit.  Promises are only meaningful if they are not enforced by law, especially if those laws are plainly biased because, you know, old white rich male judges.  This case is an example of such a judge deciding what's reasonable in a relationship and what isn't.  His contempt for the wife in this case is evident.  His support for the husband is obvious.  He wants to punish the wife for complaining and reward the husband for being a cock.

Let's get rid of the idea that some fault is required to dissolve a marriage.  If one partner wants to end a marriage, let it be so.  Let marriage be the contract it so plainly is: a sharing of assets which need to be properly distributed when the marriage ends.  A set of shared responsibilities which have to be properly untangled when one or more partner decides to end the contract.


Thursday, 23 March 2017

Scott Adams endangers future generations with his idiocy

Dilbert creator Scott Adams likes to refer to himself as a “master persuader”, apparently on the grounds that he took a hypnotism class decades ago.  He is neither very persuasive nor masterful when he says things like this, though:

http://blog.dilbert.com/post/158630186091/i-declare-mobile-phone-carriers-to-be-enemies-of
I Declare Mobile Phone Carriers to Be Enemies of the State
Here’s the basic problem.
Kids as young as eleven have smartphones. That situation won’t change.
A kid with a smartphone has access to any illegal drug in the world, as well as all the peer pressure in the world.
Pills are small, cheap, odorless, widely available, and nearly impossible for a parent to find in a bedroom search. When you have this situation, the next generation is lost.
Where to begin?  Apart from the one-sentence paragraph apparently being part of the master-persuader's toolkit, I guess.

But getting back on topic: I could explain why paying for drugs with credit cards or Paypal is rather difficult (but not impossible).  I could explain that while other solutions involving proxies and/or cash are certainly possible, they are also risky to all parties.

But there’s no need at all for me to do that because Adams’ premise is absolute bullshit.

Kids can get access to any drug in the world anyway, regardless of whether they own a smartphone. There’s the internet, which Adams has forgotten can be used from devices other than phones.  There’s the dark/deep web. Then there’s the good old-fashioned ways people have been using to buy drugs for generations.  They’ve stood the test of time for good reasons; they’re adaptive and minimise risk.  If kids are buying drugs, they are vastly more likely to be dealing in person and in cash than using their phones and/or hypothetical credit cards.  Because kids – whatever else they might be – are not necessarily stupid.  Neither are drug dealers.

Smartphones are not automatic gateways to untraceable drug transactions.  In fact, they are about the worst possible method.

But why are these hypothetical kids buying drugs in the first place?  Peer pressure, according to Adams.  The problems with that argument are that not all kids succumb to peer pressure and that peer pressure isn’t always in favour of drugs.  In due course I’ll get onto Adams’ ‘argument’ that “good parenting isn’t enough”.  For now, I’ll say that his glib assessment that children – on a generational scale – will buy drugs just because they can is bullshit.  To be clear, remember what he wrote:
When you have this situation, the next generation is lost.
Kids of my generation and many before could buy drugs very easily indeed.  I (very) occasionally bought drugs in my youth.  Some of my peers sometimes did, some didn’t.  The availability of drugs is not a conveyor belt.  That’s why not every child smokes, drinks, snorts or shoots up.

In any case, the generation wasn’t “lost” in any sense.  We continue to contribute to society in all sorts of ways.  If ease of drug purchase was tantamount to lost generations then we’ve been lost since way before Adams was born.  And if we have been lost all this time, smartphones certainly – demonstrably - aren’t to blame.

Having proposed this already wrong premise, Adams explains how to solve the non-problem:
To address the problem, you would need the phone companies to allow parents full access to all messages on a kid’s phone. And this feature should be mandatory, not optional. Parents need to see all messages, and all photos, from all apps.
There are two major points here.  The first is that this would absolutely not prevent kids from buying drugs.  They just wouldn’t use their phones to do it.  The second is that such monitoring would likely cause many kids to take greater risks (irrespective of drug-buying or otherwise) than they otherwise would.

A couple of examples of how over-surveillance can create risk:
  • Young people who knew their phones were being monitored would not use them for sensitive communications.  What child would communicate secrets on an open channel when there are so many secret channels they could use instead?  Why wouldn’t they just organise illicit activities using other means?  Why would they take their phones with them when they went to places they’re not supposed to go? When you rank the negligible probability that your kid will buy drugs in the most stupid and risky way possible higher than the probability of their getting into real and immediate danger because of unnecessary surveillance, you have a shitload of justifying to do.
  • Monitoring a child’s communications also monitors those of their friends.  This has been shown to cause serious problems.  I know of two cases where a child described the abuse she was undergoing to her friend.  The friend’s mother snooped her phone, saw those messages and confronted the abused child’s parents.  This made the situation vastly worse.  No doubt the parents were well-meaning, but they weren’t equipped to deal with the situation properly.  And that’s without even thinking about the ethical considerations of snooping the communications of other people’s kids, let alone those of your own kids.
I don’t know what Adams means by “mandatory, not optional”.  I assume he doesn’t mean that every photo taken and every text sent immediately pops up on the parents’ phones, but he could well be suggesting exactly that; I wouldn’t be surprised.  Either way, it would do a lot more harm than good.  Should kids think before they take a photo or send a message?  Sure.  Should they be terrified to do so in case they are arbitrarily punished after the fact?  I think not.  In one scenario, they’re being trusted enough to learn how to grow up.  In the other, they are being forced to adhere to arbitrary rules they might not understand or agree with and are not being allowed to develop a personality.  They’re in a Skinner box.  Let’s not put kids in a Skinner box.

If Adams instead means that parents should be allowed access to their kids’ phone activities in case it becomes necessary, the situation isn’t much better.  In this scenario, it’s the parents who get to decide what needs to be seen and what doesn’t.  This might work OK in situations where there is deserved trust between parent and child but not all such relationships are like that.  Consider the abused child asking for help.  Consider the gay or trans child of fundamentally religious parents trying to understand themselves.

Legislating that parents always get to see all their children’s phone activities is going to put some children in extreme danger or limit their ability to get out of danger.  But Adams doesn't understand what parenting is.
I know what you are going to say. You’re going to say good parenting is all you need. But my observation is that no more than 20% of kids can be “parented” away from temptation. The other 80% are totally out of luck.”
No you don’t, Scott. I’m not even sure you know what you're going to say, half the time.  I’m not about to say that good parenting is all anyone needs, but let’s deal first with your “observation” that “no more than 20% of kids can be “parented” away from temptation.”

I’ve been staring at that sentence wondering what it can possibly mean.  I mean, to begin with, citation fucking needed and then “parented away from temptation”?  I… I… no, I don’t even.  Look, Scott, children are going to be tempted by all sorts of shit.  They’re going to succumb to some of those temptations.  It’s not even necessarily wrong if they do.  It’s called learning.  It’s called growing up. People have to make mistakes and rather than hovering over our children to prevent mistakes ever happening, we have to be there to pick up the pieces if they do.  That’s parenting, Scott.  It's interactive. I guess you didn't think of that.

Functional relationships are largely about trust, which is a mutual thing.  Can you trust absolutely that your kid won’t try drugs?  Absolutely not and I’m not convinced that you should be able to.  But can you trust that your kids will come to you for help when they need it?  That’s almost entirely up to you and if you’re surveilling them, imposing arbitrary rules, forcing them to divulge their communications and whereabouts at all times then newsflash: they aren’t going to trust you. And with excellent reason.

You can’t “parent away” potential drug use (whatever that could even possibly mean), but parenting is about a whole lot more than that.  It terrifies me that Adams might one day be a parent when he so obviously doesn’t understand that simple fact, yet is so certain about how other parents should behave.
My observation is that smartphones have made half of all adults mentally ill. I mean that literally, not figuratively. The business model of phones is addiction, not value. And they addict you at the expense of the things humans need in their lives to be happy and healthy.
Adams isn’t qualified to diagnose mental illness.  He’s also apparently (despite being trained as an economist) unqualified to identify business models.  The business model of phones isn’t addiction and there’s no evidence to suggest that phone addiction exists.  The business model of phone companies is about fooling us into revealing as much information about ourselves as can be wrung out of us.  I agree that there’s a problem here and I’ve written about it often, but the effect cannot be characterised as a mental illness. And Adams very, very clearly doesn't understand the phone company business model. How embarrassing.
Today I declare the phone companies to be enemies of the state. They are ruining everything you love, and everything you care about. And they are doing it right in front of you.
I love my cat.  She is un-ruined by phones.  I love the views from my house. I love the part of the world I live in and how it changes over the seasons. I love my spouse.  I love my friends.  I love science. I love engineering.  I love solving problems. I love crosswords. I love reading. I love logic. Not necessarily in that order.

Phone companies aren’t ruining any of those things.

What’s left in this article?  Oh yes, the usual Adams embarrassing bullshit:
If this is not already obvious to you, it probably means you’re a smartphone addict. A normal person’s brain will spontaneously generate a protective illusion to support an addiction. If you see no problem with smartphones causing drug addiction in kids, or you think I am exaggerating, you’re probably in the illusion.
Yeah, this is part of what Adams thinks of as masterful persuasion.  Do I even need to pick it apart?  The false dichotomy hurts. If you're not part of the etc. Scream if you want to go faster, Adams fans.
I’m going to delete any comments that say good parenting is all you need. That opinion would not be worthy of this topic.
Make of that what you will.  I'll delete any comments that don't rhyme with "clickhead". They won't be worthy of my mighty yet randomly idiotic understanding of something or other I can't define.

Tuesday, 5 July 2016

Your supermarket is spying on you and other shocking news

It’s easy to forget that we already live in a dystopian future. The breadcrumbs of personal data we scatter around us wherever we go are already being collected, aggregated and analysed to an extent where harmful privacy breaches are practically inevitable.

We’re under surveillance everywhere we go, including during our weekly supermarket shop. Surveillance in supermarkets is nothing new. For decades they’ve offered store cards which collect detailed information about your purchasing habits in exchange for (frankly insulting) incentives. The data collected includes when and where you shop, what things you routinely buy, when you tend to buy extravagant things, how likely you are to take advantage of special offers and much more. It is used to come to surprisingly complex (and often accurate) conclusions about you, your family, your lifestyle and your family’s lifestyle. With schemes such as Nectar in the UK, even more data is collected and more valuable conclusions drawn, because data is collected from a wide variety of shops of different kinds; more so if you also have a Nectar credit card.

Many people feel that this information is a fair price for the incentives offered. Privacy activists like me disagree… but that isn’t the point I’m making here. The point I’m making is that surveillance via store card is the tip of the iceberg. There are plenty of other very creepy things supermarkets routinely do and more that will surely appear in the near future.

With online shopping, supermarkets are able to track the items you buy, the ones you look at but don’t buy and the ones you buy instead. In physical stores, this has not been possible until recently. Nowadays, supermarkets can and do track your movements around their physical stores. If you (like most people) leave your phone’s wifi turned on when you visit the supermarket, then the store can track your movements, even if you don’t connect to the store wifi. From this, it can build a very intimate picture of your shopping habits. For example, it can determine how long you stare at a shelf of near-identical brands of washing powder before deciding which to buy and can compare that with your past behaviour. How easily are you affected by the specific placement of certain items? Can you be manipulated into buying the one with the highest profit margin? It can note the things people always forget to buy as they walk around the store and have to go back for. Are there trends in this data that can manipulate customers into buying things they don’t really want? Could the supermarket put a shelf with the things everyone forgets right at the far end of the supermarket, but put only the brands with the highest profit margins on that shelf?
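To make the wifi-tracking point concrete, here's a hedged sketch of the sort of dwell-time analysis a store could run. The log format, MAC addresses and zone names are all invented; real systems collect something similar from phones' wifi probe requests, attributing each sighting to the nearest access point.

```python
# Hypothetical analysis of passive wifi sightings: (MAC, seconds
# since entry, zone). All data below is made up for illustration.
from collections import defaultdict

def dwell_times(sightings):
    """Return {mac: {zone: total seconds observed in that zone}}."""
    per_device = defaultdict(list)
    for mac, t, zone in sightings:
        per_device[mac].append((t, zone))

    result = defaultdict(lambda: defaultdict(int))
    for mac, events in per_device.items():
        events.sort()
        # Credit the time between consecutive sightings to the zone
        # the device was last seen in.
        for (t0, zone), (t1, _) in zip(events, events[1:]):
            result[mac][zone] += t1 - t0
    return result

log = [
    ("aa:bb", 0, "entrance"),
    ("aa:bb", 30, "washing-powder"),
    ("aa:bb", 150, "washing-powder"),  # still staring at the shelf
    ("aa:bb", 160, "checkout"),
]
times = dwell_times(log)
# times["aa:bb"]["washing-powder"] -> 130 seconds of indecision
```

Note that nothing in this sketch requires you to connect to the store wifi, or to consent to anything: the phone announces itself and the store merely listens.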

With data like this, already being collected by supermarkets, coupled with eventual buying choices and place/time data collected by use of a store card, supermarkets can build a very intimate picture of what their customers buy and how they make buying decisions. This is not data you’d want to fall into the wrong – or even the ‘right’ - hands. And it’s set to get worse. Trials are already being carried out on various means of monitoring shelves so that data can be collected about which items a customer picks up but doesn’t eventually buy and which items they most closely scrutinise (and then whether that’s the one they buy). There are also trials being carried out on automatic expression recognition via CCTV. What are you thinking when you look at a product or display? Pleased? Excited? Disgusted? Bored? Confused? Will supermarkets also start to use automatic facial recognition to track those of us who turn our wifi off and pay in cash? I don’t see why not.

Data like this is used to create profiles of customers, to optimise displays, shelving and pricing and to offer customer incentives such as sales and coupons. It’s up to individual customers to decide whether they think this is creepy and manipulative or genuinely useful, but it’s not up to customers whether they are tracked in the first place.

The data is very valuable, of course, and will not be used only to decide where to put the baked beans. It (or subsets of it) will be sold to other companies, which will aggregate it with the things they already know about us. And it will be stolen by people who want to use it to steal our identities too. It will be used to draw possibly false conclusions about us, which might haunt us in the future. If you don’t look sufficiently concerned when putting high-value items into your trolley, will you be considered a poor credit risk by a completely different company in the future? Will health insurance companies count how many doughnuts you bought and look at your waist size as calculated from CCTV footage over time to decide whether to pay for your heart attack? I’m being flippant, but I don’t think these are particularly unrealistic scenarios.

Store cards are all very well. I don’t have one but I don’t look down on anyone who does. For them, the trade-off between privacy and money-off coupons is worth it. I’d argue with them that the trade-off only seems worth it because they probably don’t understand how their data is being used and misused, but that’s OK too; taking the time and effort to understand these things is a cost many people don’t think is worth paying. It’s up to them. They’re helping – like parents refusing to vaccinate their children – to create an environment that’s more dangerous for everyone else, but I think we have a little way to go yet before most people really start to see the downside of this abandonment of privacy. I’m not saying I won’t gloat when they do, but I understand why it’s difficult to take privacy seriously when it comes at the expense of convenience.

But while store cards are opt-in, the other surveillance methods employed by supermarkets are not. If I have to turn off my phone wifi, pay with cash in unordered notes and wear a disguise to the supermarket, then I can’t honestly say I’ve been given a realistic opportunity to opt out.

But I can’t end on a negative note. I think there are some excellent uses for store cards and supermarket tracking. Here’s my suggestion:


Your store card is issued 100 points when you enter the supermarket, to be redeemed upon checkout. This number ticks down the longer you are in the store and ticks down even faster whenever you stand still. That way, perhaps everyone in the supermarket will finally get out of my fucking way.
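For the avoidance of doubt, here's a sketch of how my scheme might work. The decay rates are entirely made up: one point per minute in the store, three per minute spent standing still (blocking the aisle).

```python
# Tongue-in-cheek loyalty scheme: points decay with time in store,
# and decay faster while the customer is standing still.
def points_at_checkout(minutes_in_store: float, minutes_idle: float) -> int:
    minutes_moving = minutes_in_store - minutes_idle
    remaining = 100 - (1 * minutes_moving) - (3 * minutes_idle)
    return max(0, round(remaining))

# A brisk ten-minute shop with one minute of dithering:
print(points_at_checkout(10, 1))   # 88
# Twenty minutes, eight of them parked in front of the washing powder:
print(points_at_checkout(20, 8))   # 64
```

The beauty of this scheme is that it uses exactly the tracking infrastructure supermarkets already have. I await my consulting fee.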

Friday, 20 May 2016

Free Chelsea Manning

This is not an offer to get a free Chelsea Manning, it's a post about why she shouldn't be in prison: https://boingboing.net/2016/05/19/eff-files-chelsea-manning-appe.html
The issue with Manning being found guilty of a criminal offense for violating her employer's terms of use is one of precedent: if Manning is sent to jail for violating the fine print in her employment agreement, then so can anyone else who breaks their own employer's terms and conditions. That means that most of us could be sent to jail for things we do every day.
This isn't hyperbole.

I once signed an employment contract that prohibited 'horseplay'.  And a flat rental agreement that was altered to allow a pet, which read "providing that the pet in question is not a horse kept in the bath".

The latter was a joke by the estate agent, but the horseplay clause (not sure what's going on with all these horses) was completely serious.  It was a clause designed to be deliberately vague so they could sack people with impunity for arbitrary reasons.

Of course, this was a company which had a light sensor to turn off the security lights, positioned directly in the glare of a security light.

We also built a test rig which was so poorly earthed that it was vitally important that you went to the toilet before touching it.

More spying on kids

There are some very worrying online resources about how to spy on your kids.  Many of them assume that spying on your kids is the right thing to do.

Some, like this one, purport to ask whether it's OK to spy on your kids but also include statements about the alleged tactics kids use when they find out you're spying on them.  Almost as if the answer to the question is self-evidently "yes".

That particular link contains the following on when you shouldn't spy on your kids:
If you have a teenager who meets her responsibilities, comes home on curfew, is where she says she’ll be when she said she’d be there, is hanging out with the people with whom she said she would be hanging out, and you have no reason to be suspicious about anything, I suggest you stay out of her room. And I think you should tell her that, too. You can say something like, “I’m not going to interfere with your privacy, because you’re doing so well. I have no reason not to trust you.” That way, she knows she’s being rewarded for her behavior—your lack of interference in her personal space is a direct result of her actions.
Yeah, that's... creepy.  And nonsensical. Threatening to invade your kids' privacy if they don't behave exactly according to your standards isn't going to develop trust, and is likely to foster risky behaviour.
So when you spy on your otherwise responsible child, the message you’re sending is, “I don’t trust you, even when you haven’t done anything wrong.”
Way to miss the point.  Kids are going to make mistakes.  The way to deal with that is to talk about it and then let it go, not to invade their privacy in a doomed attempt to prevent them making further mistakes.  Mistakes are how we learn.
To be honest, I don’t like talking about rights; the word is just too overused in our culture. But here’s the deal: I believe that whoever’s name is on the mortgage has a right to look anywhere in their house. In my opinion, that’s your right because you own the house. 
Yeah, the thing here is that kids are people and you don't own them.  It's not like they even have much of a choice about where to live.  After all the rhapsodising about how it's a parent's responsibility to keep their children safe, we get this:
Many parents will ask, “Why should I tell him I’m going to [search his room]? He’ll only hide it outside of the house.” But that’s not your problem as a parent.
Yeah, as long as presumably dangerous activity isn't happening under your own, mortgaged, roof, there's nothing to worry about.

Here is a (hilariously inept) instruction manual for how to spy on your kids. That's its actual title - "how to spy on your kids online". This isn't beating about the bush.  But to be fair, there's some good advice in that article, hidden amongst the bullshit.

This is sort of sweet:
And be warned: Kids can learn how to delete the history to cover their tracks, so ask questions if you discover that the history was cleared by someone other than you.
Yeah, if your kids aren't smarter than that, you probably have more problems than you think.  Especially if you're not smarter than that either.
With most issues of safety -- climbing a tree, riding a bike, crossing the street -- we progressively give kids more freedom. But in the digital world, new and different risks come up as they grow. Your instinct might be to back off as they approach the tween years, but that's when to get even more involved.
This is not a clear thinker.  There is a considerable difference in risk between a four-year-old climbing a tree and a 14-year-old climbing a tree. They're climbing different trees.  There are 'new and different risks [in tree climbing] as they grow'.  The last sentence is worrying on every level.

The article ends with a very telling 'decoding' of some common abbreviations.  The author seems at least as concerned with figurative use of the word "fuck" as about actual safety.  As I said, telling.

 Abbreviations and code words speed up instant messaging and texting, but they also mask what people are saying! Brace yourself. Here are some commonly used terms:
ADIH: Another day in hell
A/S/L: Age, sex, location
BTDT: Been there done that
CULTR: See you later
GTFO: Get the f-ck out (expression of surprise)
H8: Hate
ILY or 143 or <3: I love you
JK or J/K: Just kidding
KWIM: Know what I mean?
LLS: Laughing like sh-t
LMIRL: Let's meet in real life
LYLAS (B): Love you like a sister (brother)
NIFOC: Naked in front of computer
PAW or PIR or P911: Parents are watching or Parent in room (drop the subject)
POS: Parent over shoulder (can also mean "piece of sh-t," used as insult)
Pr0n: Intentional misspelling of "porn"
STFU: Shut the f-ck up (expression of surprise rather than reprimand)
TMI: Too much information
TTFN: Ta ta, for now (goodbye)
WTF: What the f-ck?

Kids are complicated

Kids are complicated. They need privacy.
In this article, Livingstone walks us through the daily routine of her research subjects -- the way networks ebb and flow through their face-to-face interactions, family time, homework and leisure. Her account sharply highlights danah boyd's finding from her indispensable book It's Complicated, that teens prize face-to-face time above computer and phone time, but it has to be time with their peers and away from adult supervision -- a rare commodity in the era of bubblewrap child-rearing.
I've come across a few real-life reasons why kids need privacy.  In one case, a girl was being abused by her parents and confided in a friend. They spoke about it on their phones.  The parents of the friend snooped on her phone, found messages about the abuse and confronted the abusers. This made things much worse for the girl and put her in even more danger.

I'm not suggesting that the abuse should have been kept secret.  Clearly the abusers needed to be stopped.  I'm saying that snooping on your own kid's phone can have dire consequences for other people.  Because kids are complicated.  In this case, the assumption of privacy was vital; the girl likely wouldn't have confided in anyone if she thought her messages would be intercepted.  If your kids know you're snooping on their phones, they won't use them to communicate about sensitive things.  If they know you're tracking their phones, they won't take them when they go somewhere without your approval.

Spying on your kids is likely only to put them at greater risk.  Fostering a trusting environment is a lot more difficult but obviously superior.  Respect your kids. Accept their need for privacy.
 

Monday, 16 May 2016

Restricted mobility


Mobility is crucial to privacy and is generally considered a basic human right.  If you're not free to go (more or less) where you want, it becomes much more difficult to have secrets.  This is especially worrying when governments place restrictions on their citizens' mobility, because it becomes more difficult to express and share negative views about that government and to effect change.

It's not only governments that can restrict people's mobility.  Other groups can do so too (more on that in a moment) but mobility can also be restricted by circumstance.  Illness, lack of money and responsibility can all restrict mobility, so it's important to support people with such restrictions.  The most important tool we have for this is the internet, and digital mobility should also be considered a basic human right.

Needless to say, therefore, governments and oppressive organisations are keen to restrict their citizens' digital mobility as well.  Here's a particularly illustrative example:
When people sign up to fight for ISIS, their passports and mobile phones are immediately taken away.  There are many who immediately regret their decision to join ISIS, so both their physical and digital mobility are severely restricted throughout the term of their military and religious training.
Once training is complete, their phones are returned.  Make no mistake, though, this isn't a return of digital mobility; by then the soldiers know better than to use their phones to contact the outside world.

Another example of restrictions on digital mobility is China's Great Firewall.  Chinese citizens face severe restrictions on their online activities. However, it isn't only traditionally oppressive governments and regimes that restrict people's digital mobility.  The current UK government is trying its best to remove our digital autonomy too.

First, there's the porn filter.  Everyone in the UK who gets a new internet connection must inform their ISP if they wish to opt out of using a filter that supposedly screens out pornography.  Aside from the fact that porn filters don't work, having to opt out of the filter is an oppressive move, albeit a fairly mild one.  I'd personally rather avoid telling the government that I want to opt in to porn, not because I'm embarrassed but because it's information that the authorities might use against me in the future.

Then there's the government's determination to ban encryption without a back door.  The possibility that governments (or criminals) can snoop on our private conversations at will is a far more severe limitation on digital mobility.

This is why it's so important to oppose these measures.  They compromise our digital mobility and, as usual, the most vulnerable members of society suffer the most.