If you know me, you'll know that my eyes are constantly aching from rolling at the phrase "if you've nothing to hide, you've nothing to fear." One of the guiding principles, and a main motive force, of privacy work is that most people believe this. It's the fact that most people believe it that makes the accidental or deliberate shedding of personal data so valuable.
I won't go into the reasons the phrase is wrong here, largely because I have learned through experience that if I start, I'm unlikely ever to stop. But I will go into some of the reasons why it's difficult to convince people that their privacy matters and then give some suggestions about how to do it.
It's easy enough to understand why people use that damnable phrase. I don't actually blame anyone for believing it, although my aching eye muscles could do with a break for at least a few minutes every day. The scale and malevolence of how we're all being screwed by the people who have our data is - deliberately - largely hidden. We're not told who our data will be sold to or what it will be used for. Fine print tells us that it "might" (or the even more insidious "may") be 'shared' with 'partners' but without any indication of exactly what data is being shared with whom or why. We have a tendency to think that if people aren't telling us things in capital letters then the things are probably not very important.
It's also hard to connect the consequences of sharing data with any negative outcome. It's unlikely that we'll ever connect an instance of identity theft with the box we ticked on a website nine years ago, for example. Plus, of course, we often give away data to get cool stuff (10th sub free! apps that anticipate our needs, etc.) and we don't want to give that up, especially since we don't always understand why giving away that data might be bad. Nor should we, necessarily. The benefits might indeed outweigh the harm for some people in some cases. The problem is that we're not equipped to make that decision, because of the deliberate machinations of the companies who make money from our data and the complexity of the landscape.
So people like me need to come up with convincing examples. How can ticking this particular box harm you in the future? This is hard, not because examples don't exist but because they have to cover that distance in time and place between the ticking of the box and the stealing of the identity. They have to show that it's in the aggregation of data over a period of years that the greatest danger lies. We humans are not very good at internalising knowledge of that sort or at practising the regimen needed to do anything about it.
The examples I've had the most success with tend to be ones that show how poor privacy habits can screw our friends and family. I find this confusing - I hate my friends and family - but it seems to work for lots of people.
It's a very important point and one I harp on enough to contribute to the eye-rolling muscle strain of my friends and family (good - I told you I hate them): privacy is a group exercise. It would be good if we tried not to inadvertently screw each other the whole time through our own carelessness.
There are ways we can screw the people in our own networks through complacency and other ways we can screw complete strangers. Please don't take this as a manual of how to screw people, by the way, treat it instead as a way to be mindful of how our actions can harm others whether we mean to or not.
1. Harming friends
I frequently talk about the Amazon gift service because it is such a perfect example. You're an Amazon customer, you buy someone a gift to be sent directly to them. You've just given away an enormous amount of information about that other, innocent (well, not if they're one of my friends or family) person. Their address, their possible birth date or other significant date, the sort of things they like (or that you think they like) and so on. If Amazon already knows their address it can start building a social network of their friends and family and make inferences about them too.
Why is this harmful rather than delightful?
For one thing, your friends never asked you to hand Amazon their data. There might be all sorts of reasons they don't want that to happen. Even if (perhaps especially if) you don't know what reasons they might have for not wanting Amazon to have this data, you should at least ask them first, and not press the issue if they say no. They might have things to fear regardless of whether they have anything to hide. And they might have things to hide.
Second, harm may come from a variety of sources, malignant, benign or indifferent. Couples or families might be hurt if one member receives targeted adverts based on a gift. It's not hard to imagine how trouble might be caused if one member of a couple received the gift of a sex toy in the post. It might also be problematic if the adverts someone was served while browsing were informed by a gift, wanted or otherwise. Inducting someone into a social network operated in secret by people who wish to sell us things is not a kind thing to do.
Third, the companies who buy this data collate lists of people they deem 'vulnerable', by which they mean vulnerable to being sold things they don't want or need. The information you shed about them contributes to aggressive targeting and other borderline con-artistry as well as out-and-out conning by less scrupulous firms.
Fourth, this data will certainly be stolen at some point. Hackers will use this data to do bad things to our friends. They'll steal their identity, which is very much easier if they know trivial facts about people such as where they shop and eat. They'll create digests of information about certain types of people and sell them to bad guys who specialise in screwing that type of person. For example, helpful gifts might indicate that the recipient is elderly. An unscrupulous company might (rightly or wrongly) conclude that the elderly person is especially vulnerable and target them for scams that match the gifts they've received.
Fifth, spam. You're putting people on lists that are sold to spammers - email, postal, knocking on their doors - and I don't think anyone wants that.
2. Screwing strangers
There's a very real sense in which customers are becoming less customers and more sheep to be shorn, bags of organs to be harvested. Our gleeful introduction of others into this practice completes the analogy. We're all the Judas Goat for faceless corporations, dragging our friends into dangers they didn't sign up to.
But it's worse even than that because those companies are also screwing their own employees based on our privacy choices.
Here's one of the most obvious examples: you know when you visit a restaurant and they ask you to rate the service on a card or - increasingly - on a touch screen? What on Earth do you think that's for other than to generate an excuse to deprive servers of their tips? A simple scale of dissatisfaction isn't going to help the restaurant improve its business, is it? With the card-based version, companies might be angling to seem to care about customers (while still changing nothing and punishing servers) but with the computerised version, we can be sure that servers will be screwed more. What kind of servers generate the most dissatisfaction? Can companies find ways to incorporate these results into their existing racist or sexist hiring and firing policies? Can they generate brand new racist or sexist policies?
Well of course they fucking can. And will.
But look also at the wider picture. Much of restaurant technology is aimed at either getting people back out through the door as quickly as possible or selling them more stuff. To achieve this they (especially chains) do all kinds of worrying stuff. They greet you by name. They remind you of what you ordered last time you visited (even if it's a different location).
The servers and lower to middle management are easy to punish if this does not go according to plan and customers sit around enjoying their meals instead of hurrying and/or ordering stuff they didn't want.
By gleefully shedding data we turn ourselves into sheep to be shorn. But we turn other people into sheep, too. And we turn the former farmers into serfs, serving at the whim of their owners, pursuing goals unrelated to their jobs and facing punishments that are not based on how well they do them.
That's the harm. Don't make me roll my eyes.