Tuesday, 15 August 2017

Surveillance self-defense

A good set of resources from EFF about the basics of anti-surveillance protection.

Read them! Send them to your friends and family!  Security and privacy are a joint enterprise.

Friday, 11 August 2017

You can hack gene sequencers by hiding malware in DNA

This is seriously cool.

Today at the Usenix Security conference, a group of University of Washington researchers will present a paper showing how they wrote a piece of malware that attacks common gene-sequencing devices and encoded it into a strand of DNA: gene sequencers that read the malware are corrupted by it, giving control to the attackers.
I sometimes forget that we're living in the 21st century. 

Thursday, 10 August 2017

Amber Rudd breaks the irony meter

The UK Home Secretary, Amber Rudd, is no fan of encryption. She has argued that 'real' people don't need encryption as a case against secure messaging apps such as Open Whisper Systems' Signal, which uses end-to-end (e2e) encryption, meaning that not even the operators themselves can intercept their users' messages.

It's ironic, then, that she has fallen victim to a prank which would not have been possible if she - presumably a real person - had used encryption.

The now-notorious email prankster known as Sinon Reborn set up an email address in the name of Theresa May's communications chief, Robbie Gibb. Reborn emailed Rudd's parliamentary email address and she replied from a private address.
“I managed to speak to a home secretary with relative ease on her personal email address,” Reborn told the Guardian. “I replied again saying: ‘Don’t you think you should be more aware of cyber security if you are home secretary?’ and I never got a reply from that.”
This ought to be embarrassing for any cabinet member. I'm sure there are numerous guidelines and memos on this, especially as Reborn has pulled the same trick on other high-profile figures:
The same hoaxer has tricked the son of the US president, Eric Trump, the next US ambassador to Russia, Jon Huntsman Jr, and the former White House communications chief Anthony Scaramucci, sparking an investigation in Washington into cyber-security. He has also duped the governor of the Bank of England, Mark Carney, and Barclays boss Jes Staley by setting up fake email accounts.
It's especially embarrassing for the minister who is supposed to be in charge of cyber-security.

And even more so given her strong anti-encryption stance. If MPs and government staff used encryption, then Rudd could have verified that the email was really from Gibb.
A Home Office source confirmed that the exchange had taken place, but said Rudd does not use her personal email address to discuss government business. “As the email exchange shows, she rapidly established that this was a hoax and had only exchanged pleasantries up to that point.”
That, of course, is not the point. It was still a security breach and a national embarrassment. That it happened to the minister who is supposed to lead us through an age of rising cyber-crime is also terrifying.

Thursday, 3 August 2017

Not this again

The BBC reports that someone has put a chip in his body to unlock his car. It's not clear why, although his evident, undeserved smugness is likely reason enough for him. It's also unclear that there's any credible security advantage: hacking car locks has so far proved easier than stealing people's keys.


But I'm biased. It reminds me too much of the pointless Kevin Warwick, who has for decades been claiming to be a cyborg because he had an RFID chip in his arm. Having an RFID chip 1mm outside your skin in a badge doesn't make you a cyborg, but having one 1mm on the other side does, apparently. The distinction without a difference has certainly earned him a lot of stupefyingly dull and stupid column inches over the years.

I've nothing in principle against using implants for authentication and I've no doubt it'll happen in the near future. It'll be convenient, but it won't pay to underestimate the security concerns, or the practical ones, for that matter.

It seems a nice idea, for example, to use an implant for 2FA alongside a physical artifact such as car keys, but then how do you lend your car to someone else or even allow them to unlock it to get stuff out? Perhaps taking care of your keys like, you know, an adult might be a superior solution all round.

We already know that RFID chips in passports etc can be skimmed from a distance. At least we can put our passports in RFID-proof wallets. It's a little less convenient to wear lead gloves.  And besides, how do we deactivate authentication when we know someone has skimmed our implant? How do we upgrade?

The problem is one of poor analogy. Authentication shouldn't be thought of as a key, it should be thought of as (some) proof of who we are. After that, infrastructure needs to decide what we're allowed to do in a given situation.
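That separation between proving who we are and deciding what we're allowed to do is the familiar split between authentication and authorisation. A minimal sketch, with entirely hypothetical tokens, names and actions throughout:

```python
# Authentication answers "who is this?"; authorisation answers "what may
# they do, here, now?". Keeping them separate is the point: an implant,
# key fob or password should at most establish identity, never directly
# grant capabilities.

AUTHORISED = {
    # (identity, action) pairs the infrastructure currently permits.
    ("alice", "unlock_car"),
    ("alice", "start_engine"),
    ("bob", "unlock_car"),  # bob may open the car to get his stuff out...
    # ...but not drive off: there is no ("bob", "start_engine") entry.
}

def authenticate(token):
    """Map a presented credential to an identity (or None if unknown)."""
    known_tokens = {"tok-123": "alice", "tok-456": "bob"}  # stand-in for real crypto
    return known_tokens.get(token)

def authorise(identity, action):
    """Policy decision, independent of how identity was proven."""
    return (identity, action) in AUTHORISED

def attempt(token, action):
    who = authenticate(token)
    return who is not None and authorise(who, action)

print(attempt("tok-456", "unlock_car"))    # True: bob can open the car
print(attempt("tok-456", "start_engine"))  # False: lending access is not lending identity
```

Delegation then becomes a policy change (add or remove an entry) rather than handing over the credential itself, which is exactly what a chip welded into your arm can't do.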

There are lots of smart people working out how that infrastructure might work, but slitting yourself open and installing an RFID chip is not approaching smart. People are working on how we might delegate authentication in complicated ways and how identity certifiers and authentication services could collaborate without creating a vast security minefield. There is already a fucktonne or so of literature on this subject.

But what's reported is some idiot injecting a chip into himself as though the future has already happened.

Broken encryption not required for policing encryption-using terrorists

Three people who planned terrorist attacks have been caught, tried, convicted and jailed for life. According to the BBC, they called themselves the Three Musketeers "when exchanging encrypted messages". *GASP* - would-be terrorists using encryption!!!!!!!


But they were caught anyway: government-broken encryption was not required; conventional policing techniques sufficed.

Cory Doctorow's history of the rhetoric of the backdoor wars

Cory Doctorow writes at Boing Boing about the sort of rhetoric the UK Home Secretary, Amber Rudd, used last week to justify her proposed ban on workable encryption.

It's pretty much spot on:
Here's a brief history of the rhetoric of the backdoor wars:
* "No one wants crypto, you can tell because none of the platforms are deploying it. If crypto was something normal people cared about, you'd see it in everyone's products. You crypto advocates are weird and out-of-step." (Clipper Chip - San Bernardino)
* "Companies are all using crypto. They are being irresponsible. Sure, everyone wants crypto and adding it to a product helps you sell it, but that's just profiteering while reducing our common security." (San Bernardino - This week)
* "Companies are all using crypto. But no one wants it. The fact that every major platform has rolled out working, end-to-end cryptography tells us nothing about the preferences of their customers. They're wasting their shareholders' money on working security that no one wants, while reducing our common security." (Last week - ??)
Next: some company will cave to Rudd and lose all their business to a competitor with working crypto. Then Rudd will say:
* "Sure, everyone wants working crypto, but you can't always get what you want. Look at Sellout.com, plc: they caved to our demands to eliminate security and got destroyed in the market. We must defend the good corporate stewardship of Sellout.com, plc by punishing their competitors for not joining them in the race to the bottom."

Tuesday, 1 August 2017

'Real' people don't need encryption

Unfortunately, our Home Secretary here in the UK is the increasingly deranged Amber Rudd, who wants to break encryption in the name of security fascism.

She seems to be channelling the Australian Prime Minister, Malcolm Turnbull, who recently said:
The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia.
Here’s Rudd’s version:
I know some will argue that it’s impossible to have both – that if a system is end-to-end encrypted then it’s impossible ever to access the communication. That might be true in theory. But the reality is different.
Unfortunately, the source is behind a paywall if that’s the sort of thing that slows you down.

She goes on to say that “real” people don’t use encryption:
Real people often prefer ease of use and a multitude of features to perfect, unbreakable security. So this is not about asking the companies to break encryption or create so called “back doors”. Who uses WhatsApp because it is end-to-end encrypted, rather than because it is an incredibly user-friendly and cheap way of staying in touch with friends and family? Companies are constantly making trade-offs between security and “usability”, and it is here where our experts believe opportunities may lie.
I’m not sure what “opportunities” she means or why usability is scare-quoted, but there are lots of us who use certain channels because they are e2e encrypted rather than because of how nice they look. We have legitimate reasons for keeping secrets, not least of which are the things Amber Rudd says.

Want to see something even scarier from the same article?
So, there are options. But they rely on mature conversations between the tech companies and Government – and they must be confidential.
Let that sink in. Let. It. Sink. In. We won't be privy to the details of whether or how our conversations are to be laid bare to all and sundry. It'll be done and it'll be done in secret.

She finishes this way:
The key point is that this is not about compromising wider security. It is about working together so we can find a way for our intelligence services, in very specific circumstances, to get more information on what serious criminals and terrorists are doing online.
It might not be about compromising wider security but that’s what it will do. She obviously knows that or she wouldn’t be fielding those objections. She’s lying. She's obviously lying.

What not to do while anonymous

Ineffective security can be worse than no security at all. Being lulled into a false sense of security can cause us to engage in risky behaviours. This is true of anonymous browsing technologies such as Tor.  As the Tor Project site takes pains to tell us, Tor is by no means a panacea. We need to avoid certain behaviours to remain anonymous online even if we're using anonymisation technology.

Hiding our IP address and encrypting our traffic is not enough to remain anonymous. As the Tor Project puts it:
Also, to protect your anonymity, be smart. Don't provide your name or other revealing information in web forms. Be aware that, like all anonymizing networks that are fast enough for web browsing, Tor does not provide protection against end-to-end timing attacks: If your attacker can watch the traffic coming out of your computer, and also the traffic arriving at your chosen destination, he can use statistical analysis to discover that they are part of the same circuit.
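The end-to-end timing attack the Tor Project describes is easy to illustrate with a toy sketch (entirely synthetic traffic traces, not an attack on real Tor): if an observer can see packet timestamps at both ends of a connection, even delayed and jittered ones, simple statistical correlation of traffic volume is often enough to link them.

```python
import random

random.seed(42)

def traffic(horizon=60):
    """A toy bursty flow: each second is either busy or quiet."""
    ts = []
    for s in range(horizon):
        n = random.randint(5, 15) if random.random() < 0.3 else random.randint(0, 2)
        ts.extend(s + random.random() for _ in range(n))
    return sorted(ts)

def profile(ts, horizon=60):
    """Packets observed per one-second window: a traffic-volume fingerprint."""
    bins = [0] * horizon
    for t in ts:
        if 0 <= t < horizon:
            bins[int(t)] += 1
    return bins

def pearson(xs, ys):
    """Plain Pearson correlation of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

entry = traffic()                                             # seen leaving the user
exit_same = [t + 0.2 + random.gauss(0, 0.05) for t in entry]  # same flow at the destination
other = traffic()                                             # an unrelated flow

r_same = pearson(profile(entry), profile(exit_same))
r_other = pearson(profile(entry), profile(other))
print(f"same flow: r={r_same:.2f}, unrelated flow: r={r_other:.2f}")
```

Encryption changes none of the timestamps, which is why the Tor Project is careful to say this attack is out of scope for any low-latency anonymity network.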
Whonix is more specific on its Do Not page. Note: you should definitely check out Whonix if you are interested in online anonymity.

Here's their index of things not to do while trying to be anonymous.  All excellent advice, as you'd expect.
Things NOT to Do

    Visit your Own Website when Anonymous
    Login to Social Networks Accounts and Think you are Anonymous
    Never Login to Accounts Used without Tor
    Do not Login to Banking or Online Payment Accounts
    Do not Switch Between Tor and Open Wi-Fi
    Prevent Tor over Tor Scenarios
    Do not Send Sensitive Data without End-to-end Encryption
    Do not Disclose Identifying Data Online
    Do Use Bridges if Tor is Deemed Dangerous or Suspicious in your Location
    Do not Maintain Long-term Identities
    Do not Use Different Online Identities at the Same Time
    Do not Login to Twitter, Facebook, Google etc. Longer than Necessary
    Do not Mix Anonymity Modes
        Mode 1: Anonymous User; Any Recipient
        Mode 2: User Knows Recipient; Both Use Tor
        Mode 3: User Non-anonymous and Using Tor; Any Recipient
        Mode 4: User Non-anonymous; Any Recipient
        Conclusion
        License
    Do not Change Settings if the Consequences are Unknown
    Do not Use Clearnet and Tor at the Same Time
    Do not Connect to a Server Anonymously and Non-anonymously at the Same Time
    Do not Confuse Anonymity with Pseudonymity
    Do not Spread your Own Link First
    Do not Open Random Files or Links
    Do not Use (Mobile) Phone Verification
This is just the index. Visit the page to see why these are all bad ideas.

There are things you can do to help projects like this.

You can donate to Tor and/or Whonix. You can run a Tor relay. You can campaign and advocate for privacy and you can harangue your government representatives. You can support the Open Rights Group and the Electronic Frontier Foundation. And you can educate your loved (or hated) ones.

Monday, 31 July 2017

How we screw friends, families and strangers by being careless

If you know me, you'll know that my eyes are constantly aching from rolling at the phrase "if you've nothing to hide, you've nothing to fear."  One of the guiding principles and main motive force of privacy work is that most people believe this.  It's the fact that most people believe it that makes the accidental or deliberate shedding of personal data so valuable.

I won't go into the reasons the phrase is wrong here, largely because I have learned through experience that if I start, I'm unlikely ever to stop.  But I will go into some of the reasons why it's difficult to convince people that their privacy matters and then give some suggestions about how to do it.

It's easy enough to understand why people use that damnable phrase. I don't actually blame anyone for believing it, although my aching eye muscles could do with a break for at least a few minutes every day.  The scale and malevolence of how we're all being screwed by the people who have our data is - deliberately - largely hidden.  We're not told who our data will be sold to or what it will be used for. Fine print tells us that it "might" (or the even more insidious "may") be 'shared' with 'partners' but without any indication of exactly what data is being shared with whom or why.  We have a tendency to think that if people aren't telling us things in capital letters then the things are probably not very important.

It's also hard to connect the consequences of sharing data with any negative outcome. It's unlikely that we'll ever connect an instance of identity theft with the box we ticked on a website nine years ago, for example. Plus, of course, we often give away data to get cool stuff (10th sub free! apps that anticipate our needs etc) and we don't want to give that up, especially since we don't always understand why giving away that data might be bad. Nor should we, necessarily. The benefits might indeed outweigh the harm for some people in some cases. The problem is that we're not equipped to make that decision, because of the deliberate machinations of the companies who make money from our data and the complexity of the landscape.

So people like me need to come up with convincing examples. How can ticking this particular box harm you in the future?  This is hard, not because examples don't exist but because they have to cover that distance in time and place between the ticking of the box and the stealing of the identity. They have to show that it's in aggregation of data over a period of years that the greatest danger lies. We humans are not very good at internalising knowledge of that sort or at practising the regimen needed to do anything about it.

The examples I've had the most success with tend to be ones that show how poor privacy habits can screw our friends and family. I find this confusing - I hate my friends and family - but it seems to work for lots of people.

It's a very important point and one I harp on enough to contribute to the eye-rolling muscle strain of my friends and family (good - I told you I hate them): privacy is a group exercise. It would be good if we tried not to inadvertently screw each other the whole time through our own carelessness.

There are ways we can screw the people in our own networks through complacency and other ways we can screw complete strangers.  Please don't take this as a manual of how to screw people, by the way, treat it instead as a way to be mindful of how our actions can harm others whether we mean to  or not.

1. Harming friends
I frequently talk about the Amazon gift service because it is such a perfect example. You're an Amazon customer, you buy someone a gift to be sent directly to them. You've just given away an enormous amount of information about that other, innocent (well, not if they're one of my friends or family) person. Their address, their possible birth date or other significant date, the sort of things they like (or that you think they like) and so on. If Amazon already knows their address it can start building a social network of their friends and family and make inferences about them too. 

Why is this harmful rather than delightful? 

For one thing, your friends never asked you to hand Amazon their data. There might be all sorts of reasons they don't want that to happen. Even if (perhaps especially if) you don't know what reasons they might have for not wanting Amazon to have this data, you should at least ask them first and not ask or press the issue if they say no. They might have things to fear regardless of whether they have anything to hide. And they might have things to hide.

Second, harm may come from a variety of sources, malignant, benign or indifferent.  Couples or families might be hurt if one member receives targeted adverts based on a gift. It's not hard to imagine how trouble might be caused if one member of a couple received the gift of a sex toy in the post. It might also be problematic if the adverts someone were served while browsing were informed by a gift, wanted or otherwise. Inducting someone into a social network operated in secret by people who wish to sell us things is not a kind thing to do.

Third, the companies who buy this data collate lists of people they deem 'vulnerable', by which they mean vulnerable to being sold things they don't want or need. The information you shed about them contributes to aggressive targeting and other borderline con-artistry as well as out-and-out conning by less scrupulous firms.

Fourth, this data will certainly be stolen at some point. Hackers will use this data to do bad things to our friends. They'll steal their identity, which is very much easier if they know trivial facts about people such as where they shop and eat. They'll create digests of information about certain types of people and sell them to bad guys who specialise in screwing that type of person. For example, helpful gifts might indicate that the recipient is elderly. An unscrupulous company might (rightly or wrongly) conclude that the elderly person is especially vulnerable and target them for scams that match the gifts they've received.

Fifth, spam. You're putting people on lists that are sold to spammers - email, real world, knocking on our doors - I don't think anyone wants that.

2. Screwing strangers
There's a very real sense in which customers are becoming less customers and more sheep to be shorn, bags of organs to be harvested. Our gleeful introduction of others into this practice completes the analogy.  We're all the Judas Goat for faceless corporations, dragging our friends into dangers they didn't sign up to.

But it's worse even than that because those companies are also screwing their own employees based on our privacy choices.

Here's one of the most obvious examples: you know when you visit a restaurant and they ask you to rate the service on a card or - increasingly - on a touch screen? What on Earth do you think that's for other than to generate an excuse to deprive servers of their tips? A simple scale of dissatisfaction isn't going to help the restaurant improve its business, is it? With the card-based version, companies might be angling to seem caring about customers (while still changing nothing and punishing servers) but with the computerised version, we can be sure that servers will be screwed more. What kind of servers generate the most dissatisfaction?  Can companies find ways to incorporate these results into their existing racist or sexist hiring and firing policies? Can they generate brand new racist or sexist policies?

Well of course they fucking can. And will.

But look also at the wider picture. Much of restaurant technology is aimed at either getting people back out through the door as quickly as possible or selling them more stuff. To achieve this they (especially chains) do all kinds of worrying stuff. They greet you by name. They remind you of what you ordered last time you visited (even if it's a different location). 

The servers and lower to middle management are easy to punish if this does not go according to plan and customers sit around enjoying their meals instead of hurrying and/or ordering stuff they didn't want.

By gleefully shedding data we turn ourselves into sheep to be shorn. But we turn other people into sheep, too. And we turn the former farmers into serfs, serving at the whim of their owners, pursuing goals unrelated to their jobs and facing punishments that have nothing to do with how well they do them.

That's the harm. Don't make me roll my eyes.

A terrible idea

http://www.bbc.co.uk/news/av/technology-40676084/how-facial-recognition-could-replace-train-tickets

The URL says most of what you need to know.

Wednesday, 26 July 2017

Tuesday, 25 July 2017

Some solid advice

Caesars Palace in Las Vegas is holding this year's Defcon, a conference about hacking and security. There are good reasons to believe that scoundrels will be attempting to hack everything in sight and even better reasons to believe they have the skills to pull it off.

For this reason, the UPS business centre in the hotel has decided only to accept print jobs that come as an email attachment, not on a USB stick or via a link. This is a reasonable precaution and probably the best compromise they can make while still doing business. Email attachments aren't at all safe either, of course, but people will need to print stuff, I guess. In general, reducing the number of attack vectors is worthwhile but at a conference like this it might just goad people into getting creative...

Cory Doctorow reports at Boing Boing (from where I borrowed the photo for this post), also noting that Andy Thompson (aka @R41nM4kr) has offered a list of security essentials for attendees.  They are pretty sensible. I follow an almost identical list of rules whenever I am forced to leave the house.

Here's the part of Thompson's list concerned with internet access and connectivity:
  1. Unless absolutely necessary for a job function, disable WiFi.
  2. Disable Bluetooth on your computer and phone.
  3. Disable NFC connectivity on your phone and computer.
  4. If WiFi is absolutely required, ONLY use your own provided WiFi. I use a JetPack/MiFi and connect ONLY to that device.
  5. Always use a VPN as soon as you obtain WiFi access.
  6. Do NOT plug any network cable into the laptop.
  7. Do not plug any USB storage devices (hard drives, sticks, network adapters, Raspberry Pi’s, etc) into the laptop or phone. 
The importance of not connecting to public WiFi unless you really need to - and then only doing so over a VPN - cannot be overstated. I'd love to know more about the psychology behind our willingness to connect to random networks just because they happen to be there. We generally have no idea whether they are secure, whether they have been compromised or whether the operators have malicious intent. We don't even know if the network is legit: we tend to assume that if there's a WiFi signal with the same name as the venue, then it's operated by that venue.

It's frighteningly easy to intercept traffic on unencrypted wireless networks. It's almost as easy to write scripts to scan for things that might be passwords flying about the place.  So if you do need to use public or commercial WiFi, be sure to use a VPN.
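Those "scripts to scan for things that might be passwords" are depressingly simple. Here's a toy sketch (the payloads and field names are hypothetical, and this is a string scanner, not a real capture tool) of the sort of pattern-matching that works on plaintext traffic captured from an open network:

```python
import re

# Field names that commonly carry credentials in HTTP form posts
# and query strings. Real harvesting scripts use much longer lists.
CREDENTIAL_FIELDS = re.compile(
    r'(?:^|[?&])(pass(?:word|wd)?|pwd|secret|token|auth)=([^&\s]+)',
    re.IGNORECASE,
)

def find_credentials(payload):
    """Return (field, value) pairs that look like credentials in a payload."""
    return CREDENTIAL_FIELDS.findall(payload)

# Hypothetical plaintext payloads as they'd appear on an unencrypted network.
captured = [
    "GET /search?q=cat+pictures HTTP/1.1",
    "POST /login HTTP/1.1\r\n\r\nusername=alice&password=hunter2",
    "GET /api?token=abc123&page=2 HTTP/1.1",
]

for packet in captured:
    for field, value in find_credentials(packet):
        print(f"possible credential: {field}={value}")
```

A VPN (or universal TLS) defeats this entirely, because the attacker on the local network only ever sees ciphertext.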

I use my phone as a mobile hotspot with a VPN rather than use other people's WiFi.  I only make an exception when there's no mobile signal. Something tells me this won't be a problem in Vegas.

My list, if I happen to be leaving the country (especially to the US) has some additions:
  1. Log out of social media, email and messaging accounts on your laptop and phone. Remove any cookies that store passwords.
  2. Use a hardware token (I use a Yubikey Neo) to protect access to your password manager (you're using a password manager, right?)
  3. Send the hardware token in your checked luggage, don't carry it with you.
That way, nobody can force you to reveal your passwords. Of course, they might refuse you entry to the country and it will be quite inconvenient when your luggage is inevitably lost, but if these prices seem like they are worth paying, go for it. Also, you'll feel kind of like a spy.

Monday, 24 July 2017

Age verification

The UK government is threatening to implement age verification on porn sites because won't someone think of the children. This means that porn site users will have to prove they are 18 before they can feast upon the porn within.

I have to admit, I have some concerns about porn which can be summarised as:
  1. Lots of performers (especially women) are hurt by the porn industry. There are questions of whether consent is really possible when one's income relies on saying yes. Sex work is not necessarily just another job and there are certainly porn companies that take advantage of performers and their plight, if they have one. I have nothing at all against consensual performance and am entirely in favour of sex workers being allowed to work without criticism or harassment. But we usually have no way of telling what pressures the performers face and therefore what consent, if any, they are really able to give. I think we - as consumers of porn - need to be very careful.
  2. The messages children are likely to glean from porn are not positive. They could be, I reckon. Hangups about sex and sexuality from previous generations and religious nonsense are terrible things, and being positive and cool and non-judgmental about sex and sexuality is surely good. But it's clear that the vast majority of porn doesn't encapsulate great messages about agency and consent and equality. If a child's introduction to sex is mainstream porn, it seems likely that they'll have fucked-up ideas about how to treat other people, especially women. I would rather they learn sex-positive lessons from places other than porn.
The second item is most germane to the government's goal of age verification of porn sites but there are some problems. I'll stick with two:
  1. Literally everyone on the planet knows it won't work. It's the equivalent of - in the 70s and 80s - putting porn on the top shelves of newsagents, where children supposedly cannot reach with their short arms. It's like children buying booze and tobacco by the very easy means of asking someone older. Refusing to sell young people cigarette papers probably won't put up much of a barrier to their smoking a bit of whatever takes their fancy.
  2. Putting age verification in the hands of the people who sell the porn is open season for blackmail.
And this is the thing. Make up your own mind about porn: it's not illegal to make it (for the most part) or consume it (usually) in the UK. But if we have to register our consumption of porn, we're at the mercy of laws that will certainly change for the worse. 

Being a registered porn consumer will automatically put you in the frame for sex crimes, for example, regardless of any other suspicion. The register of casual porn users will become a list of automatic suspects. 

And porn companies, who have our credit card details, would be in an excellent position to threaten us, fake our browsing or chat behaviour or otherwise fuck us over.

And of course that's all before worrying about how the whole registration and access business might work, which is nightmarish in itself just from an engineering perspective.

TL;DR: It's complicated. Age verification won't protect anyone and it'll certainly expose people who haven't done anything wrong to undue and improper scrutiny. 

And above all, it won't protect the people who need the most protection: the performers.


Wednesday, 19 July 2017

Evil Thoughts: be the fox in your own hen house

A lot of people feel that we only really wake up to security when we’re stung by an attack. I’m not sure this is true. For example, we might learn less about security when our house is burgled than we do when we lock ourselves out. We always manage to get back in eventually, after all. We might find inventive ways to gain entry or call a locksmith who will have the door open in about five seconds. Either way, we learn something about our house’s vulnerabilities and how secure it really is.

We might remember that one slightly dodgy window latch we’ve been meaning to fix and wonder if we might be able to wiggle it open from outside. We might use an improvised device to see if we can open a door from the inside through the letterbox. We wonder whether we could use that rock in the garden to smash a window. We worry about setting off the alarm, but then remember that nobody takes any notice of alarms anyway.

Whatever – and regardless of whether we succeed – we’ve suddenly thought a lot more about home security than we ever have before. In contrast, when we’re burgled we tend to assume that the burglars have secret knowledge or skills because, well, that’s what burglars do. We expect burglars to be able to gain entry if they try long and hard enough, but we assume this is because of their ninja skills, not because our houses are all fundamentally insecure.

It’s only when we try to break in ourselves that we realise the truth.

This is why penetration testing (aka pen testing) exists. A pen test is an authorised attack on a system, designed to expose its vulnerabilities so that they can be fixed. It’s the equivalent of the desperate householder trying to break into their own home. There are many pen testing specialists out there and the field seems to be growing, because to take security seriously you must see the system from the outside - and tech companies are increasingly recognising this.
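To make the mindset concrete, here's a deliberately simple sketch of one of the first things a pen tester does: knock on each of a machine's doors and see which ones open. It's a toy TCP port scanner in Python (stdlib only) - and, in the spirit of a *pen test* rather than an attack, only point it at machines you're authorised to probe, such as your own.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        # A completed TCP handshake means something is listening there.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Knock on a few common doors of your own machine.
print(scan_ports("127.0.0.1", [22, 80, 443, 8080]))
```

Real scanners (nmap and friends) are vastly more sophisticated, but the principle is exactly this: you learn what your house looks like from the outside by rattling the windows yourself.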

This is also true of our own personal systems: our networks of computers, tablets, phones, ebook readers, digital assistants, smart devices, connected lightbulbs, software, services (such as social networking, online purchasing etc) and – importantly – our friends and family. We need to think about those things as if we were trying to gain illicit access to our own stuff if we are to protect our privacy and safety.

A trivial example: we might not feel a need to lock our computers when we leave the house, because the house itself is locked and anyway, it’s annoying to have to type in passwords every time the screen locks. But we’ve just seen how easy it is to break into a house. It’s not unreasonable to expect that – increasingly – burglars will enter our homes to gain access to our devices for the information they contain as much as for the value of the hardware. Leaving aside for now the standard (and incorrect) defence that “there’s nothing interesting on my devices anyway” (which I’ll talk about a lot more in weeks to come), our devices are very useful to people with ill intent. They might not have any particular grudge against us, but might use the data on our devices to steal our identities, creating new credit accounts in our name, spending the contents and saddling us with the debt and the damage to our credit ratings.

We need to think about the things a bad guy might do if they had physical access to our devices and implement safeguards which will stop them doing harm or at least make it too difficult to bother. We need to think like the burglar rather than like the complacent homeowner.

A more complicated example: a security setup is only as good as its weakest link. Sometimes the weakest link is a person or our relationship with that person. Our friends and family might be leaking information about us that could be useful to an attacker. Which means, of course, that we are probably doing the same to them. Here is one way we can weaken other people’s security without necessarily knowing it:

When we use Amazon to buy a gift for someone (to be sent to them directly), we’re telling Amazon an awful lot about that person. We’re telling Amazon that they are associated with us in some way, that perhaps it is their birthday or anniversary, the kind of things they like (or at least the things we think they like) and so on. If our friend also has an Amazon account – which is very likely – then Amazon will know even more. It will know about the people they buy gifts for, the other people we buy gifts for, and might be able to track which of these other people also buy gifts for each other. They’ll be able to infer how good we all are at gift buying, based on the differences between what we buy for other people and what they buy for themselves. They can infer the strength or quality of relationships based on the money we all spend on each other and even on how late we leave it before ordering something, whether we look at their wish lists and so on. We’ve given away a lot of potentially exploitable information about people who didn’t give us permission to do so and probably don’t know that it has happened. And chances are they’re doing the same to us.

All this information could be available to criminals whenever Amazon is hacked, which will certainly happen quite often.

This is why we need to think like burglars rather than householders. We need to act like we’re locked out and have to find interesting ways to get back in through improvised means. We need to be the fox in our own hen-houses.

But while I think this is sound advice, it isn’t very practical yet. I’ll get around to more practical advice in the coming weeks. In the meantime, here is an example to get you thinking about the criminal mindset you’ll need to keep you and your friends safe.

When you last changed a password because you forgot the old one, did you do something like open a new message in your email client to temporarily store it before you could memorise it or store it somewhere more secure (I’ve seen people do this)? Do you know whether the email client saved that message as a draft? Draft emails are often a rich source of useful information, partly because we all tend to forget they exist.

Be sneaky! Tell me about your sneaky ideas in the comments.

Tuesday, 18 July 2017

DRM needs to protect people other than the rights holders

I'm all for people being able to protect the content they've created from being abused, but DRM
(Digital Rights Management) is frequently used for less noble purposes.  I'll go into this in this week's Wednesday post.

The World Wide Web Consortium (W3C), and particularly its director Tim Berners-Lee (yes, that Tim Berners-Lee), recently decided to ignore numerous objections from W3C members and the internet-using public in general and go ahead with its plan to incorporate DRM into the web's body of standards.

There are numerous problems, which I'll talk about tomorrow (or you can read the text of the EFF's appeal against the decision here).  For now, the appeal gives a good sense of who and what we're fighting.

Creators deserve protection, but publishers shouldn't get to decide how consumers use the content they've bought, how researchers investigate the security of DRM systems, or which innovations are allowed to succeed. This is the battleground. I'll write more about it tomorrow.

Breaking encryption

Breaking encryption is a really bad idea. There's no such thing as a back door that 'good' people (such as governments) can use and bad people (such as criminals) cannot. This doesn't prevent virtually every government from pledging to force technology companies to implement encryption back doors in the false name of security against terrorist attacks. It won't work, because terrorists don't have much incentive to obey the law. It rather reminds me of the little green visa forms you had to fill in when flying to the US: you had to tick a box to say you hadn't committed any genocides, as though lying on the visa form was the greater offence.
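The structural problem is easy to sketch. A typical back-door proposal amounts to key escrow: every session key also gets wrapped under a master "golden key" that the authorities hold. Here's a toy Python illustration - the XOR "cipher" is a stand-in for real cryptography, never use it for anything real - showing that whoever obtains that one escrow key can read every message ever sent.

```python
import secrets

def xor(data, key):
    """Toy XOR 'cipher' (a stand-in for a real one; never use for real data)."""
    pad = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, pad))

ESCROW_KEY = secrets.token_bytes(32)  # the mandated "golden key"

def send(plaintext):
    session_key = secrets.token_bytes(32)     # fresh key per message, as in e2e schemes
    ciphertext = xor(plaintext, session_key)
    wrapped = xor(session_key, ESCROW_KEY)    # the back door: session key escrowed
    return ciphertext, wrapped

# Intercepted traffic plus ONE stolen escrow key = every message, ever.
traffic = [send(m) for m in (b"meet at noon", b"the cheque cleared")]
stolen = ESCROW_KEY                           # one breach, leak or bribe away
for ciphertext, wrapped in traffic:
    session_key = xor(wrapped, stolen)
    print(xor(ciphertext, session_key))       # prints b'meet at noon', b'the cheque cleared'
```

The escrow key is a single point of failure that must be kept secret forever, by everyone who holds it, against every adversary - which is exactly the property no real organisation has ever managed.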

Australia's government is the latest to adopt this pre-beaten dead horse of a stupid idea. They're copying the UK, which makes me feel guilty. I feel I must apologise for the conduct of our nation. Sorry, Australia.

The article I quoted goes over the usual stuff but I found the following amusing (emphasis mine):
But some experts, as well as Facebook, warned that weakening end-to-end encryption services so that police could eavesdrop would leave communications vulnerable to hackers.
The quote from Australian Prime Minister Malcolm Turnbull is exactly as terrifying as it is hilarious:
The laws of mathematics are very commendable but the only law that applies in Australia is the law of Australia.
I look forward to the anti-gravity bill. 

Relaunch

https://www.etsy.com/uk/listing/189665986/gothic-art-tattooed-tattoo-evil
Absolutely not what this blog was named after.
Welcome to the relaunched evilwednesday. There will be a few changes around here.

The biggest change is that I'll be posting more often and more briefly. I'll try to limit myself to a few sentences on each post unless I don't. It is my blog :) I'll also write some commentary every Wednesday about various topical things. That's generally where I'll be more expansive.

As always, the topics will relate to privacy and open rights. Other topics I'm interested in (human rights, social justice, atheism, skepticism, cats) will appear on another blog (URL to follow when I've decided where to put it) and those posts will be cross-linked here without comment so you can more easily ignore them.

Finally, I'll be trying to publicise this content more widely and generate some interest in privacy/open rights activism.

If you have anything to contribute, the comments are your playground.