In 2007, South Warwickshire General Hospitals NHS Trust decided to let some staff share smartcards to access patient records. They did this for what sounds like a good reason: logins were taking too long (especially in A&E) and sharing smartcards meant they could treat emergency cases more quickly. They must have been serious; the move was in breach of the NPfIT security policy.
I don’t know the full extent of the repercussions for privacy, but not knowing which doctors accessed a patient’s medical records hardly seems likely to end well.
This is a common problem with security systems: they don’t take account of how people will develop (often quite elaborate) workarounds for the operational problems security causes. They’ll teach the behaviour to new employees, and usually they won’t think to tell managers about their brilliant innovation. I have an example:
Years ago I worked for what in those days we called an e-commerce firm. We found that inconsistencies were finding their way into the database and showing up in the application: products were showing up as in stock when they weren’t. The database was ridiculously over-complicated and the software was worse, so it took me weeks to pore through it all, and I came up with nothing. I was talking about my frustration over lunch with a colleague in the data entry department and she happened to mention how pleased she was with the new database update tool.
“Er… w-what new tool?”
It turned out that someone in data entry had complained that the tools they had were no good, and since the new tools were about a year behind schedule, they’d asked a developer to write them a quick hack to fix a particular problem. It was a half-hour job, so the developer didn’t think to mention it to anyone. It allowed data-entry people to inject SQL into the live server, and none of them were trained in SQL. Fortunately, to my knowledge nobody ever used that e-commerce system, but it cost me weeks of my time and caused me to shout quite a lot at the developer.
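To make the danger concrete, here’s a minimal sketch of why a “run whatever SQL the user types” tool corrupts data in the hands of untrained users, contrasted with a tool that exposes only the one operation they actually need. The schema, table, and function names are hypothetical stand-ins, not the real system:

```python
import sqlite3

# Hypothetical schema standing in for the e-commerce database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, stock INTEGER)")
conn.executemany("INSERT INTO products VALUES (?, ?, ?)",
                 [(1, "widget", 5), (2, "gadget", 3)])
conn.commit()

def quick_hack(user_sql):
    """The 'half-hour job': execute raw SQL typed by a data-entry user."""
    conn.execute(user_sql)
    conn.commit()

# An untrained user trying to mark ONE product out of stock forgets the
# WHERE clause and silently zeroes stock for every product in the table.
quick_hack("UPDATE products SET stock = 0")

def set_stock(product_id, stock):
    """A safer tool: one constrained operation, values bound as parameters."""
    conn.execute("UPDATE products SET stock = ? WHERE id = ?", (stock, product_id))
    conn.commit()

# Repairs only the widget's stock; no other row can be touched.
set_stock(1, 5)
```

The point of the safer version isn’t just parameter binding; it’s that the tool’s interface makes the accidental table-wide update impossible to express in the first place.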
The data-entry people weren’t at fault. The development process wasn’t at fault. The developer certainly was, but he wasn’t disciplined beyond my shouting at him. Maybe I was a bit at fault because I should have known he’d written the tool, but it didn’t go through CVS so I didn’t know about it. Maybe I should have been better at training… But that’s the point: it’s easy to apportion blame after the fact.
In the Warwickshire hospital case, everyone was complicit, up to and including the trust management. Were they at fault? Or were the software developers at fault? Should the developers have insisted that the system be tested in a live environment? Or that they talk to and/or observe doctors in A&E before designing the system? Yeah, they certainly should have done that, but what if they weren’t allowed? Doctors’ time is valuable, and maybe the analysts only got to talk to managers. Maybe the system went live without proper testing due to deadlines that were out of the developers’ control. Maybe the goalposts kept moving and the client wouldn’t accept revised schedules…
And that’s the point. Systems – and especially security systems – are complicated. They are complicated further by the environments in which they have to be built. There’s no such thing as a security system that doesn’t have an unrealistic deadline or conflicts of interest or obstinate people or ignorant people. Even if there were, something fundamental and probably unnoticed would have changed between agreeing the spec and delivering the solution.
So security very often gets in the way without any obvious benefit to the core business of an organisation, the people who work there or its customers.
But blame is often hard to assign and lessons are difficult to learn. Developers can’t say “we won’t do that again” because they will. They’ll have to. And clients will probably never understand the realities of software development and security because they think they don’t have to.
Life will always find a way to flummox systems if there’s a local reason to, even if it’s a net loss to everyone. It’s a kind of tragedy of the commons.