Reading much of the popular press on the subject of last year's HMRC data breach, you'd think that careless handling of personal data was purely a government problem. In fact, a review of incidents that were made public during 2007 tells a different story, with banks, building societies and mobile-phone companies all sheepishly admitting to mishandling a range of sensitive data.
The security business's first-order reaction to this sort of thing is a predictable sales pitch: news interviews with relevant vendors and demands for wholesale encryption and improved security. I dread to think how many reporters will be at Infosec this year, and how much data will be lost to poorly thought-out rollouts of full-disk encryption.
While I have long been a self-confessed crypto geek, simply blanketing the market with encrypted laptops, PDAs and CDs will not solve the overall problem. Certainly the risks would be reduced if the lost or stolen devices were protected by strong encryption, but I think it's far more important to look at the root causes to really make a difference. Perhaps surprisingly, these are arguably more psychological than technical.
For example, the HMRC breach involved the personal details of every family in the UK receiving child benefits. It's quite likely that the personal data of the employee who posted the discs, and/or that of some of their friends and family, was actually on them. It is therefore unlikely that they consciously decided to be careless with the data.
A more likely explanation is that they didn't consciously think of the data as sensitive; it was, after all, just another database, and they handled such data every day. There's plenty of anecdotal evidence that those with regular access to highly sensitive data often become somewhat blasé about how they look after it, and particularly dismissive of the risks to lower-value information. Regular exposure seems to breed immunity to concern about risk.
The same phenomenon of gradual acclimatisation to, and acceptance of, increasing risk is well documented in safety-critical systems worldwide. NASA in particular has been guilty of this, with the Space Shuttle disasters clearly linked to a "culture of risk" within the organisation. In the armed forces, similar problems lie behind many friendly-fire incidents.
Over time, the level of acceptable risk gradually climbs, until behaviours that once seemed foolhardy become commonplace and accepted. This is a systems failure rather than a personal one.
In security we see this all the time. How many times have you heard the phrase "nobody would be interested in this"? Or "that's a bit over the top"? Even in the face of clear technical evidence of vulnerabilities, many people simply choose to ignore them on the basis that it "will never happen to us". I'm sorry to say that some of the worst offenders are in the computer business, where security is seen as just one of those things that gets in the way. Ignorance is one thing, but conscious denial is a far greater sin.
Fixing the technical risk by encrypting the laptop, for example, will mitigate the danger of loss or theft. But if the laptop's owner still treats the sensitive data without due care, the overall risk remains. What are the chances that the bag containing the encrypted laptop will also hold printed copies of some of the same data? Sure, the volume of data will be smaller, but in many cases that will be little consolation.
Likewise with removable media. Sure, if the HMRC discs had been encrypted we'd all be a lot happier. But if large volumes of printed material are treated with similar carelessness we still have a problem.
Once again, there's a need to balance technical measures carefully with procedural and educational precautions. Most of the current debate focuses solely on the technical issues, which are arguably easier to fix. Sensitive information must be adequately protected regardless of how or where it is stored. To achieve that, we need to make sure users receive security upgrades, not just their laptops.
- Nick Barron is a security consultant. He can be contacted at email@example.com.
Risk is a state of mind
By Nick Barron on Mar 20, 2008 2:35PM