The room was full of federal officers, police, detectives and forensic examiners, mostly from U.S. agencies but with a couple of foreign police forces represented. Oh, and one journalist, fielding suspicious glances from his classmates.
Interesting as the coursework was, just as fascinating was listening to the war stories of the others present. The impression I was left with was that the technology matters less and less; it's about policies, people and abuse. People focusing on the role of technology in forensic processes are fighting the wrong battles.
One of the attendees in particular caught my attention: a forensic examiner attached to a police force, working in the computer crimes division. He'd moved across from the Crimes Against Children unit, and said that the move made a lot of sense since his unit had been providing 80 percent of the computers being examined by the computer forensics division. Others had similar backgrounds, and noted analogous incidents.
It's possible that there are relatively few corporate cases in their labs because companies simply aren't keen on reporting crimes, much less turning over desktop machines to be disassembled by police, but the conclusion seems inevitable: the isolated cases of child pornography reported in the media are only the tip of the iceberg. The number of cases is growing, but that's partly because investigators are getting better at infiltrating the groups and pursuing investigations.
Under these circumstances, I'm glad that forensic professionals are able to do an effective job of examining seized data. On the other hand, that perspective can sit uneasily with users who argue in favor of protecting privacy, liberating encryption, and generally giving the finger to federal snooping. These users are generally the ones who take steps to make intelligence-gathering difficult, which may be a valid form of protest, but tends to miss the point.
Many in the open source community in particular are quick to put down forensic techniques. They point out how easy it is to conceal information if you try hard enough - encrypted file systems, steganography, Internet anonymizers and so on. And that's true, up to a point. There's also the argument that giving agencies (or corporate admins, for that matter) sweeping powers to seize computers and conduct forensic analyses is an invitation to abuse - an argument with which I sympathize to some extent.
However, I'm not sure that the conflict is relevant. Forensic analysis usually happens when a case has already been made - it's evidence, and court orders and search warrants are all part of the process of gathering evidence. And most of that gathering process is fairly run-of-the-mill. Taking hard disks apart and running the platters under spectroscopy, or brute-forcing encrypted files, wasn't low on the agenda; it wasn't on it at all.
Certainly it was surprising to me just how uncomplicated most of the forensic analysis conducted by these agents was. Most of the information the examiners are looking for tends to be out in plain sight (or near enough - a deleted file is easily located on Windows partitions). The challenges tend to be ones of scale (examining many gigabytes of data for clues is very time-consuming) and correlation (identifying trends and tracking criminal associates) rather than technical ones.
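The "plain sight" point is worth unpacking: on FAT and NTFS volumes, deleting a file typically just marks its directory entry as free, so the underlying data stays on disk until it happens to be overwritten. A minimal sketch of how an examiner's tool might recover such files - purely illustrative, not any vendor's actual implementation - is signature-based carving: scan the raw bytes of a disk image for known file headers and footers.

```python
# Minimal file-carving sketch: scan a raw image for JPEG signatures.
# Illustrative only; real forensic tools handle fragmentation,
# filesystem metadata, and many more formats.

JPEG_SOI = b"\xff\xd8\xff"  # JPEG start-of-image marker
JPEG_EOI = b"\xff\xd9"      # JPEG end-of-image marker

def carve_jpegs(raw: bytes) -> list[bytes]:
    """Return candidate JPEG blobs found anywhere in a raw byte stream."""
    found = []
    pos = 0
    while True:
        start = raw.find(JPEG_SOI, pos)
        if start == -1:
            break
        end = raw.find(JPEG_EOI, start)
        if end == -1:
            break
        found.append(raw[start:end + 2])
        pos = end + 2
    return found

# Simulate a disk image: a "deleted" JPEG sitting amid unrelated bytes.
image = b"\x00" * 512 + JPEG_SOI + b"fake image data" + JPEG_EOI + b"\x00" * 512
print(len(carve_jpegs(image)))  # 1 candidate recovered
```

Nothing here is sophisticated - which is exactly the point: when the data hasn't been deliberately hidden, recovery is a matter of patience and disk space, not cryptanalysis.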
Which makes me wonder about the paranoid black-helicopter brigade of privacy advocates. By all means resist efforts to undermine your privacy, and encrypt sensitive data. Fight government initiatives to unjustly seize and analyze your drives. But face it: if there's a search warrant and court order giving the examiners the right to put your system under the microscope, claiming to have lost the keys to your encrypted volume is not helping your credibility. A little cooperation would help the process along. Just because there's technology out there that will make the feds' lives a misery doesn't mean you are compelled to use it, no matter how you feel about the authorities. That's focusing on the technology, and ignoring the bigger picture.
The same applies in corporate environments, though that gets trickier because whereas government agencies are supposed to be accountable to due process, corporate admins can get away with murder. Sure, they're accountable if they're caught, but who's looking over your CSO's shoulder?
But even Guidance's network-capable forensic tool, which gives admins the ability to conduct those same examinations on a remote corporate PC, in real time and without alerting the user, doesn't concern me unduly. The potential for abuse is there, but that's a policy issue, not a technical one. An abusive admin will just find other ways to abuse his position. Being paranoid and trying to bypass inspections is not usually a winning position to adopt.
To forensic examiners, paranoid individual users aren't of much concern anyway. The targets are technically illiterate criminals, and corporate users who breach compliance requirements. And you'd be hard-pressed to find a CSO who's keen on the idea of a user with a Linux system with a StegFS partition under his desk. You may just be setting yourself up as a target, and this is not a comfortable time to be seen as such.
The bottom line is simple. Sure, there's technology that will make forensic retrieval of your private data very difficult. But there are few instances where you'll get away with using it at all, never mind without arousing suspicion. Instead of fighting to make the system a pain for everyone concerned, focus on the policies which permit abuse, and work to have them repealed or mitigated, rather than pouring efforts into isolated guerrilla tactics.
Jon Tullett is U.K. and features editor for SC Magazine (www.scmagazine.com).