Traditional anti-virus and encryption systems are failing, according to a panel of experts at the RSA 2010 conference in San Francisco.
Successful detection rates for popular anti-virus packages currently sit between 70 and 90 per cent of all samples, said Ed Skoudis, co-founder of security consultancy InGuardians.
While increased use of heuristics and behavioural monitoring may be helping overall, he said, the outlook is not rosy; even so, companies should keep using such systems.
“Don't throw the baby out with the bathwater,” he said.
“It helps to cut down on some of the clutter, but we can't have just that as a single point of protection any more.”
The best defence from an IT administrator's perspective is vigilance, he said. In particular, admins should check outbound web proxies for excessive activity or suspicious connection points. Logs of DNS resolution failures are also a useful source, he said.
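That kind of check is easy to automate. The sketch below flags clients generating unusual numbers of DNS resolution failures; the log format and threshold are illustrative assumptions, not anything described in the talk.

```python
# Hypothetical sketch: scan a DNS log for clients producing many
# resolution failures (NXDOMAIN), a possible sign of malware probing
# for command-and-control hosts. Log format and threshold are made up.
from collections import Counter

FAILURE_THRESHOLD = 50  # failures per client before flagging (arbitrary)

def flag_noisy_clients(log_lines, threshold=FAILURE_THRESHOLD):
    """Return {client_ip: failure_count} for clients over the threshold.

    Each log line is assumed to look like:
        "<client-ip> <queried-name> <status>"
    where status is e.g. NOERROR or NXDOMAIN.
    """
    failures = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) == 3 and parts[2] == "NXDOMAIN":
            failures[parts[0]] += 1
    return {ip: n for ip, n in failures.items() if n > threshold}
```

The same counting approach works for proxy logs: tally requests per internal host and flag outliers or connections to unfamiliar destinations.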
Companies should be segmenting their networks with internal firewalls to minimise the effect of any outbreaks, he warned.
Dr. Johannes Ullrich, chief technology officer of the SANS Internet Storm Center, agreed, saying that network segmentation was vital to effective security.
Dr. Ullrich also warned against over-reliance on encryption and, in particular, SSL.
“Encryption is becoming a losing battle. You have to overbuild on encryption now - pick the strongest cypher you can and then re-encrypt with another package.”
He said that there were increasing problems with SSL as an encryption tool. Man-in-the-middle attacks, in which an attacker intercepts an encrypted traffic stream, disrupts the connection, and then re-establishes it with each endpoint to harvest data, are becoming increasingly commonplace.
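A basic client-side defence against that kind of interception is strict certificate and hostname verification, so a connection fails closed rather than silently accepting an attacker's certificate. A minimal sketch using Python's standard `ssl` module, assuming a reachable HTTPS host:

```python
# Sketch: enforce certificate and hostname checks so a man-in-the-middle
# presenting a forged certificate is rejected. Python's default context
# already enables both; the point is never to switch them off.
import socket
import ssl

def connect_verified(host, port=443, timeout=5):
    """Open a TLS connection that refuses unverifiable peers."""
    ctx = ssl.create_default_context()   # loads the system CA bundle
    ctx.check_hostname = True            # name on the cert must match host
    ctx.verify_mode = ssl.CERT_REQUIRED  # unverifiable peers are refused
    sock = socket.create_connection((host, port), timeout=timeout)
    return ctx.wrap_socket(sock, server_hostname=host)
```

An interceptor splicing into the stream would then have to present a certificate chaining to a trusted CA for the requested name, which plain connection hijacking does not provide.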
Ultimately, the best protection for a network, Skoudis concluded, was the IT administrators themselves.
“Stop looking for the silver bullet, there isn't one, it won't happen,” he said.
“There is no silver bullet, except possibly you. Build your skills up. What if we're the defence we've all been waiting for?”