Over the years, we've seen big improvements in the way applications are developed, with new methodologies, development frameworks and testing tools helping to reduce the number of security issues with new systems. Yet, systems continue to be compromised.
A recent exercise at the CanSecWest conference in Vancouver challenged participants to find new ways of hacking three machines: a PC running Vista, an Ubuntu Linux desktop and a MacBook Air.
After the first day's effort, attacking from the network, no machine had been compromised. On day two, attackers were allowed to use the machines' standard applications to visit websites, read email and perform other typical client activity.
In just a few minutes, the MacBook was compromised through a new vulnerability in its Safari web browser. On the third day, additional applications were used, and the Vista PC was compromised through a vulnerability in a newly installed copy of Adobe Flash.
The situation is often presented as an arms race between smart hackers and equally clever developers, administrators and security professionals aiming to keep systems secure.
Unfortunately, this picture doesn't hold true. It isn't cleverness that's at the heart of information insecurity.
In a recent routine audit, I came across a service exposed to the internet that the vulnerability scanner didn't recognise. It looked to be running a rather interesting plain-text protocol.
Research revealed that it belonged to a popular cross-platform version control system, one never designed for untrusted environments: by default, it doesn't even require passwords.
With the client freely downloadable, I quickly gained access to the source code for the firm's key application. It was easy to find passwords and account details hard-coded into the source.
Not only did the repository contain source, it also held a copy of the live data. Soon, streams of bank account numbers and sort codes were scrolling across my screen.
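Finding hard-coded credentials in a checked-out source tree needs nothing more sophisticated than pattern matching. A minimal sketch, assuming a couple of illustrative regular expressions (the patterns and the sample code below are hypothetical, not from the audit itself):

```python
import re

# Patterns that often betray hard-coded credentials in source code.
# These two are illustrative; a real scan would use a larger set.
CREDENTIAL_PATTERNS = [
    re.compile(r'(?i)(password|passwd|pwd)\s*[=:]\s*["\'][^"\']+["\']'),
    re.compile(r'(?i)(api[_-]?key|secret)\s*[=:]\s*["\'][^"\']+["\']'),
]

def find_hardcoded_credentials(source_text):
    """Return (line_number, line) pairs that match a credential pattern."""
    hits = []
    for number, line in enumerate(source_text.splitlines(), start=1):
        if any(p.search(line) for p in CREDENTIAL_PATTERNS):
            hits.append((number, line.strip()))
    return hits

sample = 'db_user = "app"\ndb_password = "s3cret"\ntimeout = 30\n'
print(find_hardcoded_credentials(sample))  # -> [(2, 'db_password = "s3cret"')]
```

A few lines like these, run across a whole repository, surface exactly the sort of findings described above in minutes.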
Maybe this is an exception, but time and time again I come across the same security mistakes. The CanSecWest challenge shows how the nature of vulnerabilities has changed. Many threats target the client.
Whether they are delivered through email or a web browser, they all originate from inside your security perimeter. It also shows the importance of patching the non-default applications as well as the “standard” ones.
Missing patches are the most common error, found in just about every audit. Even in the best cases there will be key patches missing – especially the non-Microsoft ones.
If you want to find out whether your security has been compromised, and when, you need to check your logs. Of course, the next most common mistake is to forget to enable logging on your devices or to delete the logs after a short amount of time.
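Even basic log analysis pays off. A minimal sketch of spotting repeated failed logins, assuming syslog-style lines like those sshd writes; the format, addresses and threshold are assumptions, not taken from any particular device:

```python
import re
from collections import Counter

# Matches sshd-style failure lines; real devices vary, so treat this
# pattern as an illustrative assumption.
FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def suspicious_sources(log_lines, threshold=3):
    """Count failed logins per source address; flag those at or over the threshold."""
    counts = Counter()
    for line in log_lines:
        match = FAILED.search(line)
        if match:
            counts[match.group(2)] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}

logs = [
    "sshd[101]: Failed password for root from 203.0.113.9 port 4022",
    "sshd[102]: Failed password for invalid user admin from 203.0.113.9 port 4023",
    "sshd[103]: Failed password for root from 203.0.113.9 port 4024",
    "sshd[104]: Failed password for alice from 198.51.100.7 port 4100",
]
print(suspicious_sources(logs))  # -> {'203.0.113.9': 3}
```

None of this works, of course, if logging was never enabled or the logs have already been rotated away.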
When it comes to writing code or installing applications, particularly when you're up against a tight deadline, it's easy to make mistakes. To guard against this, it's important to test.
When doing this, think about the specific threats and risks the system faces, not just whether it works for well-behaved users. Code that is never exercised with hostile input is where vulnerabilities go unnoticed.
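Testing against threats can be as lightweight as feeding an input validator the kind of strings an attacker would try. A minimal sketch; the validator and the attack strings are illustrative, not from any real application:

```python
import re

def is_valid_sort_code(value):
    """Accept only UK sort codes of the exact form NN-NN-NN."""
    return bool(re.fullmatch(r"\d{2}-\d{2}-\d{2}", value))

# Test the happy path first, then test like an attacker would.
assert is_valid_sort_code("12-34-56")
assert not is_valid_sort_code("12-34-56; DROP TABLE accounts")
assert not is_valid_sort_code("")
assert not is_valid_sort_code("12-34-567")
print("all input-validation checks passed")
```

Using `re.fullmatch` rather than `re.search` is the point of the design: the whole input must match, so trailing injection payloads are rejected rather than ignored.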
So, despite the prominence of security issues, the same mistakes, from coding to implementation, through to operations, prevail. But the biggest mistake is to ignore the connections between these areas.
Security is viewed not as a process but as a set of discrete functions that can be fulfilled by the acquisition of individual products and services. It's not enough to simply install the latest security device.
There needs to be a continual process of understanding what is happening on the network – which means monitoring logs, understanding vulnerabilities, developing effective approaches to patching, designing secure systems and continually testing and improving.
As the costs of security breaches continue to rise, security needs to become much better. But it isn't the cleverness of the hackers, it's the mistakes of the developers, administrators and operators that are the biggest threat to security.
Ian Castle, CISSP, is a senior consultant at information security consultancy ECSC
See original article on scmagazineus.com
Learn from these mistakes
By Ian Castle on Jun 12, 2008 3:53PM