Top 10 IT myths

By Iain Thomson and Shaun Nichols


5. Apple/Linux code is more secure than Windows

Iain Thomson: As I've said before, there is nothing inherently superior, from a security standpoint, about Windows, Linux or Apple's operating system. All contain flaws; it's a question of how they are examined and treated.

Apple's operating system is a masterful piece of work, but from a security standpoint it does have exploitable flaws. The thing is, malware writers aren't really interested in Apple software. They are in it for the profit, and so write code for the 90 per cent of computers rather than the 10 per cent. If Apple's and Microsoft's market positions were reversed, I sometimes wonder whether I would get the same volume of hate mail from Microsoft fanboys telling me I'm a stooge in the pocket of Steve Jobs' marketing machine.

Linux, too, is not inherently secure, as any number of hacks has shown. The ace in Linux's deck is that it has an army of fans who constantly check and recheck the code, and who fix faults as soon as they arise. Commercial companies will never be able to match this support resource, either in size or dedication.

Meanwhile, as I've mentioned, Windows is the target everyone is pushing to beat. It doesn't help that Microsoft software is not exactly the best in the world - it's good enough to do the job and that's it. That sums up Microsoft's business strategy in a nutshell.

Add to that the fact that, because its software is so ubiquitous, the company is hyper-sensitive about issuing dodgy patches. If a duff patch takes down servers, Microsoft's business clients - its core customers - scream loud enough to wake the dead.

Shaun Nichols: As the largest target, Microsoft understandably gets the most press for its vulnerabilities. The fact that the company issues patches far more often than others doesn't help either. But one look at the massive patches Apple issues every few months should tell you that OS X isn't really bulletproof either.

Really, the argument is rather pointless when it comes to overall security. A well-patched system with basic antivirus protection should be more than secure enough in the right hands, regardless of who wrote the OS. Likewise, a foolish user lacking in common sense can get just about any machine infected with malware.

This brings up one of the more under-discussed parts of IT security: the so-called 'meatware' problem. All the security software and proper application coding in the world won't help much when a user wants to install software that is hiding a trojan. It's why social engineering tricks such as fake codecs and phony antivirus scans remain by far the most popular ways of delivering malware.

Personally, I'll take a well-patched system and some common sense over heavy-duty protection and smug naivety any day.

4. AI is just a fast computer away

Shaun Nichols: Perhaps thanks to science fiction movies, many seem to believe that creating artificial intelligence is simply a matter of transistors, and that once a fast enough computer is put together, true artificial intelligence will soon follow.

This, of course, is not really true. While the computing speed of the human brain is still much, much greater than that of your average desktop, there is also a great deal we don't know about psychology. As a result, there is still a ton of theory to be worked out on the nature of intelligence and the function of the brain before scientists can even start to construct a true artificial intelligence on any sort of computer hardware.

So rest easy, folks. We've still got a while before Proteus asks to be let out of the box.

Iain Thomson: The point when computers overtake humans in intelligence terms, known as 'The Singularity', has long been predicted, but I have my doubts.

That we will have computers with as many transistors as the brain has synapses is not in doubt - probably by about 2030, based on current rates of technological development and the assumption that society won't have broken down and left us hunting each other for food.

But so what? Synapses aren't transistors; they can make and remake their own connections, so the comparison is a bit of a red herring. True, there is a lot of promising work in using different types of computer to mimic the human brain, but there's another fundamental block to true AI - software.

The mind's 'software' comes from so many sources that it would be nearly impossible to build a similar system for computers, particularly as we operate at such a fuzzy level of logic. It will be a very long time indeed before this problem is cracked, and I suspect I will not live to see HAL refusing to open my front door.

3. The Internet was developed to survive a nuclear war

Iain Thomson: This piece of technology folklore has been trotted out so many times that it's become received wisdom.

The argument goes that the military developed the Internet protocols so that, in the event of a nuclear attack, damaged parts of the network would be automatically routed around and data would keep flowing, allowing for retaliation and the eventual triumph of the West, etc. It's a nice story, but unfortunately it's not even remotely true.

How am I so sure of this? Well, a few years ago I was fortunate enough to share a two-hour taxi ride with Bob Taylor, one of the creators of the Internet's precursor, ARPANET. I asked him about this and he had a good chuckle. Yes, he said, it was possible that the excuse was made at the time by some official to Congress in order to get funding, but it was rubbish - and a logical impossibility when you thought about it.

For electronics, the most damaging thing about a nuclear war is not the blast itself, which has only local effects, but the electromagnetic pulse (EMP). The minute a nuclear device goes off, the EMP knocks out pretty much everything in sight for miles around.

Both US and Soviet war plans called for the detonation of nuclear warheads in space over each other's countries to cripple as much infrastructure as possible. About four or five relatively small warheads detonated over the United States would destroy around 90 per cent of unprotected electronics in the country. The Internet would have had no chance.

Shaun Nichols: "Beating the Reds" was a bit like a magic phrase when it came to securing research funding in the 60s and 70s. This is likely how the idea got started.

Given that most of the early infrastructure for the Internet would have had trouble making it through a big earthquake, thinking that the Internet could emerge unscathed from nuclear Armageddon is rather laughable.

Add to this the fact that by the 60s most of the country's technological hotbeds - places like Minneapolis, San Jose and Boston - were among the highest-priority nuclear targets for the USSR, and the outlook for ARPANET, should WWIII ever have broken out, becomes pretty bleak.

On the plus side, you can rest easy knowing that should the Internet ever gain self-awareness and set about eliminating humanity, a la SkyNet from the Terminator movies, it most likely wouldn't survive either.

2. More CPU power = more speed

Shaun Nichols: This is a misconception that has spanned two eras. In the 90s and the first half of this decade, the thinking was that higher clock speeds translated directly into performance - that twice as many MHz meant twice as fast in practice. Then dual-core chips came along, and the thinking changed to twice as many cores meaning twice as fast.

While this is convenient marketing jargon, it's also a pretty bad measurement and not at all accurate. The CPU is only one of many components in a PC, and as such is only one of many potential bottlenecks. Things such as memory and hard drive speeds can have just as much impact on a system's overall speed as the processor, or more.

The multi-core argument only further muddies things. While two cores are of course faster than one, they're not always twice as fast. Certain instructions, for example, need to be processed in sequence and simply can't be run in parallel, which effectively limits many operations to a single core.
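One rough way to put numbers on this is Amdahl's law. The sketch below is purely illustrative and not from the original article - the 70 per cent "parallelisable" figure is an assumption, not a measurement of any real workload - but it shows why doubling the cores rarely doubles the speed:

```python
# Amdahl's law: the best-case speedup when only part of a task can run in parallel.
# Illustrative sketch only - the 70 per cent parallel fraction is an assumed figure.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when `parallel_fraction` of the work can be
    split across `cores` and the remainder must run serially."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A task that is 70% parallelisable never comes close to "twice as fast"
# on two cores, and each extra core buys less than the last.
for cores in (1, 2, 4, 8):
    print(f"{cores} core(s): ~{amdahl_speedup(0.7, cores):.2f}x speedup")
# Output: 1 core ~1.00x, 2 cores ~1.54x, 4 cores ~2.11x, 8 cores ~2.58x
```

In that example the serial 30 per cent puts a hard ceiling of roughly 3.3x on the speedup, no matter how many cores you throw at it.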

Perhaps the problem is that the CPU is the most macho of all the computer parts. Many of us nerd types have to fight off the urge to let out a big Tim Allen "cave man" grunt when rattling off the specs for our quad-core powered beasts. The fact is, however, that the CPU ain't the only star of the speed show.

Iain Thomson: Shaun has this spot on. For years the computer industry, both processor manufacturers and system builders, staged a computing arms race in advertising and PR. Each increase in clock speed was hailed as a competitive advantage beyond price.

The prime example of this was the race between Intel and AMD to build the first 1GHz processor. One of my fondest memories is of one of Intel's spokesmen coming into the office just after AMD had beaten them to the mark. Obviously my first question was how he felt about losing. He looked me in the eye and said "Well, you know Iain, speed isn't everything," and managed to keep a straight face - with a little effort. I'm not surprised that he's now running the UK operation - that took balls.

As you rightly point out, however, processor speed has little to do with overall performance. Cache sizes, graphics capability and hard drive access times all play their part. Software too is critical - code has to be written to take advantage of multi-core systems, and older software won't see much of a speed boost.

The shift in emphasis away from processor speed is no bad thing. I was a little ashamed reading your description of attitudes towards it; I, and I suspect a fair few readers, have displayed such sad characteristics. Yes, the phrase "Oh, you're running a 486, how retro!" has crossed my lips.

1. Virus companies write most malware

Iain Thomson: If you want to make a security software specialist spitting mad, trot this one out at him or her. I've heard it everywhere, even from rational people who understand a little about computers. It's not true and never has been.

There are actually very few proper malware writers. Until recently the vast majority of attacks came from script kiddies, who took someone else's malware code, tweaked it slightly and then released it into the wild. This has changed slightly as malware has become more about profit, but it is still largely the case.

Antivirus specialists are adept at spotting the hallmarks of the true virus writers, and if one of their own started writing the stuff it is highly likely they would be spotted fairly quickly. But even that misses the key point about this myth.

The teams of antivirus researchers in the industry are driven people, in a way that makes the average coding geek look like a stoned slacker. They see themselves as the thin blue line keeping computers working, and go to unusual lengths to hold it. It's one of the few industries where competitors share secrets.

Once a signature file for a specific piece of malware has been developed, it gets emailed to all the competitors who share information (which is almost all of them - even Microsoft). That means that whichever security software you use, you get roughly similar protection.

So what, I hear you say - there are cases of firefighters who set fires just so they can be the hero who puts them out. Well yes, but if one researcher suddenly started producing signature files for brand-new malware without a good explanation, questions would be asked.

Shaun Nichols: This myth is insulting to both the good and the bad guys. I think a large part of it comes from a misunderstanding as to the nature of vulnerability disclosures and proof of concept code.

What usually happens is that a researcher discovers a vulnerability in a product. Said researcher then either contacts the company directly or goes through a third party, such as TippingPoint, which passes it on to the company to be patched. The researcher then usually releases a sample "proof of concept" script to show that he or she actually did find the flaw. Ninety-nine per cent of the time, this is done before the public even knows about the flaw.

This, to some people, seems unethical. Why would anyone try to create ways to attack a system? The answer is that the bad guys are really smart people too. The "white hat" researchers who find and report vulnerabilities for a living are plugging holes that those who create malware and attack kits would otherwise find in time and exploit as "zero day" attacks for which there are no fixes.

The bottom line is that the bad guys really don't need any help in finding flaws, and getting a vulnerability out in the open is almost always better than sticking your head in the sand and hoping nobody writes an exploit.
