Top 10 IT myths

By Shaun Nichols and Iain Thomson

This week, we take a look at some of the more prevalent urban legends of Silicon Valley. Some have a basis in truth, while others are just a good tale to tell.

Let us know if you have any favorites we've missed.

Honourable mention: Macs cost more than PCs

Shaun Nichols: This one made only honourable mention because, well, it's true on some level. You can get a Dell or Gateway notebook or desktop PC for less than an iMac or MacBook.

The catch is, you also get less hardware. Apple likes to load even its low-end models with a certain amount of power and connectivity. If you were to tick the option boxes on a PC for all the bells and whistles that come standard on a Mac, the cost difference shrinks dramatically, and in some cases the PC is even more expensive.

So the rub here is whether you actually do want and need the extras found on the Apple computer. If not, then it is cheaper to go with a PC. Pound for pound, however, the idea that Apple arbitrarily prices their systems higher is wrong.

Iain Thomson: Apple has always concentrated on the high end of the computer market because, I think, it likes making quality products and that costs money.

But I have to say that a quick trawl through websites shows that you do seem to get less for your money from Apple. Looking at base specifications, an Apple MacBook Pro 17-inch shipped to California sells for US$3,036.92. Pretty much the same machine (although it's not as pretty), a Dell Studio 17, costs a touch over US$2,000, albeit with a US$375 sale discount.

In trying times like these that's a big saving, and it's difficult to see how Apple justifies the price.

Honourable mention: Pirated material makes up the bulk of Internet traffic

Iain Thomson: In one legal case after another the same statistic gets trotted out: pirated material, especially sent via torrents, makes up the vast bulk of Internet traffic.

This is a highly useful claim if you are trying, for example, to limit some users' bandwidth, or to press for the gaoling of a suspected software pirate.

But how true is it? I've heard figures of 50, 60 or even 70 percent but when it comes to specifics people seem less certain.

The fact is that no-one is particularly sure, and those who are aren't telling, or at least aren't providing the data to back up their claims. One court case in Canada forced an ISP to admit that such material in fact made up less than ten percent of its traffic.

What also makes this claim highly suspect is that not all files sent via torrents or peer-to-peer systems are pirated.

Most Linux distributions are delivered this way, and plenty of people, myself included, use the technology to send large files between systems. Until I see hard data I'm treating this one with a pinch of salt.

Shaun Nichols: When Comcast laid out its plan to cap bandwidth usage, it said that to reach the limit a user would have to download an average of three full-length movies per day. As few people download entire movies, and even fewer do so every day, the idea that pirated media is clogging networks to the degree some companies claim seems improbable.
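
To put that claim in context, here is a rough back-of-the-envelope sketch in Python. The 250GB monthly cap and the 2.5GB-per-film figure are illustrative assumptions on my part, not numbers taken from Comcast's announcement or from this article.

```python
# Back-of-the-envelope check of the "three movies a day" figure.
# Assumed values (not from the article): a 250GB monthly cap and
# roughly 2.5GB for a full-length standard-definition film.

monthly_cap_gb = 250      # assumed monthly bandwidth cap
days_per_month = 30
movie_size_gb = 2.5       # assumed size of one full-length film

daily_allowance_gb = monthly_cap_gb / days_per_month
movies_per_day = daily_allowance_gb / movie_size_gb

print(f"Daily allowance: {daily_allowance_gb:.1f}GB")
print(f"Films per day needed to hit the cap: {movies_per_day:.1f}")
```

At those assumed figures the daily allowance works out to a little over 8GB, or roughly three films a day, which is broadly in line with the claim.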

Maybe this has a little more to do with copyright law than it does actual bandwidth problems. Cable providers certainly don't want to find themselves in the crosshairs of the MPAA or the RIAA for "enabling" users to pirate material. Being able to claim that P2P traffic is clogging the tubes is a nice excuse to discourage users from pirating movies and songs.

10. Companies must replace their systems every two or three years

Shaun Nichols: IT salespeople, avert your eyes now… the idea that all systems need to be replaced every couple of years is hogwash.

The IT world would have you believe that the latest and greatest is a necessity you can't live without. The reality, however, is that if you outfit an average business with cutting-edge technology every couple of years, you're wasting a ton of cash.

Save for industries like biotech or computer animation that require every bit of power they can get, the latest technology is often far more than is needed. Your average employee can get by just fine on a well-maintained system that is five or even ten years old.

Think for a second about the programs most businesses use every day: an Office suite, a web browser and an email client. None of those are really the sort of thing that will stress out a decent PC built in 2005 or 2006. With budgets being slashed left and right these days, opting to add a little extra RAM or re-install Windows rather than simply buying a sleek new computer is something more buyers should consider.

Iain Thomson: The replacement cycle was actually a necessity at one point but I can't see it being relevant today.

In the 80s and 90s, when processing power really was a problem for day-to-day applications, there was a case for a two or three year upgrade cycle. Processor technology was in its infancy and you could get serious lag trying to crunch a spreadsheet.

However, for most of today's functions, processors have gotten so fast that they exceed the bounds of what is needed in all but the most niche of areas. If a netbook running an Atom processor can do 90 per cent of what you need it to then why bother getting the latest Nehalem systems?

Actually, there is a reason: future-proofing. If you're buying a new computer you want it to last as long as possible. Back in the day this meant it would be out of date in a few years, but this is less true now, particularly if you are prepared to be flexible with software. I have a desktop system built in 1999 that runs Ubuntu quite happily and will continue to do so until a critical component fails.

9. Bill Gates is a whiz-kid programmer

Iain Thomson: Bill Gates is many things: billionaire, philanthropist, businessman, anti-fashion (and initially anti-grooming) icon and a mean poker player. But a master programmer he is not.

Gates certainly can program better than the bulk of the population, and at one time was probably one of the top thousand programmers in the world, but that time was in the 70s when programmers were distinctly thin on the ground.

In fact, most of Microsoft's early success was down to Paul Allen, who was an excellent programmer in the most artistic of ways. It was Allen who built the emulators that allowed Microsoft to test its software and he came up with many of the early tricks that made Microsoft so successful. By the time he stepped down to deal with illness Allen had also recruited a cadre of similarly gifted folk who took his work forward.

Gates is a brilliant businessman who saw the market opportunity opening up for him and exploited it ruthlessly and to great effect. But his business acumen is much better than his programming skills.

Shaun Nichols: This myth cuts both ways. While some people praise Gates as a genius in the mistaken belief that he wrote the bulk of Microsoft's early code, just as many like to blame him personally for everything that doesn't work in Windows. As if Bill Gates somehow left bugs in Office or designed the new Vista interface at a workstation in his office.

The truth about the origins of Microsoft's biggest products does in fact make Bill Gates look like a genius, but not for his programming. Gates made some brilliant decisions in negotiating the purchase and licensing of the rights to code such as DOS, and he is said to have personally spurred the company to develop Windows after seeing the first Macintosh. That doesn't mean, however, that Bill was down in the trenches writing the source code at three in the morning.

Bill Gates is without a doubt a marketing and management whiz kid. But a programming genius? Not so much.

8. Macs aren't compatible with anything

Shaun Nichols: Macs run OS X, PCs run Windows. Therefore, if you want to run a Windows-only program or open a Windows-only document, you can't use a Mac, right?

Wrong. Since Apple went to Intel chips for its computers, they have essentially been dual-boot machines. Apple's own Boot Camp tool allows you to create a Windows partition on your Mac that runs Windows just as it would on a machine made by a PC vendor.

If you don't want to reboot, products such as Parallels or VMware Fusion will load up Windows programs right from the Mac desktop. Even if you don't want to install Windows at all, most of the big programs, such as Office and Photoshop, have file formats that seamlessly switch between the Mac and PC versions.

This is old hat for anyone who follows the computing world even casually, but many would-be buyers still don't realise just how much Mac/PC compatibility has improved over the last five years.

Iain Thomson: This myth does have some grounding in truth. It used to be hell trying to take data across platforms, and I still have a stack of discs that refuse to give up their data because of formatting issues.

Credit where credit is due, however, because there's seldom if ever a problem these days. Manufacturers have recognised that locking people into a particular system isn't worth creating hordes of annoyed customers.

7. Computers last longer when left running

Shaun Nichols: Not sure where this one started or how it got perpetuated, perhaps by someone who was really into flying toasters.

The idea is that the process of shutting down and starting up takes such a strong toll on a computer that it is in fact better for the machine if you leave it running non-stop.

This is of course utter nonsense. Starting up doesn't put any more strain on a system than normal use, and in fact regular shut-downs and restarts help clear out memory and can prevent crashes.

Never mind the obvious waste of energy that comes from leaving computers on all night; the extra running time also makes moving parts such as fans die faster, and you're sucking more dust into the machine. So you're doing your PC more harm than good by leaving it running all night.

Iain Thomson: Actually, in the early days of computers this was true, and I suspect the ageing sysadmin population is keeping this myth going.

Back in the dawn of computing it did make sense to leave your computer on. The reasoning was that computers heat up when they are in operation, not just the processor but also connecting wires and components.

When the system was shut down the components cooled, and by turning a computer on and off you greatly increased the thermal stress on the hardware through the constant cycle of heating and cooling. But manufacturers got smart to this decades ago, and it is no longer the case that systems should be kept running.

6. Hackers could bring about World War Three

Iain Thomson: The 1983 film WarGames did a lot of good things – sparking my interest in computers and Ally Sheedy, for one – but I could kick the scriptwriter sometimes for the fears the film provoked. It was about as useful for home computer users as 'The Birth of a Nation' was for racial harmony.

The chances of anyone being able to use a standard modem to break into missile command and launch missiles, or even to access the supercomputers that control them, are almost exactly zero, particularly considering the technology of the day. After WarGames came out, anxious parents were reportedly ripping computers away from their precious little snowflakes in case of accidental Armageddon.

The command and control systems used by the military are some of the most locked down computer networks on the planet. The military is paranoid enough to take any kind of networking extremely seriously (something we should be grateful for) and hackers stand no chance.

Actually, the chances of accidental nuclear war were very high. In September 1983 civilisation nearly went up in smoke after a faulty Soviet satellite system detected multiple missile launches from the US. Only the quick thinking of the duty officer, Colonel Stanislav Petrov, saved the day, which is why I raise a glass of vodka to him every September 26th.

Things were no better on the American side. When nuclear warheads were originally delivered to the US military, they needed an eight-digit arming code to be activated, the kind of thing that appears in handy red flashing numerals in James Bond films. What was the code used? 00000000. Words fail me.

Shaun Nichols: In the 80s and 90s there was a certain romance around the persona of the dashing young hacker who could infiltrate the most secret of computer systems without being detected. The public's lack of technological knowledge in the early days of the Internet surely helped to fuel this.

What people never really considered was the billions of dollars that governments and their private contractors spend each year on intrusion detection and prevention systems, and the speed with which the authorities respond to possible threats. Just look at what is happening to Gary McKinnon for hacking some NASA machines. Imagine the response had he accessed a truly sensitive system.

As Iain points out, human error and technological failure are far more of a concern than the possibility of one rogue hacker somehow infiltrating a system that controls nuclear weapons.

Read on to page two for the top five
