Top 10 technology mistakes


8. Facebook Beacon

Shaun Nichols: When making this list we tried to include actual software cock-ups rather than business decisions, but this one has a heavy dose of both.

Back in 2007 someone convinced executives at Facebook that they had to concern themselves with petty things like revenues. The solution was to construct a new system called Beacon that would combine advertising with traditional social networking features.

Dozens of e-commerce sites signed up to the service and began sharing purchase data with the site. For some reason Facebook just couldn't quite anticipate that people might have a problem with having all their purchases broadcast over Facebook.

Not surprisingly, a major protest erupted and Facebook was eventually forced to kill off the ill-conceived and ill-deployed platform. Facebook is still struggling to win back user trust from the incident.

Iain Thomson: It's one of the major problems with start-ups in the internet age: you have a brilliant idea and lots of users, so how do you start making money out of them?

Facebook faced just such a quandary. It was the social networking flavour of the day (still is, for that matter) but, with the empty husks of The Well, Geocities and Friendster serving as a warning, Facebook co-founder Mark Zuckerberg decided to start monetising the site. Facebook is all about sharing data, so he decided that its users should share their purchase choices online.

Advertisers loved the idea. Such a system would allow for a whole new range of integrated marketing campaigns. Facebook updates could start all manner of viral sales pitches that were a marketer's wet dream, and companies would be willing to pay through the nose for such data. Users, on the other hand, were less than enthralled.

Now, there's an old argument when it comes to data privacy that if you've done nothing wrong you've got nothing to hide. This may be true, but when it came to purchasing decisions a lot of people were less than happy at the prospect of seeing their data spread across the web.

Statistical probability suggests that only a tiny minority were concerned about what my granny would call "unsavoury" items, so the Beacon case was important for showing how many of us really do value our privacy.

7. Sony rootkit

Iain Thomson: In terms of the sheer anger this one raised I would like to have seen it higher on the list, but those are the breaks sometimes.

In 2000, when the music industry was really panicking about Napster and music piracy, someone at Sony came up with a bright idea. Why not introduce some digital rights management software that let Sony know every time a disc was copied? The software to do this could be developed simply, and pirates could be stopped in their tracks. Five years later the plan went into effect.

On one level the plan worked perfectly, and the software did exactly what it said on the tin. But Sony management, despite obviously thinking this was such a wonderful idea themselves, neglected to tell consumers about the code, and it was discovered by Windows internals expert Mark Russinovich, who published a blog post pointing out that this wasn't the best idea in the world after all.

The problem was that the code was a rootkit, and anyone who works in security hates rootkits because they have the nasty habit of burying themselves deep in the operating system, where they can hide malicious code from detection.

When the news leaked, Sony tried to brazen it out and the firm's global digital business president, Thomas Hesse, was famously quoted as saying: "Most people, I think, don't even know what a rootkit is, so why should they care about it?"

One security firm had t-shirts printed with that quote, and I took great pleasure in wearing one to Sony press conferences thereafter.

Corporate hubris isn't usually news, but the breathtaking arrogance, coupled with growing consumer fears about the security of their online bank accounts, made the rootkit big news.

Within a week virus writers were using it to break into systems, and Sony was forced into an embarrassing climbdown and had to pay compensation.

Shaun Nichols: Sony's rootkit fiasco was the 21st century version of the Brain virus. As Iain touched on, the problem was that companies became so obsessed with preventing piracy that the customers became the enemy, and any sense of respect for the people who buy and use their products flew out the door.

In Sony's case, the company decided that its customers didn't need to know what was on the disc or what it would do to their systems.

It never seemed to dawn on the company that there was something wrong with lacing its CDs with code that would not only automatically install itself onto your system, but embed itself at the kernel level.

Rootkits are understandably a huge worry in the security community as they run at a level that normal anti-virus tools can't detect.

One would think that, if McAfee and Symantec are spending tens of millions of dollars in R&D to eliminate something, you probably shouldn't be tossing it into your product without telling anyone.

6. Apple III

Shaun Nichols: Not so much a programming error as an engineering gaffe, the ill-conceived Apple III flopped in the market and helped to lock Apple out of much of the business space.

Designed to succeed the wildly successful Apple II and appeal to the growing enterprise workstation sector, the 1980 Apple III was built to be rugged and professional, with a stylish metal casing, while remaining quiet by eschewing fans.

The idea was that the casing would act as a natural heat sink, drawing heat away from the components and keeping the system cool. Unfortunately for Apple the case design also meant that the chips on the motherboard had to be positioned close to one another.

With inadequate heat sinks and no fan to cool down the board, the system was prone to overheating and all of the problems that came with it. Floppy disks were often damaged by the internal drives, and warped chips became dislodged from the motherboard.

Apple's solution for the loose chip issue? Lift the computer a few inches off the desk and drop it. Not surprisingly, the Apple III sold poorly and was discontinued in 1984.

Iain Thomson: Apple was there at the start of the personal computer era, but it had penetrated only a few key vertical business markets when IBM steamrolled in with the PC, and Apple's fate as a niche player was sealed.

The Apple III was the firm's last attempt to stem the tide of history. With that in mind, you'd have thought they'd have turned out a better system than this piece of junk. To me it typifies everything that was worst about Apple at the time: the rule of style over function, closed systems and a half-arsed attitude to build quality.

The majority of the business world took one look at the Apple III and walked away laughing quietly. Yes, there are some design professionals and accountants who wax lyrical about it, but the majority of users think the system is best forgotten.

Only now is Apple making serious inroads into the corporate computing sphere and it's doing it not because of the quality of its computers but the excellence of its smartphones.

5. IBM Personal System/2

Iain Thomson: IBM's decision to get into the PC market really kick-started the idea that a computer could be on every desk and legitimised the mass computerisation of the workplace. The unofficial slogan of the company was 'No-one ever got fired for buying IBM,' and the company aimed to keep it that way in the personal computer sphere.

But by the third generation of the PC, the company was losing its grip on the market. Clever reverse engineering by Compaq and others had spawned a growing PC clone market, and businesses were proving distressingly keen to buy a working PC without an IBM logo on it at a significant discount, rather than paying what Big Blue told them to.

So IBM introduced the PS/2, a completely new PC with a closed Micro Channel Architecture that would force the cloners to start again from scratch. Unfortunately customers would have to do the same, since the compatibility problems were immense, but IBM figured it had enough clout to force the market to change. It was wrong.

Don't let it be said that the PS/2 wasn't innovative. It standardised the industry around 3.5in floppy drives for a start, and the round keyboard and mouse plugs it introduced (hence the name 'PS/2 port') lasted over a decade as the default standard.

But the fundamental mistake IBM made was in not realising that the days of hardware margins were gone. Once anyone could build a computer the money was in the software, and IBM had cheerfully handed that part of the PC industry to a bright young man in Seattle with personal grooming issues and big plans for the computer industry.

Shaun Nichols: PS/2 was an interesting idea that IBM came up with too late. As is often the issue with larger companies, IBM simply wasn't agile enough to keep up with the industry, and it took a huge hit when it tried to introduce PS/2.

While it might have been a great idea in the early 1980s, by the time IBM tried to introduce the new platform Microsoft was already taking charge in the market and doing so in a manner that welcomed software developers and hardware vendors.

While the PS/2 platform was a failure for IBM, in the long term it was arguably a very good thing for the company. Witnessing the crash and burn of PS/2 showed IBM that the market was changing and that, if it wanted to maintain its position, it had to rethink its approach.

Big Blue spent much of the 1990s dumping many of its hardware operations and focusing on the enterprise space with software and services. As a result IBM was able to keep its status as a pillar of the industry and arguably the most trusted name in the business world.

Read on to find out what made the top four!
