The people on this list are often unsung and underappreciated; their own personalities or lifestyles usually keep them from gaining greater public recognition.
You won't find any smooth-talking chief executives or business masterminds who built computing empires on this list (that comes next week). These people are the geek's geeks. They are the truly magnificent eggheads that worked their magic on the most basic levels, from invention and development to silicon and command lines.
With so many great minds to choose from, it was all but impossible to narrow this list down to ten, but after considerable argument we’ve managed it, nearly. So at the end you’ll find a couple of honourable mentions – it was either that or a fight would have broken out.
1: Linus Torvalds
Shaun Nichols: It's one thing to write a complete computer program. It takes an entirely different level of geekdom to write a computer operating system. Compound that with the fact that said operating system is a re-write of the most successful network operating environment of all time, and then – as the icing on the cake – the fact that he gave the code away to the rest of the world for free, and you'll start to understand why Linus Torvalds earned the number one spot.
One story jokingly tossed around is that Torvalds wrote all of the code for the Linux kernel on the back of a shovel over the course of one cold Finnish winter. The reality is only slightly less amazing: the 21-year-old student wrote the first versions of his Unix clone in his spare time as a way to connect with servers at the University of Helsinki in 1991.
Over the next seventeen years, Linux developed into one of the most popular operating systems in the world, both for enterprise server and client computing. Many people interact with Linux machines on a daily basis without even knowing it.
Perhaps even more important, Linux provided the spark which ignited the developing open-source software movement, which has now redefined the idea of distribution and arguably helped form the basis of collaborative principles on the web which would later become social networking and hosted collaboration tools.
Though Linux now has thousands of developers, Torvalds remains its ultimate guardian, guiding the direction of the kernel's development. Revered in nearly every circle of the computing world, from servers to clients to IT infrastructure and management, Linus Torvalds reigns supreme in today's order of geeks.
Iain Thomson: In many years of travelling to Finland I have come to love the place, and Torvalds exemplifies why. It’s a nation of bloody-minded individuals who recognise that while individualism is good up to a point, working together is also essential.
Coupled with this there is a disdain for the materialistic lifestyle in that great country. Had Torvalds been raised in America he might very well be a billionaire, but instead he lives modestly and takes his reward in knowing he has changed the computing landscape forever.
Torvalds was always going to be in our top three, but for consistent and long-term brilliance we decided he had to get the top spot. We have yet to see the full effect of Torvalds’ invention on the computing sphere, but it’s going to be fun watching it develop.
2: Steve Wozniak
Shaun Nichols: It can be argued that nobody did more to bring about the advent of home computing than Apple. While Steve Jobs was the marketing mastermind who brought the whole thing about, Woz was the engineering muscle who developed the company's first products.
Back in the mid-70s, there really was no such thing as a home computer. While a terminal could be installed in a house, there wasn't much reason to do so. Wozniak changed that by developing a machine that was easy to understand and use and, most importantly, was cheap to build and buy.
Not only did Woz design and build the first Apple computer, he also wrote the operating system for the company's machines until the early 80s – by hand with pencil and paper.
But the real geek appeal of Woz is that, despite it all, he hasn't lost touch with his roots. He still has the air of a guy who stays up all night watching Red Dwarf and hacking out code just for the hell of it. Woz enjoys Segway polo and he's even shown up in line for Apple's iPhone releases.
Whether you're an Apple fanboy or a Mac-hater, you have to love Woz. Truly, the man is the people's geek.
Iain Thomson: I may not be Apple’s biggest fan but Woz is a legend and deserves the number two spot. He’s still a painfully shy man, with a deep love and understanding of technology. As he himself has said, if he hadn’t met Steve Jobs he’d probably be living alone, with a small cubicle at HP where he would craft elegant devices and code.
He is still for many the king geek, but if you look at how long he actually stayed in the business, his effect over the last few decades has been relatively minimal, as he has devoted himself to teaching and other projects. Nevertheless, the man is a giant in a world of technological pygmies.
3: Sir Tim Berners-Lee
Iain Thomson: It is not an exaggeration to say that if it wasn’t for Berners-Lee you might not be reading this article.
Berners-Lee, while at CERN, developed the basic protocols of the World Wide Web, initially just for internal use. But when the brilliance of the invention became clear he devoted his life to its growth and spread, even if he did make a few mistakes along the way (see Geek 5).
Had he patented his invention Berners-Lee would probably have wealth equalling that of a small country. Instead he gave it away, recognising that profiting from something like this would inhibit the growth of a technology that had the potential to revolutionise human affairs.
Some younger readers may not remember a world without the internet but those of us who do recognise the enormous changes it has wrought. The world would be a poorer place, both intellectually and financially, without it.
He’s still active in nurturing his invention, campaigning against internet monitoring by ISPs and in favour of net neutrality. He’s also working on the next generation of the web, the semantic web.
Shaun Nichols: Sometimes it can be hard to comprehend the scale to which certain technologies can influence the world. The tens of billions of dollars in commerce which stemmed from Berners-Lee's work is simply mind-blowing.
It's also a testament to the work done by CERN and its benefits to the rest of society. Had it not been for the LHC, most people would not have the slightest clue about the organization and what it has done.
4: Seymour Cray
Shaun Nichols: Perhaps nobody was more influential in the advent of supercomputing than Seymour Cray. From ERA to Control Data Corporation to Cray Research, his designs became the backbone systems of laboratories like Livermore and Los Alamos. Many of his innovations became basic concepts for supercomputer design.
Legend has it that at the age of ten, Cray used an erector set to construct a Morse decoding tool. Later in his life, Cray spent his free time redesigning the logic circuits for his computing systems.
Iain Thomson: Cray is similar to many on this list in that he was never happy at the head of a company. The true geek never is, because there’s so much to distract you from the really important stuff – doing it better than anyone else.
Although his machines were often seen as the backbone of the military/industrial complex they were more often used for good. We would be lacking many great medical treatments and materials, or even unaware of the changing climate, without his machines.
5: Marc Andreessen
Iain Thomson: Marc Andreessen, with Eric Bina, was responsible for the first mass market web browser, Mosaic.
As an unassuming computer science student he learned of Berners-Lee’s invention and decided to design a browser that would make the internet easier for people to use. This led to a brief conflict with Berners-Lee, who was peeved that Andreessen had included the ability to view pictures online, as he thought the web should be for text only.
Mosaic was developed into the Netscape Navigator browser and for years was the de facto application for accessing the internet. It so terrified Microsoft that Redmond was forced into a dirty war against it that would later see the software giant fined hundreds of millions of dollars.
Since the crushing of Netscape Andreessen has devoted himself to a series of new start-ups, each with a good idea, and each bought up by other companies. He lives the life of the itinerant geek, but his name on a speaker’s list will bring admirers by the thousand.
Shaun Nichols: Andreessen may be considered one of the more successful geeks on this list, even despite getting run out of town by The Borg.
He more than most truly seems to embrace and appreciate his position, and he's not afraid of a good spot of publicity. Marc Andreessen is that rare mixture of computer geek ingenuity and CEO ambition. Unlike others on this list, he has not only a knack for developing great products, he also has the ability to turn them into a solid business plan… for the most part.
6: Grace Hopper
Iain Thomson: While there have been plenty of women who have made a mark in the computing field, Grace Hopper is preeminent.
In the Second World War Hopper left her comfortable student life for the US Naval Reserve and started programming, first with the Mark One computer and then with UNIVAC. She turned down a professorship and spent the next 30 years as a leading light in the IT world.
She developed the first ever compiler and pioneered the idea that programs could be written in English, not machine code. She invented the FLOW-MATIC language to prove her point, which in turn was key to the development of COBOL. She even popularised the term ‘bug’ to describe a software glitch, named after the moths that would occasionally block computer relays.
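Hopper's core insight – that a program could translate English-like statements into machine operations – is easy to sketch. The toy interpreter below handles a few invented English-style verbs; the syntax is a simplification for illustration only, not real FLOW-MATIC or COBOL:

```python
# Toy interpreter for an English-like language, in the spirit of
# Hopper's FLOW-MATIC. The verbs and syntax are simplified
# inventions for illustration, not real FLOW-MATIC.

def run(program):
    """Execute English-like statements against a variable store."""
    store = {}
    for line in program.strip().splitlines():
        words = line.split()
        if words[0] == "SET":            # SET X TO 2
            store[words[1]] = int(words[3])
        elif words[0] == "ADD":          # ADD X TO Y
            store[words[3]] += store[words[1]]
        elif words[0] == "PRINT":        # PRINT Y
            print(store[words[1]])
    return store

state = run("""
SET X TO 2
SET Y TO 3
ADD X TO Y
PRINT Y
""")
# state["Y"] is now 5, and "5" has been printed
```

The point Hopper was making is visible even at this scale: the person writing `ADD X TO Y` never needs to know which machine instructions carry it out.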
During the seventies she retired several times, always returning to duty when needed, and in 1983 was promoted to commodore (a rank later renamed rear admiral) by special Presidential appointment. After she left the Navy she worked as a consultant for DEC until her death at 85.
Her influence is still felt today. The Hoppers are a group of female Microsoft employees some 3,000 strong who sponsor a scholarship in her name and visitors to Arlington can rest their feet and enjoy the sun in Grace Murray Hopper Park.
Shaun Nichols: Grace Hopper's story should be mandatory learning for every science class. With the dearth of women in IT, reaching girls at the high school or even elementary school level should be a major priority.
Not only is Hopper remarkable for her ability to break into what was very much a 'good ole boys' club' in computing and the US Navy, but her assertion that programming should be done in plain English proved to be just what was needed to make computing accessible to companies and individuals alike.
7: Jack Kilby and Robert Noyce
Shaun Nichols: Both great minds in their own right, Jack Kilby and Robert Noyce are forever joined at the hip in history for their unintentional collaboration on one of the most important inventions in computing history: the integrated circuit.
Kilby invented his integrated circuit in the summer of 1958 while the rest of the staff at Texas Instruments were on summer holiday. The rookie engineer used a single block of germanium to build all of the necessary components for an electronic circuit, eliminating the need to solder thousands of components to a single board when building computing circuits.
Some six months later, unaware of Kilby's work, Robert Noyce of Fairchild Semiconductor (later a co-founder of Intel) constructed a circuit from a single block of silicon. Noyce's methods later underpinned the first microprocessors and enabled computing as we know it.
Kilby deservedly earned a Nobel Prize for the work in 2000; Noyce, who died in 1990, missed out only because the prize is not awarded posthumously.
Iain Thomson: This was a tough call. There was some discussion about replacing Kilby and Noyce with Gordon Moore, but in the end Shaun won out. Moore may have been a genius engineer, but these two were the spark that set off a revolution.
Without these two, computers might still be the size of rooms, with huge oil-filled cooling baffles and highly fragile mechanics. Arthur C Clarke once noted that the microprocessor significantly slowed manned space flight, since without it many astronauts would have been needed in orbit to man a spacecraft’s computer systems.
8: Alan Turing
Iain Thomson: It is not an exaggeration to say that Alan Turing and his team at Bletchley Park saved millions of lives in the Second World War and ultimately paved the way for the technology industry as we know it. Not bad for barely a decade’s work.
As a brilliant young mathematician Turing foresaw that the time had come for mechanical computers and was key to their design and philosophy. He developed the code-breaking Bombe from its Polish original and was an important player in bringing together the team behind Colossus, arguably the world’s first programmable computer.
He was also key in developing the idea of artificial intelligence (AI). The Turing Test, whereby a computer must convince a human interrogator that it too is human, is still one of the benchmarks of AI today.
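The structure of Turing's "imitation game" can be sketched in a few lines: a judge questions two hidden parties and must guess which is the machine. The scripted participants and the naive judge below are hypothetical stand-ins for illustration, not real chatbots:

```python
import random

def imitation_game(judge, human, machine, questions):
    """Run one simplified round of the Turing Test.

    The judge sees answers from two anonymous parties, labelled
    A and B, and must guess which label hides the machine.
    """
    # Randomly assign the hidden parties to the labels A and B.
    parties = {"A": human, "B": machine}
    if random.random() < 0.5:
        parties = {"A": machine, "B": human}

    # Collect each party's answers to the judge's questions.
    transcripts = {label: [respond(q) for q in questions]
                   for label, respond in parties.items()}

    guess = judge(transcripts)  # judge returns "A" or "B"
    machine_label = "A" if parties["A"] is machine else "B"
    # The machine passes this round if the judge guesses wrong.
    return guess != machine_label

# Hypothetical scripted participants, purely for illustration.
human = lambda q: "I'd have to think about that."
machine = lambda q: "I'd have to think about that."  # perfect mimic
naive_judge = lambda transcripts: random.choice(["A", "B"])

passed = imitation_game(naive_judge, human, machine, ["Do you dream?"])
```

Against a perfect mimic, even a careful judge can do no better than the coin-flipping one here, which is exactly Turing's point: indistinguishability in conversation is the test.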
Turing may have had much more to offer the world, but after being outed and then persecuted for his homosexuality he committed suicide in 1954. Had he lived, the computer could have developed into something completely different from today’s machines.
Shaun Nichols: When you consider how many thousands of great minds and millions of hours have gone into developing the technologies we have today, visionaries like Turing simply become giants.
This wasn't your typical story of two grad students in a comfortable garage somewhere in Northern California. Turing was able to construct machines beyond anything ever seen, while under the threat of Nazi invasion.
As Iain pointed out, it's an absolutely terrible shame that such a brilliant mind was taken from the industry so early.
9: Richard Stallman
Shaun Nichols: If Linus Torvalds provided the spark to set off what has become today's open source software movement, then Stallman built the fireplace, chopped the logs, arranged the kindling and doused the whole thing with lighter fluid.
While at MIT, Stallman sought to preserve the 1970s 'hacker' culture by creating standards for free software that would later become the GNU project. Additionally, he was a heck of a programmer. Among his creations are the stellar GNU compilers and the Emacs project.
Iain Thomson: I would have liked to see Stallman higher on the list but there was such stiff competition that he had to move down.
Stallman’s a tough old soul, a staunch friend and a deadly enemy, and is firmly committed to his goal – free software for all. It’s the kind of mind that, had it been applied to politics, could have led to a revolution or, more likely, a blindfold and firing squad.
He’s a persistent critic of the commercial software industry, but proof positive that such people are needed and wanted by the rest of the world.
10: Paul Allen
Iain Thomson: Without Paul Allen it is fair to say that Bill Gates would not have been the richest man in the world. Allen convinced Gates to drop out of Harvard to set up Microsoft and was twice the programmer Gates ever was.
He was key to Microsoft’s early success, but was never as driven as Gates, once falling out with him after he skipped work for a day to go and see the first ever Space Shuttle launch. He was also key to Microsoft buying QDOS, which the company transformed into DOS, the cornerstone of its success.
After a bout of cancer he retired and used the enormous wealth he had helped create to fund other important technology ventures, notably AOL and the commercial space effort that saw Scaled Composites’ SpaceShipOne win the X Prize.
He now devotes much of his time to philanthropy, and although his mega yacht and personal submarine (painted yellow) are not traditional geek accoutrements it’s difficult to begrudge him some toys.
Shaun Nichols: Paul Allen is Redmond's Woz. Not only did he play the classic geek role as the unsung hero whose genius drives the company through its earliest and hardest times, but like Woz, he aged well… for a geek. Even in lavish wealth, you get the feeling this is a guy you could sit down and discuss Monty Python with.
As admirable as the Bill and Melinda Gates Foundation is, Allen was there first. While Gates was still locked in on his quest for world software domination, Paul Allen was setting the blueprint for how tech millionaires should give back to both the scientific community and society as a whole.
Honourable Mention: Curt Herzstark
Shaun Nichols: In 1970, the first electronic miniature calculators emerged. What many people didn't realize, however, was that a calculator you could hold in your hand had been available for more than 20 years prior.
Curt Herzstark invented the Curta mechanical calculator prior to World War II, then perfected the design while being held prisoner in the Buchenwald concentration camp. To this day, the cylindrical crank-operated Curta calculator is an engineering masterpiece.
Iain Thomson: Herzstark’s invention literally saved his life: without it, it is unlikely he would have made it through the war, and he would have joined the six million Jews and five million others who fell victim to the Holocaust.
After the liberation he had to flee the Soviets before he could finally see his calculators made in Liechtenstein. For a phenomenally tenacious geek like Herzstark the invention of the electronic calculator must have been a hard blow indeed.
Honourable Mention: Randall Munroe
Iain Thomson: OK, mea culpa on this one, I fought long and hard to get Munroe into the top ten but just couldn’t justify it, so we settled on the Honourable Mention.
Randall Munroe was a NASA contractor who in 2006 came up with the idea of a web-based comic dealing with technology, philosophy, relationships and, occasionally, velociraptors. In the last few years I have met very few people in the industry who do not have a favourite of his creations, even if some require knowledge of UNIX to get the joke.
He has helped popularise entire internet subcultures, including Rule 34 (if you can imagine something, there’s a web page of porn for it), is a passionate supporter of women online, and is one of the most popular public speakers in the technology field.
Munroe still codes, pursues geek activities such as kite photography and geohashing, and lives simply from the sale of merchandise. I feel he will become to the IT industry what Charles Schulz was to the world.
Shaun Nichols: I was a bit skeptical about this pick until I sat down and thought about just how well Munroe is able to skewer the industry. While what he makes didn't change the face of computing by any means, it really hits at the heart of everything in the tech world.
Every industry needs a satirist, a clown and a brilliant pundit. Munroe plays all three amazingly well, while still keeping everything remarkably technically accurate.