The top 10 greatest geeks of all time

6: Rear Admiral Grace Hopper

Iain Thomson: While plenty of women have made their mark in the computing field, Grace Hopper is pre-eminent.

In the Second World War Hopper left her comfortable academic life for the US Naval Reserve and started programming, first on the Harvard Mark I computer and then on UNIVAC. She turned down a professorship and spent the next 30 years as a leading light in the IT world.

She developed the first ever compiler and pioneered the idea that programming could be done in English rather than machine code. To prove her point she invented the FLOW-MATIC language, which in turn was key to the development of COBOL. She even popularised the term ‘bug’ for a software glitch, after the moths that would occasionally block computer relays.
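To give a flavour of the English-like style that tradition led to, here is a minimal COBOL "hello world" in the lineage FLOW-MATIC started (an illustrative modern fragment, not Hopper's own code):

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. HELLO-GRACE.
       PROCEDURE DIVISION.
           DISPLAY "PROGRAMMING IN PLAIN ENGLISH".
           STOP RUN.
```

Even a non-programmer can read DISPLAY and STOP RUN at a glance, which was precisely Hopper's argument against machine code.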

During the seventies she retired several times, always returning to duty when needed, and in 1983 she was named Rear Admiral by special Presidential appointment. After she left the Navy she worked as a consultant for DEC until her death at 85.

Her influence is still felt today. The Hoppers, a group of some 3,000 female Microsoft employees, sponsor a scholarship in her name, and visitors to Arlington can rest their feet and enjoy the sun in Grace Murray Hopper Park.

Shaun Nichols: Grace Hopper's story should be mandatory learning for every science class. With the dearth of women in IT, reaching girls at the high school or even elementary school level should be a major priority.

Not only was Hopper remarkable for her ability to break into what was very much a 'good ole boys' club' in US Navy computing, but her insistence that programming should be done in plain English proved to be exactly what was needed to make computing accessible to most companies and individuals.

7: Jack Kilby and Robert Noyce

Shaun Nichols: Both great minds in their own right, Kilby and Noyce are forever joined at the hip in history through their unintentional collaboration on one of the most important inventions in computing history: the integrated circuit.

Kilby invented his integrated circuit in the summer of 1958 while the rest of the staff at Texas Instruments were on summer holiday. The rookie engineer used a single block of germanium to build all of the necessary components for an electronic circuit, eliminating the need to solder thousands of components to a single board when building computing circuits.

Some six months later Robert Noyce, then at Fairchild Semiconductor and later a co-founder of Intel, constructed a circuit from a single block of silicon, unaware of Kilby's work. Noyce's silicon-based approach went on to underpin the first microprocessors and enabled computing as we know it.

Kilby deservedly earned the Nobel Prize in Physics for the work in 2000; Noyce, who died in 1990, could not share it, as the prize is not awarded posthumously.

Iain Thomson: This was a tough call. There was some discussion about replacing Kilby and Noyce with Gordon Moore, but in the end Shaun won out. Moore may have been a genius engineer, but these two were the spark that set off a revolution.

Without these two, computers might still be the size of rooms, with huge oil-filled cooling baffles and highly fragile mechanics. Arthur C Clarke once noted that the microprocessor arguably set back manned space flight: before its invention, it was assumed that many astronauts would be needed in orbit just to operate a spacecraft’s computer systems.

8: Alan Turing

Iain Thomson: It is not an exaggeration to say that Alan Turing and his team at Bletchley Park saved millions of lives in the Second World War and ultimately paved the way for the technology industry as we know it. Not bad for barely a decade’s work.

As a brilliant young mathematician Turing foresaw that the time had come for mechanical computers and was key to their design and philosophy. He developed the code-breaking Bombe from its Polish original and was an important player in bringing together the team behind Colossus, arguably the world’s first programmable computer.

He was also key to the development of the idea of artificial intelligence (AI). The Turing Test, whereby a computer must convince a human questioner that it too is human, is still one of the benchmarks of AI today.

Turing may have had much more to offer the world, but after being outed and then persecuted for his homosexuality he committed suicide in 1954. Had he lived, the computer could have turned out completely different to today’s machines.

Shaun Nichols: When you consider how many thousands of great minds and millions of hours have gone into developing the technologies we have today, visionaries like Turing simply become giants.

This wasn't your typical story of two grad students in a comfortable garage somewhere in Northern California. Turing built machines beyond anything ever seen, and did so under the threat of Nazi invasion.

As Iain pointed out, it's an absolutely terrible shame that such a brilliant mind was taken from the industry so early.

9: Richard Stallman

Shaun Nichols: If Linus Torvalds provided the spark to set off what has become today's open source software movement, then Stallman built the fireplace, chopped the logs, arranged the kindling and doused the whole thing with lighter fluid.

While at MIT, Stallman sought to preserve the 1970s 'hacker' culture by creating standards for free software, work that would later become the GNU project. He was also a heck of a programmer: among his creations are the stellar GNU compilers and Emacs.

Iain Thomson: I would have liked to see Stallman higher on the list but there was such stiff competition that he had to move down.

Stallman’s a tough old soul and a deadly enemy to cross, but he is firmly committed to his goal: free software for all. It’s the kind of mind that, had it been applied to politics, could have led to a revolution or, more likely, a blindfold and a firing squad.

He’s a persistent critic of the commercial software industry, but proof positive that such people are needed and wanted by the rest of the world.

10: Paul Allen

Iain Thomson: It is fair to say that without Paul Allen, Bill Gates would not have become the richest man in the world. Allen convinced Gates to drop out of Harvard to set up Microsoft, and was twice the programmer Gates ever was.

He was key to Microsoft’s early success, but was never as driven as Gates, once falling out with him after he skipped work for a day to go and see the first ever Space Shuttle launch. He was also key to Microsoft buying QDOS, which the company transformed into DOS, the cornerstone of its success.

After a bout of cancer he retired and used the enormous wealth he had helped create to fund other important technology ventures, notably AOL and SpaceShipOne, the Scaled Composites craft that won the Ansari X Prize.

He now devotes much of his time to philanthropy, and although his mega yacht and personal submarine (painted yellow) are not traditional geek accoutrements it’s difficult to begrudge him some toys.

Shaun Nichols: Paul Allen is Redmond's Woz. Not only did he play the classic geek role of unsung hero whose genius carries the company through its earliest and hardest times but, like Woz, he aged well… for a geek. Even amid lavish wealth, you get the feeling this is a guy you could sit down and discuss Monty Python with.

As admirable as the Bill and Melinda Gates Foundation is, Allen was there first. While Gates was still locked in on his quest for world software domination, Allen was setting the blueprint for how tech millionaires should give back to both the scientific community and society as a whole.

Honourable Mention: Curt Herzstark

Shaun Nichols: In 1970, the first electronic miniature calculators emerged. What many people didn't realise, however, was that a calculator you could hold in your hand had already been available for more than 20 years.

Curt Herzstark invented the Curta mechanical calculator prior to World War II, then perfected the design while being held prisoner in the Buchenwald concentration camp. To this day, the cylindrical crank-operated Curta calculator is an engineering masterpiece.

Iain Thomson: Herzstark’s invention quite literally saved his life: without it, it is unlikely he would have survived the war, and he would have joined the six million Jews and five million others who fell victim to the Holocaust.

After the liberation he had to flee the Soviets before he could finally see his calculators manufactured in Liechtenstein. For a phenomenally tenacious geek like Herzstark, the invention of the electronic calculator must have been a hard blow indeed.

Honourable Mention: Randall Munroe

Iain Thomson: OK, mea culpa on this one, I fought long and hard to get Munroe into the top ten but just couldn’t justify it, so we settled on the Honourable Mention.

Randall Munroe was a NASA contractor who in 2005 launched xkcd, a web-based comic dealing with technology, philosophy, relationships and, occasionally, velociraptors. In the last few years I have met very few people in the industry who do not have a favourite among his creations, even if some require a knowledge of UNIX to get the joke.

His comics have fed entire internet subcultures, helping popularise ideas such as Rule 34 (if you can imagine something, there’s a web page of porn for it); he is a passionate supporter of women online and one of the most popular public speakers in the technology field.

Munroe still codes, pursues geek pastimes such as kite photography and geohashing, and lives simply from the sale of merchandise. I feel he will become to the IT industry what Charles Schulz was to the wider world.

Shaun Nichols: I was a bit sceptical about this pick until I sat down and thought about just how well Munroe is able to skewer the industry. His work may not have changed the face of computing, but it hits at the heart of everything in the tech world.

Every industry needs a satirist, a clown and a brilliant pundit. Munroe does all three amazingly well, while keeping everything remarkably technically accurate.