The History of Computers

The last two decades have seen enormous growth in computer technology, but how did that growth begin?

You might be surprised to hear that the computer’s oldest archetype surfaced some 2,500 years ago, if you count an abacus as a computer.

Why should you? Because like a computer, the ancient abacus was invented to help humans work out repeated calculations faster than they could with their brains alone.

The abacus was an invention so far ahead of its time that it remained a used and relevant piece of technology for over 2,000 years. It wasn’t until 1642 that French scientist and philosopher Blaise Pascal invented the first mechanical calculator, which could perform addition and subtraction, even with decimal numbers. Gottfried Leibniz came up with a similar calculator a few decades later, but his took a step further by incorporating a stepped drum. Leibniz also invented binary code, which wasn’t relevant to his particular calculator but inspired later inventors.

One such inventor and thinker was Englishman George Boole, who in the mid-19th century invented Boolean algebra (a foundational concept behind computers’ use of long strings of binary code to make decisions).
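
To make that idea concrete, here is a minimal sketch in Python (a modern language, used purely for illustration) of Boolean algebra’s basic operations, the same true/false logic a computer applies to binary digits whenever it makes a decision:

```python
# A minimal sketch of Boolean algebra: combining true/false values with
# AND, OR and NOT -- the same kind of logic a computer applies to binary
# digits when it makes a decision.

def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

def NOT(a):
    return not a

# Example "decision": ring the bell if the door is open OR
# (the window is open AND the alarm is NOT disarmed).
door_open = False
window_open = True
alarm_disarmed = False

ring_bell = OR(door_open, AND(window_open, NOT(alarm_disarmed)))
print(ring_bell)  # True
```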

The first person to truly attempt to build a computer (that is, a device that operates automatically by following a program) was the English mathematician Charles Babbage (1791–1871). Babbage’s machine design actually possessed an input, a memory, a processor and an output, earning him the title of “father of the computer.” His machines were never finished, but the understanding of how one would work was there; he simply didn’t live at a time when the technology existed to build them.

Herman Hollerith was more successful. He invented one of the first practical calculating machines, which he called a tabulator, in the 1880s. The tabulator was intended to help Americans complete the 1890 census, but Hollerith soon realized his machines had a variety of other uses. He founded a company called the Tabulating Machine Company in 1896 to start mass-producing his increasingly successful product. In 1924, the company it had merged into was renamed to one you may recognize: International Business Machines (IBM).

Here the baton is handed to American scientist Vannevar Bush, who created the Rockefeller Differential Analyzer in 1935. Its assembly involved some 200 miles of wire and 150 electric motors, because it was an analog calculator (meaning numbers are represented by physical quantities rather than stored as digits).

Another father of computing, Alan Turing, composed a theory that laid out the way a computer would process information. His paper described what is now called a Turing machine: basically a simple information processor that works through a series of instructions, reading data, writing results and then moving on to the next instruction.
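
As a rough modern illustration (a Python sketch, assuming nothing about Turing’s own notation), here is a toy machine of that kind: it follows a small table of instructions, reading a symbol, writing a result, moving along the tape, and carrying on until it halts.

```python
# A toy Turing-style machine (a modern illustration, not Turing's own design).
# The "program" is a table: (state, symbol read) -> (symbol to write, move, next state).
# This particular table flips every bit on the tape and then halts.

program = {
    ("flip", "0"): ("1", 1, "flip"),   # read 0 -> write 1, move right, keep flipping
    ("flip", "1"): ("0", 1, "flip"),   # read 1 -> write 0, move right, keep flipping
    ("flip", " "): (" ", 0, "halt"),   # blank cell -> stop
}

def run(tape, state="flip", head=0):
    tape = list(tape) + [" "]          # a blank marks the end of the input
    while state != "halt":
        write, move, state = program[(state, tape[head])]
        tape[head] = write             # write the result
        head += move                   # move on to the next cell / instruction
    return "".join(tape).strip()

print(run("10110"))  # -> "01001"
```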

American physicists John Atanasoff and Clifford Berry created the first machine that used electrical switches to store numbers: an off switch stood for the digit 0, an on switch for the digit 1.
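
A tiny sketch (again in Python, just to show the idea) of how a row of on/off switches can stand for binary digits and, taken together, encode an ordinary number:

```python
# A row of on/off switches read as binary digits.
# False = switch off = 0, True = switch on = 1.
switches = [True, False, True, True]   # binary 1011

bits = "".join("1" if s else "0" for s in switches)
value = int(bits, 2)                   # interpret the bit string as a base-2 number

print(bits, "=", value)                # 1011 = 11
```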

The first large-scale version of that kind of computer was built at Harvard by mathematician Howard Aiken. The machine was 50 feet long and extremely noisy.

Ok, that’s all you get! Find out more online.
