History of computing

From Citizendium

Writing about the history of computing is challenging because of the complexity of any one computer, the speed with which computer technology has evolved, and the many different types of computers that have been built. Further, since computing reaches into so many different industries (such as telephony, automobiles, or cameras) and has spawned a huge industry for the making of hardware and software, it is difficult to know where such a history should stop.

This article presents, in chronological order, the major events in the development of general-purpose computing devices, including the basic evolution of their hardware components and software concepts. It is organized by decade: technologies appear not in the decade in which they were first imagined, or in which experimental versions became available, but in the decade in which they became widely adopted.

Early devices (ancient times)

Long before the arrival of mechanical computing, ancient civilizations devised various methods to calculate and keep track of numbers.

Salamis Tablet (300 B.C.)

A very early counting device, the Salamis Tablet[1], was used by the Babylonians to track numbers in their society.

Abacus (300 B.C.)

The abacus, a mechanical aid to performing arithmetic, dates back many centuries and is still in use in various forms.

Mechanical computing (Renaissance to 1900)

The human longing for mechanical help in performing complex computations existed long before technology was advanced enough to realize a practical solution. These are some of the attempts to create computing machines before technology made them fully feasible.

Codex Madrid (Da Vinci)

On 13 February 1967, the "Codex Madrid", written by Leonardo da Vinci, was discovered in the National Library of Spain and brought to the attention of Dr. Roberto Guatelli[2]. Inside the Codex Madrid was a drawing of an elaborate mechanical computational device, and Guatelli noticed that a similar construct appeared in da Vinci's "Codex Atlanticus". A prototype of the machine was built in 1968, and it was observed to behave as a ratio machine: one revolution of the first shaft drove ten revolutions of the second, and so on through successive shafts turning at rates of 10^1, 10^2, up to 10^13. Whether the drawing depicts a true computational device remains a matter of debate. The replica was previously displayed at IBM, but the exhibit was removed for lack of consensus on this question and is presumed to be in one of IBM's storage facilities.

Pascaline (1642)

An early mechanical computational device is the Pascaline, created by Blaise Pascal circa 1642.[3] The Pascaline performed simple addition and subtraction, its central innovation being the carrying of places by gear rotation. Functionally, the machine counted by advancing a cog through the values 0 to 9; when a cog passed from 9 back to 0, a carry mechanism advanced the next cog by one position.[4]
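The carry scheme described above works like a mechanical odometer. A minimal sketch (the function names here are invented for illustration, not Pascal's terminology):

```python
# Toy model of the Pascaline's carry mechanism: each wheel holds a digit
# 0-9, and advancing a wheel past 9 resets it and advances the next wheel.
def add_one(wheels, position=0):
    """Advance the wheel at `position` by one tooth, propagating carries."""
    if position >= len(wheels):
        return  # overflow: the machine simply wraps around
    wheels[position] += 1
    if wheels[position] == 10:
        wheels[position] = 0
        add_one(wheels, position + 1)  # kick the next wheel over by one

def add(wheels, value):
    """Add `value` by repeatedly advancing the units wheel."""
    for _ in range(value):
        add_one(wheels)

wheels = [9, 9, 0]   # least-significant digit first, i.e. the number 99
add(wheels, 1)
print(wheels)        # [0, 0, 1] -> the number 100
```

Subtraction on the real machine was performed by adding nines' complements, since the gears could only turn in one direction.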

Charles Babbage (early 1800's)

It would take Charles Babbage, born on December 26, 1791 and later a Fellow of the Royal Society, to develop the first truly successful automatic calculating machine[5]. In 1821, Babbage began work on the Difference Engine No. 1, a machine designed to compile mathematical tables based on polynomial calculation.[6] The engine mechanized a mathematical technique known as the Method of Differences, to which Babbage contributed work of his own. Unfortunately, only a fragment of the machine ever came to fruition, owing to financial disputes with the British Government and accusations of mismanaged funds. More importantly, the machine was never fully developed because Babbage had conceived of a more advanced machine, the Analytical Engine. Like the Pascaline, both the Difference and Analytical Engines relied on series of cogs and gears to compute values.
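The Method of Differences rests on the fact that the n-th differences of a degree-n polynomial are constant, so once a table is seeded, every further entry requires only additions — exactly the operation the engine's columns of gear wheels performed. A simplified sketch:

```python
# Sketch of the Method of Differences: after an initial setup, every new
# table entry for a polynomial is produced by additions alone.
def difference_table(values, degree):
    """Compute the leading entry of each difference column."""
    rows = [list(values)]
    for _ in range(degree):
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return [row[0] for row in rows]

def tabulate(seeds, count):
    """Extend the table by `count` entries using only additions."""
    seeds = list(seeds)
    out = []
    for _ in range(count):
        out.append(seeds[0])
        for i in range(len(seeds) - 1):   # fold each difference upward
            seeds[i] += seeds[i + 1]
    return out

# Tabulate f(x) = x^2 from its first few values
seeds = difference_table([0, 1, 4, 9], degree=2)   # -> [0, 1, 2]
print(tabulate(seeds, 7))                          # [0, 1, 4, 9, 16, 25, 36]
```

Note that no multiplication appears anywhere in `tabulate`: this is why a machine built only from adding wheels could produce polynomial tables.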

Functionally, the Analytical Engine was capable of various algorithmic operations, broken down into basic algebraic operations. Two sets of punched cards would be used to program the system: the first detailed which operations were to be performed, and the second contained the values to be operated on. In this sense, the Analytical Engine was much like a modern computer, having an input (the algorithm as described on cards), a processor (the machine itself), an output (the result), and memory (the cards themselves serving as a storage medium). Babbage's associate Ada Lovelace was arguably the first computer programmer, designing algorithms for the Analytical Engine.
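The two-card scheme can be illustrated with a toy interpreter. The card formats and names below are invented for illustration, not Babbage's actual notation:

```python
# Toy illustration of the Analytical Engine's two-card programming model:
# an "operation card" names the arithmetic step, and a "variable card"
# names which store locations supply the operands and receive the result.
OPS = {"add": lambda a, b: a + b,
       "sub": lambda a, b: a - b,
       "mul": lambda a, b: a * b}

def run(operation_cards, variable_cards, store):
    """Pair each operation card with a variable card and update the store."""
    for op, (src1, src2, dest) in zip(operation_cards, variable_cards):
        store[dest] = OPS[op](store[src1], store[src2])
    return store

# Compute v3 = (v1 + v2) * v1 using two cards of each kind
store = {"v1": 3, "v2": 4, "v3": 0}
run(["add", "mul"], [("v1", "v2", "v3"), ("v3", "v1", "v3")], store)
print(store["v3"])  # 21
```

The separation of the operation stream from the variable stream is the point: the same operation cards could be rerun against different variable cards, which is what makes the design programmable rather than a fixed calculator.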

Hollerith and punched cards (1884)

Herman Hollerith was born on February 29, 1860 in New York. In 1875 Hollerith entered the City College of New York, and he graduated from the Columbia School of Mines in 1879 with an engineering degree.[7] After graduating, Hollerith took up work with the United States Census Bureau, where he was appointed Chief Special Agent. Hollerith's contribution to computing was inspired by his work at the Bureau, and especially by Dr. John Shaw Billings, who suggested that there should be a way to process the large amount of census data by mechanical means. In 1884, Hollerith began developing a way to tabulate census information through the use of punched cards, eventually recognizing that the cards themselves could serve as a storage medium for census data. His experiments led to a process whereby a pin passing through a hole in the card completed an electrical circuit. His system, by which cards could be read and tabulated on a mechanical counter through circuit completion, was called the Hollerith Electric Tabulating System. By 1890, the machines had been improved so that data could be tabulated from a simple keyboard instead of being entered by hand.
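The tabulating principle — a pin closes a circuit only where a hole exists, and each closed circuit advances a counter dial — can be sketched as follows (the field names are invented for illustration):

```python
# Sketch of Hollerith-style tabulation: each card is modeled as the set
# of positions punched in it; a completed circuit at a position advances
# that position's counter by one.
from collections import Counter

def tabulate(cards):
    counters = Counter()
    for card in cards:
        for position in card:        # a hole lets the pin close the circuit
            counters[position] += 1  # ...which advances that dial by one
    return counters

census_cards = [
    {"male", "age_20_29"},
    {"female", "age_20_29"},
    {"male", "age_30_39"},
]
print(tabulate(census_cards)["age_20_29"])  # 2
```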

Prerequisites to the first electronic computers (early 1900's)


The harnessing of electricity, along with the invention of electrically operated switches in the early 1900's, including the electromechanical relay and the switching vacuum tube, were necessary prerequisites to the invention of electronic computers.

Computational theory

Much work on the theory of computation was done in the 1930s. Alan Turing and Alonzo Church independently developed formal models of computation, the Turing machine and the lambda calculus respectively, and used them to show that certain problems, such as the halting problem, cannot be solved by any algorithm.
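The core of Turing's argument is a short diagonalization: if a perfect halting decider existed, one could write a program that defeats it. A sketch in Python (the `halts` function here is the hypothetical decider, which is precisely what cannot exist):

```python
# Sketch of the diagonal argument: assume a perfect decider halts(f)
# that returns True iff calling f() eventually halts. The program
# `contrary` below defeats any such decider, so none can exist.
def halts(f):
    """Hypothetical perfect halting decider (cannot actually exist)."""
    raise NotImplementedError

def contrary():
    if halts(contrary):   # ask the decider about ourselves...
        while True:       # ...and do the opposite of its prediction
            pass
    # if the decider says we loop forever, halt immediately

# If halts(contrary) returned True, contrary() would loop forever;
# if it returned False, contrary() would halt. Either answer is wrong.
```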

Switching algebra

Boolean algebra, developed by George Boole in the mid-1800's, is an algebraic system over only two values, and it was destined to become the basis for describing the digital logic circuits used to build electronic computers. The realization that Boolean algebra could be used to describe logic circuits was a major conceptual breakthrough, first documented by Claude Shannon in his 1937 MIT master's thesis[8]. Shannon's thesis created a stir in the world of electronics when it began circulating in 1938[9], though Shannon became better known in later years for founding the field of information theory. Among computer designers, Boolean algebra subsequently became known as "switching algebra".
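Shannon's observation was that switch networks obey Boole's algebra: two switches in series conduct only if both are closed (AND), while two in parallel conduct if either is closed (OR). A minimal sketch:

```python
# Switching circuits obey Boolean algebra: series composition is AND,
# parallel composition is OR, and algebraic identities such as
# De Morgan's law therefore hold for circuits as well.
def series(a, b):
    return a and b      # both switches must be closed to conduct

def parallel(a, b):
    return a or b       # either closed switch suffices

# Verify De Morgan's law over every switch setting
for a in (False, True):
    for b in (False, True):
        assert (not series(a, b)) == parallel(not a, not b)
print("De Morgan's law holds for all switch settings")
```

This correspondence is what lets a circuit be simplified on paper, algebraically, before any hardware is built.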

1940s: The first electronic computers

Prior to World War II, the word computer generally meant a person who computes. But in the early 1940's, during World War II, the first electronic computers were developed to perform numerical calculations far faster than humans could. With the notable exception of the Zuse Z3, which was developed in Germany in relative isolation, the first electronic computers were mainly the result of secret military projects funded by the British and U.S. governments[10][11].

Zuse Z3 (1941)

German engineer Konrad Zuse built the Z1 computer between 1936 and 1938. German patent applications provide evidence of Zuse's development of a mechanical memory device in 1936, used in the Z1.[12] Zuse built the Z2 sometime between 1936 and 1939, and the Z3 from 1939 to 1941[13]. The German government was not supportive of Zuse's work, and evidence of the Z3's existence was discovered by Allied forces only after the end of World War II; the machine itself, along with photographs of it, had been destroyed in Allied raids during the war[14]. Zuse's designs incorporated advanced concepts, including use of the binary numeral system. Having survived the war, Zuse built another computer in Switzerland, and later became the first designer to propose pipelining the computations of a processor. In 1949, Zuse formed the company Zuse KG, where he worked until 1966. The Z3 is now recognized as probably having been the first general-purpose electro-mechanical computer.

Atanasoff pre-computer (1942)

Between 1937 and 1942, Dr. John V. Atanasoff and graduate student Clifford Berry, of Iowa State University, worked on a prototype electronic computer that introduced key design ideas but was never completely realized as a general-purpose computing device. Some of Atanasoff's ideas may have been communicated to John Mauchly, who later assisted in the development of the ENIAC.

Colossus (1943)

Britain's Colossus project produced a series of about ten electronic computers used by British code breakers to read encrypted German messages during World War II. The first Colossus prototype was completed by engineer Tommy Flowers in 1943 at the Post Office Research Station, Dollis Hill, with input from mathematician Max Newman and a few others. It used the binary numeral system for calculations, employing vacuum tubes and very fast optical punched-tape readers. By 1944 the machines had moved to Bletchley Park, where the project lasted until the end of the war. Shortly afterwards, in 1946, Winston Churchill gave official orders to have the machines destroyed.

Harvard Mark I (1944)

For decades after World War II, it was widely believed that the IBM Automatic Sequence Controlled Calculator (ASCC), completed in 1944 and later called the Harvard Mark I, was the first electromechanical general-purpose computer[15]. The idea for the Mark I automatic digital calculator was conceived by Howard H. Aiken, then a Harvard University graduate student in theoretical physics. The machine was a hybrid of mechanical and electrical technology, performing calculations through a series of small gears, electromechanical counter wheels, and switches. Input occurred via punched cards, paper tape, or manually set switches indicating the values to be processed; output was generated by an electric typewriter or punched into additional cards. The Mark I's successor, the Mark II, still used relays, but also featured an electrical memory and a system of 'constant' values referenced during run-time.[16]

ENIAC (1946)

At the University of Pennsylvania, John Mauchly and J. Presper Eckert proposed the Electronic Numerical Integrator And Computer (ENIAC) to the U.S. Army Ordnance Department's Ballistics Research Laboratory in 1943, and then served as its main designers until construction was finished in 1946. A military project justified by the need to compute ballistic trajectories, it was one of the earliest general-purpose, programmable electronic computers[17]. ENIAC's computations used the decimal numeral system, instead of the binary numeral system used by most subsequent digital computers. The ENIAC could not store its own program in memory; it had to be programmed by setting switches on function tables and by changing the wiring, and considerable human effort was required to reprogram it.

UNIVAC and EDVAC (late 1940's)

The designers of ENIAC jointly formed the Eckert-Mauchly Computer Corporation in 1946, which was bought by Remington Rand in 1950. In 1951, this company delivered the first U.S. commercial computer, the UNIVAC, to the United States Census Bureau. The UNIVAC was a stored-program computer, like its non-commercial sister the EDVAC (Electronic Discrete Variable Automatic Computer). The EDVAC was funded by the U.S. Army Ballistics Research Laboratory and installed at the Aberdeen Proving Ground in Maryland; it was built at the University of Pennsylvania by the ENIAC designers Eckert and Mauchly, together with John von Neumann and others. The EDVAC was among the first computers designed around the stored-program concept, an idea first published in von Neumann's 1945 report First Draft of a Report on the EDVAC[18]. Although its design predates the UNIVAC, the EDVAC did not become fully operational until 1952. Competing fiercely with IBM, Remington Rand eventually built 46 of the earliest commercial UNIVAC systems.

1950s: Hardware and the first compiler

In the decades after World War II, computers grew rapidly in usefulness while decreasing in size and cost, spawning a huge and complex industry to create hardware and software for them. But in the 1950's, they were still huge and expensive and available only to a few people.


Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

First compiler

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

1960s: Batch operating systems

The 1960s brought the first era of widespread commercial corporate use of computers. Groups of people wrote programs on punched cards or paper tape, and a computer operator would take the piles of cards that built up and submit them as individual jobs to a large computer.

Batch operating systems

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.


Multicians.org has a huge amount of info waiting to be mined on Multics

1970s: Networks, better software, and smaller hardware

DARPA and the early networks

Some of the key concepts included packet switching, from Paul Baran and Leonard Kleinrock, and interconnected heterogeneous networks, first called catenets, from Louis Pouzin and Vinton Cerf, with J.C.R. Licklider providing project direction and government support. These concepts led, by the 1980's, to the Internet.

Time-sharing operating systems: the birth of UNIX and C

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Mini-computers: hardware gets smaller

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Computer programming languages

During the 1970s, a variety of new programming languages emerged, the C programming language being the one still widely used in system programming today. In addition, a number of higher-level languages appeared: Charles Moore's Forth, the logic programming language Prolog, and the competitive process that would eventually produce Ada, which later became the language used for military and aviation programming. The Lisp dialect Scheme was also created, and would later become the language used in Structure and Interpretation of Computer Programs, the introductory programming course for MIT undergraduates.

Many programming languages also became more standardized, with the American National Standards Institute (ANSI) publishing specifications for COBOL, FORTRAN 77 and MUMPS.

1980s: Networks and personal computers, oh my!

Personal computers

The "big three" (Apple, Commodore and Tandy-Radio Shack) duked it out with the IBM-compatible PC and Microsoft MS-DOS.

The internet

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies. (Note: HTTP / the Web is specifically excluded from this decade, because the WWW came later)


electronic mail


TCP applications (FTP, gopher, telnet, WAIS, Archie / Veronica, etc)

1990s: WWW, and several software revolutions

Dial-up internet access for the masses

Many small companies, and some large ones such as CompuServe, sold dial-up internet access accounts to the public. Suddenly, anyone could get online and have email, although email systems were not always compatible with each other. Combined with the availability of home (personal) computers, this caused internet usage to grow astronomically.

The World Wide Web

In 1991, Tim Berners-Lee publicly released the HTTP protocol and the HTML language, which he had developed at CERN, and graphical web browsers, most influentially NCSA Mosaic (1993), soon became available. These led to what is now called the World Wide Web (or just WWW).

Object-oriented programming goes wild

Experimental computer programming languages based on object-oriented concepts had been developed decades earlier, but in 1992, James Gosling and his team at Sun Microsystems began developing the Java platform, which soon took the programming world by storm. Java provided an alternative to C++ for many programmers, and is now widely used on both the server and the client, as well as being taught as the first language in many computer science programs; it has become a mainstay of enterprise computing. Java, like many languages before and since, uses a virtual machine architecture: the programmer writes in the Java programming language, which compiles to bytecode, which in turn runs on a virtual machine that abstracts away the underlying platform. The idea is that this makes it simpler to write cross-platform applications, although in practice it is rarely as easy as the early Java hype made it sound. Java's popularity spread widely, with Java variants appearing on mobile phones (Java ME), on enterprise servers (Java EE), and in web pages (the brief phenomenon of Java applets), to the point where some people began criticising Java's dominance[19][20][21].

A wide variety of new programming languages also appeared during the 1990s, often aimed at absolute beginners. Microsoft created Visual Basic, which worked both on its own and as part of the Office applications. 'Scripting' languages also became extremely popular: the Macintosh platform gained AppleScript, which has a "natural language" syntax designed for non-programmers (and which many programmers find cumbersome[22]). Python was a significant development, providing a simple scripting language that anyone could start using, but which had advanced object-oriented features. Ruby became popular later, offering some of the same benefits.

Open-source software shakes things up

After a decade of failed attempts to make the popular Unix operating system run on low-cost Wintel hardware, Linus Torvalds released the Linux kernel in 1991. Seeds planted in the 1980's by Richard Stallman's Free Software Foundation finally took root, and the open-source software movement really took off, having a disruptive (and arguably positive) effect on the entire software economy.

XML, the universal translator

Another step on the road to operating system (platform) independence, this text-based standard for self-describing data had a huge impact. As major programming languages such as Java and C# developed XML parsers, these languages could finally exchange information with each other, and much more.
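The "self-describing" property can be seen in a few lines: any language with an XML parser recovers both the structure and the values without prior agreement on a binary layout. A minimal sketch using Python's standard library (the `order` document below is invented for illustration):

```python
# A minimal example of XML as self-describing data exchange: the tag
# and attribute names travel with the values themselves.
import xml.etree.ElementTree as ET

document = """
<order id="1042">
  <item sku="A-7" quantity="3"/>
  <item sku="B-2" quantity="1"/>
</order>
"""

root = ET.fromstring(document)
total = sum(int(item.get("quantity")) for item in root.findall("item"))
print(root.get("id"), total)  # 1042 4
```

A Java or C# program parsing the same text with its own XML library would recover exactly the same tree, which is the interoperability point made above.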

MP3: Honey, I shrunk the music

This compression algorithm led to digital music players and file sharing and much more. (need more here!)

After 2000

See also: Convergence of communications

Wireless networking

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

  • Wireless local area network [r]: The assumption, in networking technologies, about the characteristics of the user space they support, and, as importantly, the user spaces they should ignore [e]
  • Cellular telephony [r]: A set of techniques that let many low-powered portable telephones connect to the fixed network, often exchanging data and images as well as voice [e]
  • Satellite communications [r]: Telecommunications that makes use of a high-altitude relay(s), usually artificial satellites in Earth orbits but potentially a relay in the atmosphere [e]

Google, Maps, Mashups, Amazon, Yahoo

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

  • Mashup [r]: A data visualization created by combining data with multiple computer applications. [e]

Global Positioning System (GPS)

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

  • Global Navigation Satellite System [r]: A system which allows small electronic devices to determine their location (Longitude, Latitude, and Altitude) as well as time with an accuracy of up to a few centimetres using time signals transmitted along a line of sight by radio from satellites. [e]

Social networking

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies. MySpace, Facebook, LinkedIn. Also, Digg, Del.icio.us, StumbleUpon, etc.

Telephony on the internet (VOIP)

  • Voice over Internet Protocol [r]: A family of standards that permits carrying voice telephony over Internet Protocol networks that handle both voice and data, instead of dedicated telephony networks. [e]

Virtualization: one computer, many operating systems

Need a paragraph here with pointers to existing or future articles about specific project, products, companies or technologies.

Famous people in history of computing

For now, see this list of people who made conceptual breakthroughs in computer science.

Famous concepts in history of computing

For now, see this list of seminal concepts in computer science.


  1. The Abacus: A Brief History. Retrieved on 2007-04-24.
  2. Kaplan, Erez. 1996. The Controversial Replica of Leonardo da Vinci's Adding Machine. Retrieved on 2007-04-30.
  3. Abernethy, Ken and Allen, Tom. 2004. Early Calculating and Computing Machines: From the Abacus to Babbage. Furman University. Retrieved on 2007-04-30.
  4. A simplified example of the functionality of the Pascaline. La Machine de Pascal: la pascaline (French: "Pascal's Machine: the Pascaline"). Retrieved on 2007-05-04.
  5. Lemelson-MIT Program, Inventor of the Week Archive. MIT. (February 2003). Retrieved on 2007-05-14.
  6. Dunne, Paul E.. History of Computation. Retrieved on 2007-05-14.
  7. O'Connor, J. J. and Robertson, E. F. (July 1999). Hollerith Biography. School of Mathematics and Statistics University of St. Andrews, Scotland. Retrieved on 2007-05-14.
  8. "A Symbolic Analysis of Relay and Switching Circuits", MIT master's thesis, published in Transactions of the American Institute of Electrical Engineers, Vol. 57 (1938), pp. 713-723. Retrieved on 2007-05-12.
  9. "Claude Shannon" from Professor Ray C. Dougherty's course notes (V61.0003) Communication: Men, Minds, and Machines (Fall, 1996). Microsoft Corporation (1996). Retrieved on 2007-05-12.
  10. Colossus: The World’s First Electronic Computer. Pico Technology. Retrieved on 2007-04-24.
  11. The ENIAC Museum Online. University of Pennsylvania School of Engineering and Applied Sciences (SEAS). Retrieved on 2007-04-23.
  12. (German) Zuse, Konrad: Verfahren zur selbsttätigen Durchführung von Rechnungen mit Hilfe von Rechenmaschinen. Patentanmeldung Z 23 139 / GMD Nr. 005/021 / Jahr 1936. Konrad Zuse: Bibliography.. Retrieved on 2007-05-16.
  13. Zuse, Horst. The Life and Work of Konrad Zuse. Wimborne Publishing LTD and Maxfield & Montrose Interactive Inc. Retrieved on 2007-05-16.
  14. (1987) "Portraits in Silicon" by Robert Slater, ch. 5, p. 43. The MIT Press. 
  15. IBM Archives:IBM's ASCC (a.k.a. The Harvard Mark I). IBM. Retrieved on 2007-05-15.
  16. Lemelson-MIT Program, Inventor of the Week Archive. MIT. (October 2002). Retrieved on 2007-05-15.
  17. "The Eniac Museum Online", University of Pennsylvania School of Engineering Arts and Sciences. University of Pennsylvania. Retrieved on 2007-05-12.
  18. "First Draft of a Report on the EDVAC" (PDF format) by John von Neumann, Contract No.W-670-ORD-4926, between the United States Army Ordnance Department and the University of Pennsylvania. Moore School of Electrical Engineering, University of Pennsylvania, June 30, 1945. The report is also available in Stern, Nancy (1981). From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers. Digital Press. 
  19. Paul Graham, Java's Cover
  20. Paul Graham, Great Hackers
  21. Paul Graham, The Python Paradox
  22. Vincent Gable, (NS)AppleScript Sucks, IMLocation Development Blog