The earliest reference to the term '''computer''' comes from the French word of the same spelling, attested in 1631 and derived from the Latin word ''computare'', meaning "to count, to sum up".  The word is formed from two roots: ''com-'', meaning "with", and ''putare'', meaning "to reckon" (originally "to prune")<ref>"compute", {{cite web|url=http://www.etymonline.com/index.php?term=compute|title=Online Etymology Dictionary|accessdate=2007-04-24}}</ref>.
{{subpages}}


Writing about the '''history of computing''' is challenging because of the complexity of any one [[computer]], the speed with which computer technology has evolved, and the many different types of computers that have been built.  Further, since computing reaches into so many different industries (such as telephony, automobiles, or cameras) and has spawned a huge industry for the making of hardware and software for computers, it is difficult to know where such a history should stop.


This article presents, in sequential order, the major events in the development of general-purpose computing devices, including the basic evolution of their hardware components and software concepts.  The article is organized by decade.  Technologies appear not in the decade in which they were first imagined or in which experimental versions became available, but in the decade in which they became widely accepted.


{{TOC|right}}


==Early devices (ancient times)==
Long before the arrival of ''mechanical'' computing, ancient civilizations devised various methods to calculate and keep track of numbers.  


'''Salamis Tablet (300 B.C.)'''


A very early counting device, the [[Salamis Tablet]]<ref>{{cite web|url=http://www.ee.ryerson.ca:8080/~elf/abacus/history.html|title=The Abacus:A Brief History|accessdate=2007-04-24}}</ref>, was used by the Babylonians to track numbers in their society.  Discovered on the island of Salamis in 1846, the tablet was used with loose physical markers placed on rows or columns that represented different values; the markers were not attached to the board.


'''Abacus (300 B.C.)'''


The [[Abacus|abacus]], a mechanical aid to performing arithmetic, dates back many centuries and is still in use in various forms.  The Roman hand abacus, estimated to have been created some time between 300 B.C. and 500 A.D., introduced permanently attached markers that are adjusted in position to indicate value, a modification that may have contributed to the evolution of the ''suan-pan'', or Chinese abacus, in or around 1200 A.D.  The typical modern-day abacus has slidable markers on columns of shafts (typically made of wood or metal) representing powers of ten, with the top row of each column representing "fives" and the bottom representing "ones".  Counting relies on the positions of the beads: beads shifted into the counting position are included in the value, and beads left in place are not.
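The fives-and-ones place-value scheme described above can be sketched in a few lines.  The function name and the per-column split are illustrative assumptions, not a description of any particular historical abacus:

```python
def abacus_columns(n):
    """Represent a non-negative integer as abacus columns.

    Each column holds one decimal digit, split into "five" beads
    (top row) and "one" beads (bottom row), most significant digit first.
    """
    digits = [int(d) for d in str(n)]
    return [(d // 5, d % 5) for d in digits]

# 1946 -> one column per digit: 1, 9, 4, 6
print(abacus_columns(1946))  # [(0, 1), (1, 4), (0, 4), (1, 1)]
```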


==Mechanical computing (Renaissance to 1900)==


The human longing for mechanical help in performing complex computations existed for a long time before technology was advanced enough to realize a practical solution.  These are some of the attempts to create computing machines before technology was sufficiently advanced to make them feasible.


'''Codex Madrid (Da Vinci)'''


On 13 February 1967, the "Codex Madrid", written by [[Leonardo Da Vinci]], was discovered in the [[National Library of Spain]] by Dr. Roberto Guatelli<ref>Kaplan, Erez.  1996.  {{cite web|url=http://www.webcom.com/calc/leonardo/leonardo.html|title=The Controversial Replica of Leonardo da Vinci's Adding Machine|accessdate=2007-04-30}}</ref>.  Inside the Codex Madrid was a drawing of an elaborate mechanical computational device.  Guatelli noticed that a similar construct appeared in Da Vinci's "Codex Atlanticus".  A prototype of this machine was created in 1968, and it was observed to exhibit the traits of a ratio machine: one revolution of the first shaft (10^1) invoked ten revolutions of the second (10^2), and so on until the last shaft, which rotated at a rate of ten to the thirteenth power.  Whether this was a true computational device has been the subject of some debate.  Previously displayed at IBM, the exhibit was removed for lack of consensus, and is presumed to be in one of IBM's storage facilities.


'''Pascaline (1642)'''


An early mechanical computational device is the [[Pascaline]], created by [[Blaise Pascal]] circa 1642.<ref>Abernethy, Ken and Allen, Tom.  2004. {{cite web|url=http://cs.furman.edu/digitaldomain/focus/history/earlyhist2.html|title=Early Calculating and Computing Machines: From the Abacus to Babbage|publisher=Furman University|accessdate=2007-04-30}}</ref>  The Pascaline performed simple addition and subtraction.  The concept of the Pascaline came about from the carrying of places by gear rotation.  Functionally, the machine worked by increasing values on a single cog, which ranged from values 0 to 9.  Upon the next rotation, a series of cogs would rotate the next gear over one iteration to read 1 while the first cog would reset back to 0.<ref>A simplified example of the functionality of the Pascaline.  {{cite web|url=http://perso.orange.fr/therese.eveilleau/pages/truc_mat/textes/pascaline.htm|title=La Machine de Pascal:la pascaline (French: The Machine of Pascal: The Pascaline (literal))|accessdate=2007-05-04}}</ref>
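The carrying behavior described above works like a modern odometer: each cog counts 0 to 9, and wrapping past 9 bumps the next gear over.  A minimal sketch of that behavior (the function and the dial representation are hypothetical, for illustration only):

```python
def pascaline_add(dials, amount):
    """Add `amount` to a list of 0-9 dials (least significant first),
    carrying into the next dial whenever one wraps past 9, as the
    Pascaline's gear train did mechanically."""
    for _ in range(amount):
        i = 0
        while i < len(dials):
            dials[i] += 1
            if dials[i] < 10:
                break
            dials[i] = 0  # this cog resets to 0...
            i += 1        # ...and rotates the next gear one step
    return dials

dials = [9, 9, 0, 0]  # represents 99
print(pascaline_add(dials, 1))  # [0, 0, 1, 0] -> 100
```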


'''Charles Babbage (early 1800's)'''


It would take [[Charles Babbage]], born on December 26, 1791 and inducted as a Fellow of the Royal Society, to develop the first truly successful automatic calculating machine<ref>{{cite web|url=http://web.mit.edu/invent/iow/babbage.html|title=Lemelson-MIT Program, Inventor of the Week Archive|date=February 2003|accessdate=2007-05-14|publisher=MIT.}}</ref>.  In 1821, Babbage developed the Difference Engine No. 1, a functional machine designed to compile mathematical tables based on polynomial calculation.<ref>{{cite web|url=http://www.csc.liv.ac.uk/~ped/teachadmin/histsci/htmlform/lect4.html|title=History of Computation|author=Dunne, Paul E.|accessdate=2007-05-14}}</ref>  The Difference Engine's physical algorithm was based on a mathematical technique known as the Method of Differences, to which Babbage contributed.  Unfortunately, only a fragment of the machine ever came to fruition, owing to financial disputes and accusations of fund mismanagement from the British Government.  More importantly, the machine was never fully developed because Babbage turned his attention to a more advanced machine, the Analytical Engine. Like the Pascaline, both the Difference and Analytical Engines relied on series of cogs and gears to compute values.
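The Method of Differences lets a machine tabulate a polynomial using only repeated addition, because the n-th finite differences of a degree-n polynomial are constant.  A rough modern sketch of the idea (the function name and seeding scheme are illustrative, not Babbage's own formulation):

```python
def difference_table(poly_values, count):
    """Extend a table of polynomial values using only addition,
    in the spirit of Babbage's Difference Engine.

    `poly_values` must hold degree+1 consecutive values of the
    polynomial; the constant highest-order difference is derived
    from them, and further entries are produced by addition alone.
    """
    # Build the rows of finite differences down to a single constant.
    diffs = [list(poly_values)]
    while len(diffs[-1]) > 1:
        row = diffs[-1]
        diffs.append([b - a for a, b in zip(row, row[1:])])
    # The trailing entry of each row seeds the "engine".
    state = [row[-1] for row in diffs]
    out = list(poly_values)
    for _ in range(count):
        # Propagate additions from the constant difference upward.
        for i in range(len(state) - 2, -1, -1):
            state[i] += state[i + 1]
        out.append(state[0])
    return out

# f(x) = x**2 for x = 0..3, extended four more steps by addition only
print(difference_table([0, 1, 4, 9], 4))  # [0, 1, 4, 9, 16, 25, 36, 49]
```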


Functionally, the Analytical Engine was capable of various algorithmic operations that were broken down into basic algebraic operations.  Two cards would be used to program the system: the first would detail what operations were to be performed, and the second would contain the values to be operated on. In this sense, the Analytical Engine was much like a computer, having an input (the algorithm as described on a card), a processor (the machine), an output (the result), and memory (using a storage method--the cards themselves). Babbage's associate [[Ada Lovelace]] was arguably the first computer programmer, designing algorithms for the Analytical Engine.


'''Hollerith and punched cards (1884)'''


[[Herman Hollerith]] was born on February 29, 1860 in New York.  Hollerith attended the City College of New York beginning in 1875 and graduated from the Columbia School of Mines in 1879 with an engineering degree.<ref>{{cite web|url=http://www-groups.dcs.st-and.ac.uk/~history/Biographies/Hollerith.html|title=Hollerith Biography|author=O'Connor, J. J. and Robertson, E. F.|date=July 1999|publisher=School of Mathematics and Statistics University of St. Andrews, Scotland|accessdate=2007-05-14}}</ref> After graduating, Hollerith took up work with the United States Census Bureau, and was appointed Chief Special Agent.  Hollerith's contribution to computing was inspired by his work at the Bureau, particularly by Dr. John Shaw Billings, who suggested that there should be a way to process the large amount of census data by some mechanical means.  In 1884, Hollerith worked to develop a way to tabulate census information through the use of punch cards.  Eventually, he recognized that cards could be used as a storage medium for census data.  His experiments led to a process whereby a pin would go through a hole in the card to complete an electrical circuit.  His system, by which cards could be read and tabulated on a mechanical counter through circuit completion, was called the Hollerith Electric Tabulating System.  By 1890, the machines had been improved so that a simple keyboard could be used to tabulate data instead of entry by hand.
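The tabulating process can be sketched in miniature: each hole in a card closes one circuit, advancing the corresponding counter.  The field names below are hypothetical, purely for illustration:

```python
from collections import Counter

def tabulate(cards):
    """Tally punched positions across a deck of cards, the way
    Hollerith's tabulator advanced a mechanical counter each time
    a pin passed through a hole and completed a circuit.

    Each card is modeled as a set of punched field names.
    """
    counters = Counter()
    for card in cards:
        for hole in card:        # each hole closes one circuit...
            counters[hole] += 1  # ...advancing that field's counter
    return counters

deck = [{"male", "farmer"}, {"female"}, {"male"}]
print(tabulate(deck))  # male: 2, female: 1, farmer: 1
```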


==Prerequisites to the first electronic computers (early 1900's)==


'''Switches'''


The discovery of electricity, along with the invention of [[Electronic switch|electronic switches]] in the early 1900's, including the solenoid and the switching vacuum tube (credited to Lee de Forest in 1906), were necessary prerequisites to the invention of electronic computers.  The ability of vacuum tubes to act as switches (on/off devices that stop or start an electric current) would later be important in the building of the first electronic [[computer|computers]].


'''Computational theory'''


Much work on the theory of computation was done in the 1930s. [[Alan Turing]] and [[Alonzo Church]] each devised a formal model of computation, inventing the [[Turing Machine]] and the [[lambda calculus]] respectively, and used these models to investigate the limits of computation, including the [[halting problem]], which Turing proved undecidable.
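A Turing machine of the kind Turing described can be simulated in a few lines.  The rule encoding below is a modern simplification for illustration, not Turing's own formulation:

```python
def run_turing_machine(rules, tape, state="start", blank="_", steps=1000):
    """A minimal Turing machine simulator.  `rules` maps
    (state, symbol) to (new_state, symbol_to_write, head_move),
    with head_move in {-1, +1}; the machine stops in state "halt"."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: append a 1 to a block of 1s (unary successor).
rules = {
    ("start", "1"): ("start", "1", +1),  # scan right over the 1s
    ("start", "_"): ("halt", "1", +1),   # write 1 on the first blank, halt
}
print(run_turing_machine(rules, "111"))  # 1111
```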


'''Switching algebra'''
 
[[Boolean algebra]], invented by [[George Boole]] in the 1850's, is an algebraic system consisting of only two values, and it was destined to become the basis for describing digital logic circuits used to build electronic computers.  The realization that Boolean algebra could be used to describe logic circuits was a major conceptual breakthrough first documented by [[Claude Shannon]] in his 1937 MIT master's thesis<ref name="Shannon3">{{cite web|url=http://www.research.att.com/~njas/doc/shannonbib.html|title=``A Symbolic Analysis of Relay and Switching Circuits'', MIT master's thesis published in T.A.I.E.E. Vol. 57 (1938), pp. 713-723|publisher= Transactions American Institute of Electrical Engineers|year=1938|accessdate=2007-05-12}}</ref>.  Shannon's thesis created a stir in the world of electronics when it began circulating in 1938<ref name="Shannon1">{{cite web|url=http://www.nyu.edu/pages/linguistics/courses/v610003/shan.html|title="Claude Shannon" from Professor Ray C. Dougherty's course notes (V61.0003) Communication: Men, Minds, and Machines (Fall, 1996)|publisher=[[Microsoft Corporation]]|year=1996|accessdate=2007-05-12}}</ref>, though Shannon was better known in later years for founding the field of [[information theory]].  Boolean algebra subsequently became known as "switching algebra" by computer designers.
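Shannon's insight was that switching circuits can be described with Boolean operations alone.  As a small modern illustration (not an example from Shannon's thesis), a half adder built entirely from AND, OR, and NOT:

```python
def half_adder(a, b):
    """Half adder expressed purely in switching algebra:
    the sum bit is XOR (built here from AND/OR/NOT) and
    the carry bit is AND."""
    total = (a and not b) or (not a and b)  # XOR composed of AND/OR/NOT
    carry = a and b                          # carry is a simple AND
    return int(total), int(carry)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
# e.g. 1 + 1 -> sum 0, carry 1
```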
 
==1940s: The first electronic computers==
 
Prior to World War II, the word ''computer'' generally meant a person who computes.  But in the early 1940's (during World War II), the first electronic '''[[Computer|computers]]''' were developed to perform numerical calculations far faster than humans could.  With the notable exception of the Zuse Z3, which was developed in Germany in relative isolation, the first electronic computers were mainly the result of secret military projects funded by the British and U. S. governments<ref name="Colossus">{{cite web|url=http://www.picotech.com/applications/colossus.html|title=Colossus: The World’s First Electronic Computer|publisher=Pico Technology|year=date_not_specified|accessdate=2007-04-24}}</ref><ref name="Eniac">{{cite web|url=http://www.seas.upenn.edu/~museum/|title=The ENIAC Museum Online|publisher=University of Pennsylvania School of Engineering and Applied Sciences (SEAS)|year=date_unspecified|accessdate=2007-04-23}}</ref>.
 
'''Zuse Z3 (1941)'''
 
German engineer [[Konrad Zuse]] built the Z1 computer between 1936 and 1938.  German patent applications provide evidence of Zuse's development of a mechanical memory device in 1936, used in the Z1.<ref>(German) Zuse, Konrad: Verfahren zur selbsttätigen Durchführung von Rechnungen mit Hilfe von Rechenmaschinen. Patentanmeldung Z 23 139 / GMD Nr. 005/021 / Jahr 1936. {{cite web|url=http://www.epemag.com/zuse/Bgraphy.htm#ZUSE36|title=Konrad Zuse: Bibliography.|accessdate=2007-05-16}}</ref>  Zuse built the Z2 sometime between 1936 and 1939, and the Z3 from 1939 to 1941<ref>{{cite web|url=http://www.epemag.com/zuse/|title=The Life and Work of Konrad Zuse|accessdate=2007-05-16|publisher=Wimborne Publishing LTD and Maxfield & Montrose Interactive Inc|author=Zuse, Horst}}</ref>.  The German government was not supportive of Zuse's work, and evidence of the Z3's existence was discovered by Allied forces only after the end of World War II.  All three working machines, along with all photographs of the Z3, were destroyed in Allied raids during the war<ref name="Zuse1">{{cite book|url=http://www.amazon.com/Portraits-Silicon-Robert-Slater/dp/0262691310|title="Portraits in Silicon" by Robert Slater, ch. 5, p. 43|publisher=The MIT Press|year=1987}}</ref>. Zuse's constructions incorporated advanced concepts, including the implementation of the [[binary numeral system]].  Having survived the war, Zuse built another computer in Switzerland, and later was the first designer to propose [[pipelining]] the computations of a computer [[CPU|processor]]. In 1949, Zuse formed Zuse KG, where he worked until 1966; the company grew into a leading manufacturer of small scientific computers, employing a thousand people<ref name="Zuse2">{{cite book|url=http://www.amazon.com/Portraits-Silicon-Robert-Slater/dp/0262691310|title="Portraits in Silicon" by Robert Slater, ch. 5, p. 50|publisher=The MIT Press|year=1987}}</ref>.  Zuse's Z3 is now recognized as probably having been the first general-purpose electro-mechanical computer.
 
'''Atanasoff pre-computer (1942)'''
 
Between 1937 and 1942, Dr. John V. Atanasoff and graduate student Clifford Berry, of Iowa State University, worked on a prototype electronic computer that introduced key design ideas but was never completely realized as a general-purpose computing device.  Some of Atanasoff's ideas may have been communicated to [http://www.library.upenn.edu/exhibits/rbm/mauchly/jwmintro.html John Mauchly], who later assisted in the development of the ENIAC.
 
'''Colossus (1943)'''
 
Britain's [[Colossus project]] produced a series of about ten electronic computers used by British code breakers to read encrypted German messages during World War II.  The first Colossus prototype was completed by engineer [[Tommy Flowers]] in 1943 at the Post Office Research Station, [[Dollis Hill]], with input from mathematician Max Newman and a few others.  It used the [[binary numeral system]] for calculations, utilizing vacuum tubes and very fast optical punch tape readers.  By 1944, the project had moved to [[Bletchley Park]], where it lasted until the end of the war.  Shortly after, in 1946, Winston Churchill gave official orders to have the machines destroyed.
 
'''Harvard Mark I (1944)'''
 
For decades after World War II, it was widely believed that the IBM Automated Sequence Controlled Calculator (ASCC), completed in 1944 and later called the Mark I, was the first electromechanical general-purpose computer<ref>{{cite web|url=http://www-03.ibm.com/ibm/history/exhibits/markI/markI_intro.html|title=IBM Archives:IBM's ASCC (a.k.a. The Harvard Mark I)|accessdate=2007-05-15|publisher=IBM}}</ref>.  The idea for the Harvard Mark I automatic digital calculator was conceived in the 1930s by [[Howard H. Aiken]], then a graduate student from Harvard University with a Ph. D. in theoretical physics.  The machine was a hybrid of mechanical and electronic technology: 78 adding machines linked together, containing 765,000 parts, 3,300 electrical relays, over 500 miles of wire, and more than 175,000 connections.  It performed calculations through a series of small gears, electro-mechanical counter wheels, and switches; its significant advancement over Hollerith's machine was the automatic feeding of paper-encoded values and operations.  Input occurred via punched cards, paper tape, or manually set switches to indicate the values to be processed, and sequences of operations were programmed into the machine with perforated tape.  The output was generated by an electric typewriter or punched into additional cards.  The successor to the Mark I, the Mark II, still used relays, but also featured an electrical memory and a system of 'constant' values that were referenced during run-time.<ref>{{cite web|url=http://web.mit.edu/invent/iow/aiken.html|date=October 2002|accessdate=2007-05-15|publisher=MIT.|title=Lemelson-MIT Program, Inventor of the Week Archive}}</ref>
 
'''ENIAC (1946)'''
 
At the University of Pennsylvania, [http://www.library.upenn.edu/exhibits/rbm/mauchly/jwmintro.html John Mauchly] and [http://americanhistory.si.edu/collections/comphist/eckert.htm J. Presper Eckert] proposed the ''Electronic Numerical Integrator And Computer'' to the U.S. Army Ordnance Department's Ballistics Research Laboratory in 1943, and then served as its main designers until construction was finished in 1946.  It was a military project justified by a need to compute ballistic trajectories, and was one of the earliest general-purpose, programmable ''electronic'' computers<ref name="Eniac1">{{cite web|url=http://www.seas.upenn.edu/~museum/|title="The Eniac Museum Online", University of Pennsylvania School of Engineering Arts and Sciences|publisher=University of Pennsylvania|accessdate=2007-05-12}}</ref>.  ENIAC's computations used the [[decimal numeral system]], instead of the [[binary numeral system]] used by most subsequent digital computers. The ENIAC was not able to store its own program in memory; it had to be programmed by setting switches on function tables and by changing the wiring.  Considerable human effort was required to reprogram it.
 
'''UNIVAC and EDVAC (late 1940's)'''
 
The designers of ENIAC jointly formed the Eckert-Mauchly Computer Corporation in 1946, which was bought by Remington Rand in 1950.  In 1951, this company delivered the first U. S. commercial computer, called the [[UNIVAC]], to the United States Census Bureau.  The UNIVAC was a [[stored-program]] computer, like its non-commercial sister the [[EDVAC]] (Electronic Discrete Variable Automatic Computer).  The EDVAC was funded by the U.S. Army Ballistics Research Laboratory and was built at the Aberdeen Proving Ground in Maryland by the University of Pennsylvania's ENIAC designers Eckert and Mauchly, joined by John von Neumann and some others.  The EDVAC was the first computer to implement the stored-program concept.  The idea was first published in von Neumann's 1945 report [http://www.virtualtravelog.net/entries/2003-08-TheFirstDraft.pdf First Draft of a Report on the EDVAC]<ref> [http://www.virtualtravelog.net/entries/2003-08-TheFirstDraft.pdf "First Draft of a Report on the EDVAC"] ([[PDF]] format) by John von Neumann, Contract No.W-670-ORD-4926, between the United States Army Ordnance Department and the [[University of Pennsylvania]]. [[Moore School of Electrical Engineering]], University of Pennsylvania, [[June 30]], 1945.  The report is also available in {{cite book| first=Nancy| last=Stern| title=From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers| publisher=Digital Press| year=1981}}</ref>.  Although its design predates the UNIVAC, the EDVAC did not become fully operational until 1952.  Competing fiercely with IBM, the company eventually built 46 of the earliest commercial computer systems.
 
==1950s: Hardware and the first compiler==
 
In the decades after World War II, computers grew rapidly in usefulness while decreasing in size and cost, spawning a huge and complex industry to create hardware and software for them.  But in the 1950's, they were still huge, expensive, and available only to a few people.
 
'''Assemblers'''
 
Early programmers wrote machine instructions directly as numbers.  [[Assembly language]] replaced these with symbolic mnemonics and labels, and a program called an [[assembler]] translated the symbolic source into machine code.  By the end of the 1950's, most machines were supplied with an assembler, and symbolic programming had largely replaced hand-translated numeric code.
 
'''First compiler'''
 
In 1952, [[Grace Hopper]] of Remington Rand developed the A-0 system for the UNIVAC, often cited as the first [[compiler]].  The first widely used high-level programming language, [[FORTRAN]], was developed by a team led by John Backus at IBM, with its first compiler delivered in 1957; it demonstrated that compiled code could approach the efficiency of hand-written assembly.
 
==1960s: Batch operating systems==
 
The 1960s brought an era of commercial corporate use of computers for the first time. Groups of people used [[punched cards]] or [[paper tape]] to write programs, and then a [[computer operator]] would take the piles of stacked cards that built up and submit them as individual jobs to a large computer.
 
'''Batch operating systems'''
 
Early batch monitors, such as GM-NAA I/O (1956) for the IBM 704, automated the transition from one job to the next, keeping the expensive machine busy.  By the mid-1960's, full batch operating systems such as IBM's [[OS/360]] managed job scheduling, input/output, and spooling for entire computer installations.
 
'''Multics'''
 
[[Multics]], a time-sharing operating system developed from 1964 onward by MIT, Bell Labs, and General Electric, pioneered concepts such as the hierarchical file system and dynamic linking that later systems widely adopted.  [http://multicians.org Multicians.org] documents its history in detail.
 
==1970s: Networks, better software, and smaller hardware==
 
===DARPA and the early networks===
 
Key concepts included [[packet switching]], from [[Paul Baran]] and [[Leonard Kleinrock]], and interconnected heterogeneous networks, first called '''catenets''', from [[Louis Pouzin]] and [[Vinton Cerf]], with [[J.R. Licklider]] providing project direction and government support.  By the 1980's, these concepts led to the [[Internet]].
 
===Time-sharing operating systems: the birth of UNIX and C===
 
At Bell Labs, [[Ken Thompson]] and [[Dennis Ritchie]] began work on the [[UNIX]] operating system in 1969.  Ritchie developed the [[C programming language]] around 1972, and by 1973 UNIX had been rewritten in C, making it one of the first operating systems that could be ported across different hardware with comparatively little effort.
 
===Mini-computers: hardware gets smaller===
 
Led by [[Digital Equipment Corporation]]'s PDP series, notably the PDP-8 (1965) and the PDP-11 (1970), and by competitors such as Data General, mini-computers brought computing within the reach of individual laboratories and departments rather than entire institutions, and provided the typical home for the new time-sharing systems.
 
=== Computer programming languages ===
During the 1970s, a variety of new programming languages emerged, the [[C programming language]] being the one still most widely used in system programming. In addition, a variety of higher-level languages appeared: Charles Moore's [[Forth programming language|Forth]] and the logic programming language [[Prolog programming language|Prolog]], and the competitive process that would produce [[Ada programming language|Ada]], later the standard language for military and aviation programming, began. The Lisp dialect [[Scheme programming language|Scheme]] was also created, and would later become the language used in [[Structure and Interpretation of Computer Programs]], the introductory programming course for [[Massachusetts Institute of Technology|MIT]] undergraduates.
 
Many programming languages also became more standardized, with the [[American National Standards Institute]] (ANSI) publishing specifications for COBOL, FORTRAN 77 and MUMPS.
 
==1980s: Networks and personal computers, oh my!==
 
===Personal computers===
 
Early in the decade, the "big three" home computer makers (Apple, Commodore and Tandy-Radio Shack) competed for the market; from 1981 onward they were joined, and increasingly displaced, by the [[IBM compatible PC]] running [[Microsoft MS-DOS]].
 
===The internet===
 
On January 1, 1983, the ARPANET switched from its original NCP protocol to the [[TCP/IP]] protocol suite, a date often cited as the beginning of the modern [[Internet]].  The [[NSFNET]] backbone, established in 1986, linked academic and research networks across the United States and greatly widened access.  ''(Note: HTTP / the Web is specifically excluded from this decade, because the WWW came later)''
 
===DNS===

The [[Domain Name System]], designed by Paul Mockapetris in 1983, replaced the single centrally maintained host table with a distributed, hierarchical database mapping human-readable names to network addresses.

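On the wire, DNS encodes each name as a sequence of length-prefixed labels terminated by a zero byte (the format defined in RFC 1035).  A minimal sketch of that encoding in Python:

```python
def encode_dns_name(name: str) -> bytes:
    """Encode a host name as DNS length-prefixed labels (RFC 1035)."""
    out = b""
    for label in name.rstrip(".").split("."):
        # Each label is preceded by a single byte giving its length.
        out += bytes([len(label)]) + label.encode("ascii")
    return out + b"\x00"  # a zero-length root label terminates the name

print(encode_dns_name("www.example.com"))
# b'\x03www\x07example\x03com\x00'
```

This length-prefixed scheme is why individual DNS labels are limited to 63 bytes: the length byte doubles as a flag field in the full protocol.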
===Electronic mail===

Electronic mail predates the internet era, but it was standardized for internetworking in 1982 with the Simple Mail Transfer Protocol ([[SMTP]]), and during the 1980's it became one of the network's most widely used applications.
 
===Newsgroups===

[[Usenet]], started in 1980 by graduate students at Duke University and the University of North Carolina, distributed threaded discussions ("newsgroups") between cooperating sites, initially over dial-up UUCP links and later over the internet.
 
===TCP applications (FTP, gopher, telnet, WAIS, Archie / Veronica, etc)===
 
==1990s: WWW, and several software revolutions==
 
===Dial-up internet access for the masses===
 
Many small companies, and some large ones such as [[CompuServe]], sold dial-up internet access accounts to the public.  Suddenly, anyone could get online and have email, although email systems were not always compatible with each other.  Combined with the availability of home (personal) computers, internet usage grew astronomically.
 
===The World Wide Web===
 
Between 1989 and 1991, Tim Berners-Lee developed and published the [[HTTP]] protocol and the [[HTML]] language, along with the first [[web browser]]; the Mosaic browser of 1993 then made the system widely popular.  These led to what is now called the [[World Wide Web]] (or just WWW).
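One reason HTTP spread so quickly is that it is a plain-text protocol: a request is just a few readable lines, and a response begins with a status line that is trivial to parse.  A small sketch in Python (the host and path are placeholders, not a real exchange):

```python
# An HTTP/1.0-style request is plain text terminated by a blank line.
request = (
    "GET /index.html HTTP/1.0\r\n"
    "Host: example.org\r\n"
    "\r\n"
)

def parse_status_line(line: str) -> tuple[str, int, str]:
    """Split a status line like 'HTTP/1.0 200 OK' into (version, code, reason)."""
    version, code, reason = line.split(" ", 2)
    return version, int(code), reason

print(parse_status_line("HTTP/1.0 200 OK"))
# ('HTTP/1.0', 200, 'OK')
```

The same human-readable framing made it easy for early implementers to debug servers with nothing more than a telnet session.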
 
===Object-oriented programming goes wild===
 
Experimental computer programming languages based on object-oriented concepts were developed decades earlier, but in 1992, [[James Gosling]] and his team at [[Sun Microsystems]] began developing the [[Java platform]], which soon took the programming world by storm. Java provided an alternative to C++ for many, and is now widely used for programming on both the server and the client, as well as being taught as the first language for many students on computer science programs. Java is now a mainstay for enterprise computing. Java, like many languages before and since, uses a [[virtual machine]] architecture: the programmer writes in the Java programming language, which compiles to bytecode, which in turn runs on a virtual machine that abstracts away the underlying platform. The idea is that this makes it simpler to write cross-platform applications, although in practice it is rarely as easy as the early Java hype made it sound. Java's popularity spread widely, with Java variants appearing on mobile phones ([[JavaME]]), the enterprise server ([[JavaEE]]) and on web pages (the brief phenomenon of Java applets), to the point where some people started criticising Java's dominance<ref>Paul Graham, [http://www.paulgraham.com/javacover.html Java's Cover]</ref><ref>Paul Graham, [http://www.paulgraham.com/gh.html Great Hackers]</ref><ref>Paul Graham, [http://www.paulgraham.com/pypar.html The Python Paradox]</ref>.
 
A wide variety of new programming languages also appeared during the 1990s, often aimed at absolute beginners. Microsoft created [[Visual Basic]], which worked both on its own and as part of the Office applications. 'Scripting' languages also became extremely popular: the Macintosh platform gained [[AppleScript]], which has a "natural language" syntax designed for non-programmers (and which many programmers find cumbersome<ref>Vincent Gable, [http://imlocation.wordpress.com/2007/09/24/nsapplescript-sucks/ (NS)AppleScript Sucks], IMLocation Development Blog</ref>). [[Python programming language|Python]] was a significant development, providing a simple scripting language that anyone could start using, but which had advanced object-oriented features. [[Ruby programming language|Ruby]] became popular later, offering some of the same benefits.
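Part of Python's appeal was that beginner-style scripting and object-oriented code coexist in one small language.  A tiny illustration (the class and data here are invented for the example):

```python
class Song:
    """A minimal object-oriented record type."""
    def __init__(self, title, minutes):
        self.title = title
        self.minutes = minutes

# A beginner-style script can operate directly on the objects,
# mixing OO data with one-line expressions.
playlist = [Song("One", 4), Song("Two", 3), Song("Three", 5)]
total = sum(s.minutes for s in playlist)
longest = max(playlist, key=lambda s: s.minutes)
print(total, longest.title)  # 12 Three
```

The same blend of interactive scripting with classes, modules, and exceptions is what let Python scale from throwaway scripts to large applications.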
 
=== Open-source software shakes things up ===
 
After a decade of failed attempts to make the popular [[Unix operating system]] run on low-cost Wintel hardware, Linus Torvalds released the [[Linux operating system]] in 1991.  Seeds planted in the 1980's by Richard Stallman's [[Free Software Foundation]] finally took root, and the [[open source software]] movement really took off, having a disruptive (and arguably positive) effect on the entire software economy.
 
=== XML, the universal translator ===
 
Another step on the road to operating system (platform) independence, this text-based standard for self-describing data had a huge impact.  As major programming languages such as [[Java]] and [[C#]] developed XML parsers, these languages could finally exchange information with each other, and much more.
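"Self-describing" means the tags themselves name the data, so a receiving program needs no prior agreement about record layout beyond the tag vocabulary.  A small sketch using Python's standard-library parser (the invoice document is a made-up example):

```python
import xml.etree.ElementTree as ET

# A hypothetical document exchanged between two unrelated programs:
# the element and attribute names describe the data they carry.
document = """
<invoice number="42">
  <item sku="A-100" quantity="3"/>
  <item sku="B-200" quantity="1"/>
</invoice>
"""

root = ET.fromstring(document)
print(root.tag, root.get("number"))  # invoice 42

# Any consumer can walk the tree by tag name, in any language
# that has an XML parser.
quantities = [int(item.get("quantity")) for item in root.findall("item")]
print(sum(quantities))  # 4
```

Because equivalent parsers exist in Java, C#, and virtually every other major language, a document like this became a lowest common denominator for exchanging structured data across platforms.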
 
=== MP3: Honey, I shrunk the music ===
 
The MP3 format, standardized in the early 1990's by the Moving Picture Experts Group as MPEG-1 Audio Layer III, used perceptual compression to shrink music files to roughly a tenth of their original size with little perceived loss of quality.  This made portable digital music players practical and fueled large-scale file sharing over the internet.
 
==after 2000==
{{seealso|Convergence of communications}}
'''Wireless networking'''
 
Wireless networking became mainstream after the ratification of the IEEE 802.11 standard in 1997 and the Wi-Fi branding that followed, while cellular networks added packet data services; together these freed computers, and eventually telephones, from fixed network connections.
{{r|Locality of networks|Wireless local area network}}
{{r|Cellular telephony}}
{{r|Satellite communications}}
 
'''Google, Maps, Mashups, Amazon, Yahoo'''
 
Google, founded in 1998, made web search central to everyday computing, and services such as Google Maps (2005), Amazon, and Yahoo published web-service interfaces that let third parties combine data from several sources into new applications, popularly called mashups.
{{r|Mashup}}
 
'''Global Positioning System (GPS)'''
 
The Global Positioning System reached full operational capability in 1995, and the deliberate degradation of its civilian signal ("selective availability") was switched off in 2000.  Inexpensive receivers then made precise positioning a routine feature of consumer devices, from car navigation units to mobile phones.
{{r|Global Navigation Satellite System}}
 
'''Social networking'''
 
Social networking sites such as MySpace (2003), LinkedIn (2003), and Facebook (2004) made maintaining personal and professional relationships a core use of the web, alongside social bookmarking and news-sharing services such as Digg, Del.icio.us, and StumbleUpon.
 
'''Telephony on the internet (VOIP)'''
 
{{r|Voice over Internet Protocol}}
 
'''Virtualization: one computer, many operating systems'''
 
Although virtualization dates back to IBM mainframes of the 1960's, products such as VMware's virtualization software (from 1999) and the open-source Xen hypervisor (2003) brought it to commodity hardware, allowing one physical computer to run several operating systems simultaneously and laying groundwork for cloud computing.


==Famous people in history of computing==
For now, see this [[Computer_science/Catalogs/Breakthroughs|list of people who made conceptual breakthroughs in computer science]].


==Famous concepts in history of computing==
For now, see this [[Computer_science/Catalogs/List_of_seminal_concepts_in_computer_science|list of seminal concepts in computer science]].
 
==External links==
* [http://www.computerhistory.org/ The Computer History Museum]
* [http://www.cbi.umn.edu/ Charles Babbage Institute: Center for History of Information Technology] at the University of Minnesota


==References==
{{reflist|2}}[[Category:Suggestion Bot Tag]]
 
[[Category:CZ Live]]
[[Category:Computers Workgroup]]
[[Category:History Workgroup]]
