{{subpages}}
 
{{TOC|right}}
The electronic [[computer]], dating from the middle of the twentieth century, vastly expanded human ability to store and share [[information]]. As such, the invention of the computer may be a milestone for humanity on a par with the advent of [[writing]] and materials to write on (millennia ago)<ref name="Paper">{{cite web|url=http://www.wipapercouncil.org/invention.htm|title=The Invention of Paper|publisher=Wisconsin Paper Council|year=2004|accessdate=2007-04-24}}</ref>, or with the invention of the [[printing press]] (~1450)<ref name="PrintingPress">{{cite web|url=http://www.historyguide.org/intellect/press.html|title=The Printing Press|publisher=The History Guide, Steven Kreis|year=2004|accessdate=2007-04-24}}</ref>. The computer has forever changed how people live, how [[scientific research]] is conducted, what [[military]] weaponry is available, and how [[business]] is practiced. Today, computers are ubiquitous household objects, perhaps unrecognized in the form of a tiny microprocessor embedded in a gadget such as a phone or a TV remote. Even defining the word ''computer'' may spark a debate, because so many different kinds of computers exist, and they are used for so many different kinds of activities. The [[history of computing]] is complex and deserves its own article.


==The nature of computing==
Some people define a ''computer'' as a [[machine]] that manipulates [[data]] according to a list of [[instruction (computer science)|instructions]] known as a [[computer program|program]]. However, this definition may only make sense to people who already know what a computer can do. Computers are extremely versatile; in fact, they are ''universal'' information-processing machines, but at the deepest level, what they really do is perform [[arithmetic]]. Computers and mathematics are closely related. The [[theory of computation]] is a branch of mathematics, and its evolution, pioneered by brilliant twentieth-century mathematicians such as [[Alan Turing]] (among many others), enabled the invention of electronic computers. As usual in [[mathematics]], their work built on that of earlier mathematicians, as described in the [[history of computing]].
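
A small sketch may make Turing's idea of a universal machine concrete. The following Python fragment (an illustration written for this article, not any historical design) simulates a tiny [[Turing machine]] whose rule table increments a binary number:

<pre>
# A minimal Turing machine simulator.  'rules' maps (state, symbol) to
# (new state, symbol to write, head movement: -1 left, +1 right).
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10000):
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        state, cells[pos], move = rules[(state, symbol)]
        pos += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Rules for binary increment: scan right past the digits, then add 1,
# propagating the carry leftward.
rules = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),
    ("carry", "1"): ("carry", "0", -1),   # 1 plus carry gives 0, carry on
    ("carry", "0"): ("halt",  "1", -1),   # 0 plus carry gives 1, done
    ("carry", "_"): ("halt",  "1", -1),   # carry past the leftmost digit
}

print(run_turing_machine(rules, "1011"))  # prints 1100 (11 + 1 = 12)
</pre>

Despite its simplicity, this table-driven loop captures the essence of the model Turing showed to be universal: any computation a modern machine performs can, in principle, be reduced to such steps.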
 
Today, most computers do arithmetic using the [[binary numeral system]], because a binary number can be represented by an array of on-off [[Electronic_switch|switches]], with each 0 or 1 digit, or [[Binary_numeral_system#Use_in_computing|bit]], stored in one switch. In early electronic computers, the switch used for each digit was an electromagnetic switch, also called a relay. Later, [[Electronic switch#vacuum tube|vacuum tubes]] replaced relays, and eventually [[Electronic switch#Transistor|transistors]] replaced both relays and tubes. Transistors can now be manufactured as tiny devices, almost molecular in size, embedded within [[Integrated circuit|silicon chips]]. These tiny transistorized computers work on the same principles as the first giant relay- and vacuum-tube-based computers (which occupied entire buildings)<ref name="TransistorVacuumtube">{{cite web|url=http://nobelprize.org/educational_games/physics/integrated_circuit/history/|title=The History of the Integrated Circuit: The Transistor vs. the Vacuum Tube|publisher=The Nobel Foundation|year=2007|accessdate=2007-04-24}}</ref>. More information on how electronic computers work is available in [[computer architecture]].
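
To make the link between on-off switches and arithmetic concrete, here is a short illustrative Python sketch (not drawn from any particular hardware design) that adds two numbers using only the bit-level operations a circuit of switches can perform:

<pre>
# Addition built from bitwise AND, XOR and shift, the same logic a
# hardware adder implements with switching circuits.
def add_binary(a, b):
    while b:
        carry = a & b      # bit positions where both inputs are 1
        a = a ^ b          # XOR sums each bit pair, ignoring carries
        b = carry << 1     # carries shift one position leftward
    return a

x, y = 0b1011, 0b0110          # 11 and 6, written as binary literals
print(bin(add_binary(x, y)))   # 0b10001, i.e. 17
</pre>
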
Initially, mathematicians and scientists were the only users of computers. But today, what we tend to think of as a computer consists not only of the underlying hardware, with its limited [[instruction set]] that performs arithmetic, but also an [[operating system]], a set of programs that allows people to use the computer more easily. The [[operating system]] is [[software]] (programs running on a computer). Without an operating system, a computer is not very useful; the operating system helps people to write new programs for the computer and to perform many other activities on it.
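
As a small illustration of that division of labor, each line of the following Python fragment (a file name and commands chosen arbitrarily for this article) relies on services the operating system provides, such as file storage and process management, that the program itself never has to implement:

<pre>
import os

with open("example.txt", "w") as f:   # the OS allocates space on disk
    f.write("hello, world\n")         # the OS buffers and schedules the write

print(os.getpid())      # the OS assigns and tracks process identifiers
print(os.listdir("."))  # the OS reads the directory structure for us
</pre>
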
==Academia and professional societies==
Since the early 1980s, most universities have offered majors in academic disciplines such as [[computer science]] or [[computer engineering]], devoted to the design of hardware and software for computers. These general fields of study soon came to consist of many sub-fields. In addition, most academic disciplines, and most businesses, use computers as tools.  


Below are some of the professional and academic disciplines that teach the techniques to construct, program, and use computers. There is often overlap of functions and terminology across these categories:
*[[Computer engineering]] is the branch of [[electrical engineering]] that focuses on both hardware and software design, and on the interaction between the two.
*[[Computer science]] is the traditional name for the academic study of the processes related to computers and computation, such as developing efficient [[algorithm]]s to perform specific classes of tasks. It tackles questions such as whether problems can be solved at all using a computer, how efficiently they can be solved, and how to construct efficient programs to compute solutions. A huge array of specialties has developed within computer science to investigate different classes of problems.
*[[Software engineering]] concentrates on methodologies and practices that allow the development of high-quality software systems, while minimizing, and reliably estimating, costs and timelines. Software engineers are often called "programmers", because they design and write [[computer program]]s.
*[[Information system]]s concentrates on the use and deployment of computer systems in a wider organizational (usually business) context, most visibly in the IT department of a larger company.
*Many disciplines have developed at the intersection of computers with other professions; one of many examples is experts in [[Geographic information system|geographical information systems]], who apply computer technology to problems of managing geographical information.

Professional societies dedicated to computers include the [http://www.bcs.org British Computer Society], the [http://www.acm.org Association for Computing Machinery] (ACM), the [http://www.computer.org IEEE Computer Society] and the German [http://gi-ev.de/english/ Gesellschaft für Informatik e.V.] (GI).


==The economics of the computer industry==
Since the 1950s, a vigorous cycle of business activity has arisen from the development of computers, including many corporations engaged in creating computer [[hardware]], [[operating system]]s, or other [[software]]. The business climate has evolved rapidly along with the technology, with some companies being born and meeting their demise in rapid succession, while others have survived for decades (though usually by changing their focus rapidly in response to industry growth).
===The importance of standards===
The ability of many different companies to make computer parts, hardware or software, comes from industry-wide adoption of [[standards]]. Various [[consortium]]s and national or international standards organizations serve as arbitrators of computing standards, including the [[American National Standards Institute]] (ANSI), the [[World Wide Web Consortium]] ([[W3C]]), [[Ecma International]] (originally the European Computer Manufacturers Association, ECMA) and the [[International Organization for Standardization]] ([[ISO]]). In addition to formal standards, many informal standards have arisen through consumers "voting" by purchasing certain products.


The first written standards for the [[Internet]] and for its research predecessors, the [[ARPANET]] and [[NSFNET]], arose from the [[Internet Engineering Task Force]] (IETF), originally called the ''Network Working Group''<ref name="IETF">{{cite web|url=http://www.garykessler.net/library/ietf_hx.html|title=IETF: History, Background, and Role in Today's Internet|publisher=Gary C. Kessler|year=1996|accessdate=2007-04-23}}</ref>, which was born in the late 1960s as a result of a U.S. [[Advanced Research Projects Agency]] ([[ARPA]]) initiative and led eventually to the development of the [[Internet]]. The open nature of the IETF, in which any person could submit a proposal (called a [[Request for Comments]], or ''RFC''), was remarkable, and the IETF proved to be at least as effective as formally endorsed standards bodies at creating usable and widely adopted standards. The non-proprietary nature of the RFC process also foreshadowed the later development, in the 1980s, of the [[open source software]] movement.


Some standards also resulted from a deliberate sharing of specifications by industry participants, notably the open specifications leading to the industry-wide [[IBM compatible PC]] beginning in the early 1980s.


===Pace of growth and value===
The quick pace of growth in computer engineering was codified into a widely quoted rule of thumb called [[Moore's law]], which observes that the number of transistors that can be placed on a chip doubles roughly every two years<ref name="MooresLaw">{{cite web|url=http://www.intel.com/technology/mooreslaw/|title=Moore's Law|publisher=[[Intel]] Corporation|accessdate=2007-04-23}}</ref>. It was first publicized by Gordon Moore, for many years CEO of [[Intel]]. For decades after the invention of the computer, this economic boom centered in the United States and led to the widespread availability of [[personal computer|personal computers]] (affordable by individuals) in the 1980s. Beginning in the 1990s, the computer industry also spread rapidly overseas, especially into [[Europe]], Russia, China and [[India]]. Computers are now a worldwide phenomenon.
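
A rough worked example shows the power of this compounding. The Python sketch below is illustrative only; its starting figure is the roughly 2,300 transistors of the 1971 Intel 4004, and it assumes a fixed two-year doubling period:

<pre>
# Project a transistor count forward, assuming a fixed doubling period.
def moores_law(count_start, year_start, year_end, doubling_years=2.0):
    periods = (year_end - year_start) / doubling_years
    return count_start * 2 ** periods

for year in (1971, 1981, 1991, 2001):
    print(year, round(moores_law(2300, 1971, year)))
# 1971: 2300   1981: ~74 thousand   1991: ~2.4 million   2001: ~75 million
</pre>

Actual chips tracked these orders of magnitude reasonably well for decades, which is why the rule of thumb became so widely quoted.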


Related rules of thumb have been proposed for the value of networks; the best known, [[Metcalfe's law]], holds that a network's value grows roughly as the square of its number of users.
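
A tiny illustrative calculation (assuming, as Metcalfe's law does, that value tracks the number of possible pairwise connections) shows why that growth is quadratic:

<pre>
# Each pair of users can potentially connect: n * (n - 1) / 2 links.
for users in (10, 100, 1000):
    links = users * (users - 1) // 2
    print(users, links)   # 10 -> 45, 100 -> 4950, 1000 -> 499500
</pre>
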
==References==
{{reflist|2}}

[[Category:Suggestion Bot Tag]]
