Computer science is the study of the theoretical foundations of information and computation and how they can be implemented in computer systems.[1][2][3] It is a broad discipline with many fields. For example, computer programming involves the use of specific programming languages to craft solutions to concrete computational problems. Computer graphics relies on algorithms that generate and alter visual images synthetically. Computability theory helps us understand what can and cannot be computed using current models of computation. On a fundamental level, computer science enables us to communicate with a machine: it allows us to translate our thoughts and ideas into machine language, to give instructions that the machine can follow, and to obtain the types of responses we desire.
Computer science has touched practically every aspect of modern-day life. For instance, it has led to the invention of general-purpose computers, for tasks ranging from routine writing and computing to specialized decision making. It has led to the development of the Internet, search engines, e-mail, instant messaging, and e-commerce, bringing about a revolution in our ability to access and communicate information and to conduct financial transactions. By enabling the development of computer graphics and sound systems, it has led to new ways of creating slides, videos, and films. These, in turn, have given birth to new approaches for teaching and learning. For research in various fields, computer science has greatly enhanced the processes of data gathering, storage, and analysis, including the creation of computer models. By fostering the development of computer chips, it has aided in the control of such things as mobile phones, home appliances, security alarms, heating and cooling systems, and space shuttles. In medicine, it has led to the creation of new diagnostic and therapeutic approaches. For national defense, it has led to the development of precision weaponry. Through the development of robots, it has enabled the automation of industrial processes and helped in such tasks as defusing bombs, exploring uncharted territories, and finding disaster victims.
On the downside, knowledge of computer science can also be misused, as in creating computer viruses, breaking into computer systems, and "phishing" for private information. These activities can lead to huge economic losses, identity theft, exposure of confidential information, and breaches of national security. In addition, the fruits of computer science, particularly the Internet and its associated forms of communication, can be used to spread falsehoods, motivate immoral or unethical behavior, or promote acts of terrorism and war. Such misuse can create enormous problems for society.
The earliest known tool for computation was the abacus, thought to have been invented in Babylon around 2400 B.C.E. In its original form, it consisted of lines drawn in sand, with pebbles used as counters. In the fifth century B.C.E., the Indian grammarian Pāṇini formulated sophisticated rules of grammar for Sanskrit. His work became the forerunner of modern formal language theory and a precursor to computing. Between 200 B.C.E. and 400 C.E., Jaina mathematicians in India invented the logarithm. Much later, in the early seventeenth century, John Napier introduced logarithms for computational purposes, and that work was followed by the invention of various calculating tools.
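The computational appeal of logarithms is that they reduce multiplication to addition, since log(xy) = log x + log y: a human calculator could look up two logarithms in a table, add them, and then look up the antilogarithm of the sum. A minimal sketch of the idea in Python (the numbers are arbitrary, chosen only for illustration):

```python
import math

x, y = 37.0, 54.0

# Multiply via logarithms: add the logs, then take the antilogarithm.
log_sum = math.log(x) + math.log(y)
product = math.exp(log_sum)

print(product)  # approximately 1998.0, matching 37 * 54 = 1998
```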
None of the early computational devices were computers in the modern sense. It took considerable advances in mathematics and theory before the first modern computers could be designed. Charles Babbage, called the "father of computing," described the first programmable device—the "analytical engine"—in 1837, more than a century before the first computers were built. His engine, although never successfully constructed, was designed to be programmed—the key feature that set it apart from all preceding devices.
Prior to the 1920s, the term computer referred to a human clerk who performed calculations, usually under the direction of a physicist. Thousands of these clerks, many of them women with training in calculus, were employed in commerce, government, and research establishments. After the 1920s, the expression computing machine was applied to any machine that performed the work of a human computer, especially work that involved repetitively following a list of mathematical instructions.
Kurt Gödel, Alonzo Church, and Alan Turing were among the early researchers in the field that came to be called computer science. In 1931, Gödel introduced his "incompleteness theorem," showing that there are limits to what can be proved and disproved within a formal system. Later, Gödel and others defined and described such formal systems, including concepts like recursive functions and the lambda calculus.
In 1936, Turing and Church independently introduced the formalization of an algorithm (a finite set of well-defined instructions), established limits on what can be computed, and proposed a "purely mechanical" model of computation. These topics are covered by what is now called the Church–Turing thesis, which claims that any calculation that is possible can be performed by an algorithm running on a mechanical calculation device (such as an electronic computer), given sufficient time and storage space.
Turing, who has been called the "father of computer science," also described the "Turing machine": a theoretical machine with an infinitely long tape and a read/write head that moves along the tape, reading and writing symbols according to a fixed set of rules. Such a machine could never be built in full, since its tape is infinite, but the model can simulate any algorithm that can be performed on a modern computer.
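To make the model concrete, here is a minimal Turing machine simulator in Python. The machine, its states, and its rule table are hypothetical, chosen only to illustrate the ingredients of the model (a tape, a movable head, a current state, and a fixed transition table); the unbounded tape is approximated by a sparse dictionary.

```python
# Minimal Turing machine simulator (illustrative sketch, not a historical artifact).
# rules maps (state, symbol) -> (next_state, symbol_to_write, head_move).

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    cells = dict(enumerate(tape))  # sparse "infinite" tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Hypothetical example machine: invert every bit of a binary string.
invert_rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(invert_rules, "10110"))  # prints 01001
```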
Up to and during the 1930s, electrical engineers built electronic circuits to solve mathematical and logic problems in an ad hoc manner, lacking theoretical rigor. This changed when Claude E. Shannon published his 1937 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits." He recognized that George Boole's work could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, using the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers. Shannon's thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
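Shannon's insight can be sketched in a few lines of Python: if each switch is a Boolean value, then switches wired in series behave like AND, switches wired in parallel behave like OR, and compound circuits become Boolean expressions. The half-adder below is an illustrative example (not taken from Shannon's thesis) of how such gates combine to perform binary arithmetic:

```python
# Boolean functions standing in for switching circuits (illustrative sketch).
def AND(a, b): return a and b            # two switches in series
def OR(a, b):  return a or b             # two switches in parallel
def NOT(a):    return not a              # a normally-closed relay contact
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two one-bit numbers, returning (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```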
Shannon went on to found the field of information theory with his 1948 paper on "A Mathematical Theory of Communication." In it, he applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
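A worked illustration of the paper's central quantity: the entropy H = -Σ pᵢ log₂ pᵢ of a source gives the minimum average number of bits per symbol that any encoding can achieve. The probability distribution below is hypothetical:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: a lower bound on average bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source in which one symbol dominates. A fixed-length
# code needs 2 bits per symbol, but the entropy shows that a good variable-length
# code can do better on average.
probs = [0.7, 0.1, 0.1, 0.1]
print(f"{entropy(probs):.3f} bits/symbol")  # about 1.357, versus 2.0 fixed-length
```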
During the 1940s, with the advent of electronic digital equipment, the phrase computing machine gradually gave way to simply computer, referring to machines that performed the types of calculations done by human clerks in earlier years.
Over time, as it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general and branched into many subfields, such as artificial intelligence. Computer science began to be established as a distinct academic discipline in the 1960s, with the creation of the first computer science departments and degree programs.[4]
In 1975, Bill Gates cofounded Micro-Soft, later known as Microsoft Corporation, with former classmate Paul Allen. By landing lucrative deals to develop the operating systems for the computers of that time, and by employing aggressive marketing practices, Microsoft became the largest software company in the world. Its premier product, the Windows operating system, came to dominate the personal computer market.
One year after Gates cofounded Microsoft, another young man, Steve Jobs, founded Apple Computer Co. with Steve Wozniak. From 1976 onward, Apple led the personal computer market with its Apple I, II, and III lines of desktop computers, until IBM (International Business Machines Corporation) released its IBM PC in 1981. The rivalry between Apple and Microsoft has continued well into the twenty-first century, with Apple holding a relatively small portion of the computer market. As computers have become smaller and more powerful, they have become indispensable to modern life, and some are even used in decision-making capacities.
Despite its relatively short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society.
Despite its name, computer science rarely involves the study of computers themselves. Renowned computer scientist Edsger Dijkstra is often quoted as saying, "Computer science is no more about computers than astronomy is about telescopes." It may be argued that Dijkstra was referring to a computer in a narrow sense—that is, a digital computer. If, however, a computer were defined as "any physical system or mathematical model in which a computation occurs," then the definition of computer science as "the science that studies computers" is broadened beyond the study of digital computers.
The design and deployment of physical computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often placed under information technology or information systems.
On the other hand, some have criticized computer science as being insufficiently scientific. This view is espoused in the statement "Science is to computer science as hydrodynamics is to plumbing," credited to Stan Kelly-Bootle[8] and others. There has, however, been much cross-fertilization of ideas between the various computer-related disciplines. In addition, computer science research has often crossed into other disciplines, such as artificial intelligence, cognitive science, physics (quantum computing), and linguistics.
Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines.[9] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.
The relationship between computer science and software engineering is a contentious issue, further muddied by disputes over what the term "software engineering" means, and how computer science is defined. Some people believe that software engineering is a subset of computer science. Others, including David Parnas, believe that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals—thus making them different disciplines.[10] Yet others maintain that software cannot be engineered at all.
New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License, which permits use and dissemination with proper attribution; credit is due both to New World Encyclopedia contributors and to the volunteer contributors of the Wikimedia Foundation.