Computer

From RationalWiki
To err is human, to really foul things up requires a computer.
—Bill Vaughan[1]
(Image caption: The great-granddaddy of the thing you're using now.)

A computer is a device capable of automating vast amounts of mathematical computation to solve all kinds of problems, both mathematical and non-mathematical. While early digital computers of the 1930s and 1940s were generally used for numerical work such as data encryption/decryption and weather simulations, over the next sixty or so years computer technology (helped along by the invention of the transistor at Bell Labs) advanced to the point where computers could be used to enhance or enable activities such as text processing, games, and pr0n. Today's computer is often small enough to fit in a small space on someone's desk (or even in a pocket, in the form of a PDA or smartphone), and is capable of downloading and viewing a whole lot of porn (er, useful information) in a very short time.

A very small computer optimized for interactive use is called a calculator, and is a great way to give a math educator fits trying to figure out what to do with it.

A brief history

Computers broadly fall into two types: analog and digital. Analog computers compute with continuously varying physical quantities, whereas digital computers carry out discrete, step-by-step calculations.

The earliest known analog computer, the Antikythera mechanism, dates to between 150 and 100 BCE, and was found in a shipwreck in 1901. Following its re-discovery, its function was not fully understood until around 2006. The device was a sophisticated — though somewhat lacking in engineering precision — astronomical and calendrical calculator.[2] People have used many types of navigational and astronomical analog computers since that time, including the planisphere and the astrolabe. One of the first purely mathematical calculators was the slide rule, which was invented in the 1600s, following the publication of the concept of the logarithm upon which it relies.

The digital computer era began in 1936 with a theoretical paper by Alan Turing, On Computable Numbers. World War II (1939-1945) hastened the development of digital computers, with Konrad Zuse building the first modern computer in 1941. Digital computers' utility, speed and eventual cheapness have largely made analog computers redundant in all but limited, specialized uses.

Things computers cannot do

  • A computer cannot hold an intelligent conversation with a human being (see Turing test). It can, however, malfunction and lock the pod bay doors shut, so you might have an issue with that.
    • Some computers have been able to reach the point where they can confuse, for some limited period of time, a few horny cybersex addicts on IRC and/or creationists.
  • A computer cannot successfully upload viruses to alien operating systems. (At least not without the right network card and protocol stack. Good thing the wireheads at Area 51 are Mac users.)
  • A computer cannot spontaneously become self aware, take over every computer in the world, launch a massive nuclear attack to wipe out humanity, then send artificially intelligent killer androids back in time to prevent someone from stopping it. Under the right circumstances, however, it can assume total control of the state of Caulyforneeya.
  • A computer cannot do anything at all without being told what to do — and the task of telling a computer what to do is so complex that there are scientists and megacorporations that make most of their living researching better ways to do it.

Basic workings

A computer can be seen as a set of layers, each progressively abstracting away the complexity of the ones underneath:

Physics

  • At the lowest level, a network of transistors is printed on a chip. Each transistor is on the scale of a few nanometers[3] and acts as a switch: it either conducts electricity or not, depending on the voltage applied to its control input.
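
If it helps, the on/off abstraction can be caricatured in a few lines of Python (an idealized switch model; real device physics is far messier, with thresholds, leakage and analog behavior):

  HIGH, LOW = 1, 0

  def nmos_conducts(gate_voltage):
      # An idealized n-type transistor: it conducts when its gate is driven high.
      return gate_voltage == HIGH

  def not_gate(input_voltage):
      # A NOT gate built around one transistor: if the transistor conducts,
      # the output is pulled down to ground (0); otherwise a pull-up
      # keeps the output high (1).
      return LOW if nmos_conducts(input_voltage) else HIGH

  assert not_gate(HIGH) == LOW and not_gate(LOW) == HIGH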

Electronics — basics

  • That basic building block makes it possible to create logic gates, that is, pieces of circuitry that implement the AND, OR, and NOT functions (and other variations such as NAND and XOR): for example, the output of an AND gate is 1 only if both of its inputs are 1 as well (1 representing a higher voltage level than 0).
  • These blocks can be used to implement various functionalities (see the Python sketch after this list):
    • Adders, which sum two binary numbers with a predetermined number of binary digits (bits) (e.g. 32, which corresponds to roughly 9-10 decimal digits, or 64, roughly 19-20).
    • Multiplexers, which select one particular input among many according to the value specified on another input; and demultiplexers, which forward a single input value to one specific output among many.
    • By creating loops in these circuits, it's possible for a circuit to "remember" a value. For example, a flip-flop can, depending on its two inputs, keep its current output, set it to 1, set it to 0, or toggle it. Groups of flip-flops can be combined into a register, that is, a place to store (and read back) a numeric value or other information.
    • When multiplexers and flip-flops are used together, it's possible to create an addressable memory: depending on the input data values, the address values, and the control values (read/write), the memory can be asked to read or write data at a particular location.
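
To make the jump from gates to adders concrete, here is a minimal Python sketch. It is of course a software simulation of what is really a physical circuit, and the bit-list representation is an arbitrary choice for this illustration:

  # Logic gates as functions on single bits (0 or 1).
  def AND(a, b): return a & b
  def OR(a, b):  return a | b
  def NOT(a):    return 1 - a
  def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

  # A 2-to-1 multiplexer: outputs a when sel is 0, b when sel is 1.
  def MUX(a, b, sel):
      return OR(AND(a, NOT(sel)), AND(b, sel))

  # A full adder sums two bits plus an incoming carry,
  # producing a sum bit and an outgoing carry.
  def full_adder(a, b, carry_in):
      s = XOR(XOR(a, b), carry_in)
      carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
      return s, carry_out

  # A ripple-carry adder chains one full adder per bit position.
  def add(a_bits, b_bits):
      # a_bits and b_bits are equal-length lists, least significant bit first.
      result, carry = [], 0
      for a, b in zip(a_bits, b_bits):
          s, carry = full_adder(a, b, carry)
          result.append(s)
      return result, carry   # a final carry of 1 means overflow

  # 3 + 5 = 8 with 4-bit numbers: [1,1,0,0] is 3, [1,0,1,0] is 5.
  assert add([1, 1, 0, 0], [1, 0, 1, 0]) == ([0, 0, 0, 1], 0)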

Electronics — processor

  • Memory can store data or code (both are represented as opaque sequences of bits, or bytes (groups of 8 bits)). Code is composed of a sequence of instructions, each represented by a numeric code and zero or more operands/parameters. Various kinds of instructions exist:
    • instructions that perform arithmetical operations: addition, subtraction, multiplication… (on constants, on values in addressable memory (RAM), or on values held inside the processor itself, in "registers");
    • instructions that compare two values (coming from any of the above sources);
    • instructions that jump, either conditionally or unconditionally, to another portion of the code;
    • instructions that write or read data to/from the RAM or from an input/output device.
  • The task of the CPU is to follow the instructions in the code. This is achieved with a complex arrangement of the above components, along with some registers that let the processor "keep track" of what it is currently supposed to do (a toy software version appears after this list):
    • A register called the instruction pointer holds the numeric address of the instruction that is supposed to be executed next. This value is sent to the RAM, which returns the instruction stored at that position.
    • Based on the numeric code of the instruction, the processor decides what to do and orchestrates which gates should be open, which closed, and which control values should be sent (or not sent) to each sub-component or external device. For example, if the instruction is ADD, data from the first and second operands flows into the adder, and the adder's output is forwarded to the appropriate location (for example, an intermediate register whose contents will later replace the first operand, so that the first operand ends up "incremented by" the second).
    • Because some common operations (like the fairly complex job of writing a piece of text on the screen) are needed over and over, functions can be used to write this repetitive code once. Since the "write text to the screen" function needs to know where to jump back once it has finished its job, a stack of the functions currently being executed is kept in RAM. When a function is called, the address to jump back to is pushed onto the top of the stack. When a function finishes, it pops the top of the stack and jumps to that address. In this way, functions can call other functions, even recursively.
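
To tie the pieces together, here is a toy fetch-decode-execute loop in Python. The six instructions (LOAD, ADD, JUMP_IF_LESS, CALL, RET, HALT) are made up for this sketch; real instruction sets such as x86 or ARM are vastly richer, but the cycle is the same:

  def run(program):
      registers = [0, 0, 0, 0]        # four general-purpose registers
      ip = 0                          # the instruction pointer
      call_stack = []                 # return addresses for CALL/RET
      while ip < len(program):
          op, *args = program[ip]     # fetch and decode
          ip += 1                     # by default, advance to the next instruction
          if op == "LOAD":            # LOAD reg, constant
              registers[args[0]] = args[1]
          elif op == "ADD":           # ADD dst, src: dst += src
              registers[args[0]] += registers[args[1]]
          elif op == "JUMP_IF_LESS":  # jump to an address if one register < another
              if registers[args[0]] < registers[args[1]]:
                  ip = args[2]
          elif op == "CALL":          # push the return address, jump to the function
              call_stack.append(ip)
              ip = args[0]
          elif op == "RET":           # pop the return address and jump back
              ip = call_stack.pop()
          elif op == "HALT":
              break
      return registers

  # Sum the numbers 1 through 5 with a loop:
  # r0 = counter, r1 = limit, r2 = running total, r3 = the constant 1.
  program = [
      ("LOAD", 0, 0),                 # 0: r0 = 0
      ("LOAD", 1, 5),                 # 1: r1 = 5
      ("LOAD", 2, 0),                 # 2: r2 = 0
      ("LOAD", 3, 1),                 # 3: r3 = 1
      ("ADD", 0, 3),                  # 4: r0 += 1
      ("ADD", 2, 0),                  # 5: r2 += r0
      ("JUMP_IF_LESS", 0, 1, 4),      # 6: loop back to 4 while r0 < r1
      ("HALT",),                      # 7: done
  ]
  assert run(program)[2] == 15        # 1 + 2 + 3 + 4 + 5

Note how CALL pushes the return address onto a stack and RET pops it and jumps back: exactly the mechanism described above.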

Software — operating system

  • Writing machine code by hand is very complicated, slow and error-prone. A variety of programming languages have been developed so that textual, friendly, humanly comprehensible code can be automatically converted into the machine code the processor understands.
  • An operating system can now run on the computer. It is a special program that abstracts away the underlying resources and provides a friendlier experience, both for developers and for final users:
    • For developers, it provides a wide variety of commonly used functions for interacting with the machine. For example, calling CreateFile("C:\\Path\\Example.txt") is easier than organizing a hierarchical store of named data on the disk and issuing low-level control commands to the hard disk yourself.
    • It orchestrates the execution of multiple applications at the same time. Each application is given a few milliseconds of time to run; if it hasn't finished whatever it had to do within that time, a pre-programmed timer in the CPU hands control back to the operating system, which then decides which application with pending work runs next (a toy version of this scheduling appears after this list).
    • Applications are prevented by the CPU from executing special instructions that allow low-level access to certain devices or memory areas. These tasks can, however, be carried out by the operating system on behalf of the application via a "system call", after the operating system has checked that the application is allowed to do it (e.g. that the current user actually has permission to read or write the specific file).
    • Some applications for commonly required tasks can be included with the operating system (e.g. a tool for managing files, a web browser and so on).
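
The time-slicing idea can be mimicked in miniature with Python generators. This is a cooperative cartoon of the real thing (actual operating systems preempt tasks with hardware timer interrupts rather than waiting for them to politely yield):

  from collections import deque

  def app(name, steps):
      # A pretend application that needs several slices of CPU time.
      for i in range(steps):
          print(f"{name}: step {i}")
          yield                       # hand the "CPU" back to the scheduler

  def scheduler(tasks):
      # Round-robin: run each task for one "time slice", then move on.
      ready = deque(tasks)
      while ready:
          current = ready.popleft()
          try:
              next(current)           # one time slice
              ready.append(current)   # not finished: back of the queue
          except StopIteration:
              pass                    # the application has finished

  scheduler([app("editor", 2), app("browser", 3)])
  # The output interleaves the two applications, one step at a time.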

Software — apps

  • You can now download your favorite Angry Birds application, or visit your favorite "sports" site.

Quantum computer

The idea behind quantum computers is basically that, once things get small enough, ordinary physics goes out the window, meaning that there IS a physical limit to how small silicon transistors can get before everything becomes chaos (about 7 nm). Manufacturers are working on making transistors out of different materials, which would allow them to shrink further, but at some point it will become physically impossible for Moore's Law to continue, meaning that the growth of computing power will stop, inhibiting scientific progress and causing a global recession as the computer industry shrinks dramatically. A quantum computer, however, would take advantage of the laws of physics going out the window at this scale to produce far more powerful computers. D-Wave Systems claims to have built two such machines, but has not yet demonstrated a quantum speedup in independent tests.[4] Quantum supremacy is the point at which a quantum computer can be demonstrated to be faster than a "classical" (or non-quantum) computer for some particular problem. It is expected that a quantum computer will reach quantum supremacy once it reaches a size of around 45 qubits (quantum bits).[5]

Quantum computing is not expected to enter the PC market in the foreseeable future, due to the bulky cooling and vacuum systems needed for superconductivity. Quantum computers would also be prohibitively expensive (and no quantum PCs exist yet), and wouldn't really help much until Moore's Law ends. Supercomputers and servers, however, could be drastically improved by them, performing calculations that would take a classical supercomputer far longer.
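
One way to see where a figure like 45 qubits comes from: simulating n qubits on a classical machine means tracking 2^n complex amplitudes, so memory demands double with every qubit added. A back-of-the-envelope Python sketch, assuming 16 bytes per amplitude (double-precision complex numbers):

  def state_vector_gib(n_qubits):
      # Full quantum state of n qubits = 2**n complex amplitudes,
      # at 16 bytes each, expressed in GiB.
      return (2 ** n_qubits) * 16 / 2 ** 30

  for n in (20, 30, 40, 45, 50):
      print(f"{n} qubits -> {state_vector_gib(n):,.2f} GiB")
  # 20 qubits fit comfortably in a laptop's RAM; 45 qubits need about
  # half a petabyte, roughly where classical simulation stops being practical.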


References


Categories: [Computer science] [Technology]


Source: https://rationalwiki.org/wiki/Computer | License: CC BY-SA 3.0