Information processing



Overview

Information processing is the change (processing) of information in any manner detectable by an observer. As such, it is a process that describes everything that happens (changes) in the universe, from the falling of a rock (a change in position) to the printing of a text file from a digital computer system. In the latter case, an information processor is changing the form of presentation of that text file. Information processing may be defined more specifically, in terms due to Claude E. Shannon, as the conversion of latent information into manifest information[citation needed]. Latent and manifest information are defined through the terms of equivocation (the remaining uncertainty about what value the sender actually chose), dissipation (the sender's uncertainty about what the receiver has actually received) and transformation (the saved effort of questioning, i.e. equivocation minus dissipation)[citation needed].
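
These three quantities can be written compactly. The following is a minimal sketch only: it assumes the sender's message is a random variable X, the receiver's observation is a random variable Y, and it identifies equivocation and dissipation with the conditional entropies H(X|Y) and H(Y|X); that identification is an interpretation, not something stated in the paragraph above.

    \begin{align*}
      E &= H(X \mid Y) && \text{equivocation: remaining uncertainty about the value the sender chose}\\
      D &= H(Y \mid X) && \text{dissipation: the sender's uncertainty about what was received}\\
      T &= E - D       && \text{transformation: the ``saved effort of questioning''}
    \end{align*}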

In practical terms, information processing can be described as a cycle in which data (which may have no inherent meaning to the observer) is converted into information (which does have meaning to the observer). This conversion takes one of three forms:

  • Computation uses mathematics (specifically arithmetic) to create information from data. Example: a cash register (either mechanical or digital) uses addition to convert the individual item prices (data) into the total amount owed to the store (information); a minimal sketch of this example follows the list.
  • Transduction is the conversion of one type of energy into another type. Example: A mechanical speaker converts an electric signal (data) into sound waves (information).
  • Translation is the conversion of a string of symbols from one set into another. Example: a person fluent in two languages could rewrite a document written in a language the observer does not understand (data) into a language the observer does understand (information). Note that, in this view, translation is the only form of information processing that cannot yet be performed purely by a machine; it requires an organic brain (electronic programs that 'translate' do so by running complex programs that ultimately rely solely on computation).
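
To illustrate the computation form above, here is a minimal sketch of the cash register example in Python; the function name register_total and the item prices are hypothetical, chosen only for illustration.

    # Cash register sketch: addition converts individual item prices (data)
    # into the total amount owed (information). Prices are kept in cents so
    # the arithmetic stays exact.
    def register_total(prices_in_cents):
        total = 0
        for price in prices_in_cents:
            total += price
        return total

    # Hypothetical scanned items: 2.50, 4.99 and 1.25, expressed in cents.
    items = [250, 499, 125]
    print(register_total(items))  # 874 cents, i.e. 8.74

The numbers the register holds are just data; it is the observer reading 874 cents as "the amount owed" that turns the result into information, which is exactly the conversion the cycle above describes.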

Within the field of cognitive psychology, information processing is an approach to the goal of understanding human thinking that arose in the 1940s and 1950s. The essence of the approach is to see cognition as essentially computational in nature, with the mind as the software and the brain as the hardware. The information processing approach in psychology is closely allied to cognitivism in psychology and functionalism in philosophy, although the terms are not quite synonymous. Information processing may be sequential or parallel, and either may be centralized or decentralized (distributed). The parallel distributed processing approach of the mid-1980s became popular under the name connectionism. In the early 1950s Friedrich Hayek was ahead of his time when he posited the idea of spontaneous order in the brain arising out of decentralized networks of simple units (neurons); however, Hayek is rarely cited in the literature of connectionism.
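
The "decentralized networks of simple units" mentioned above can be made concrete with a brief sketch. The Python fragment below is illustrative only: the function simple_unit, the weights and the threshold values are hypothetical and do not come from any particular connectionist model.

    # A single connectionist unit: it computes a weighted sum of its inputs
    # and passes the result through a threshold. There is no central
    # controller; a "network" is simply many such units running in parallel.
    def simple_unit(inputs, weights, threshold):
        activation = sum(i * w for i, w in zip(inputs, weights))
        return 1 if activation >= threshold else 0

    # Two hypothetical units reading the same inputs.
    inputs = [1, 0, 1]
    units = [([0.5, 0.2, 0.3], 0.7), ([0.1, 0.9, 0.1], 0.5)]
    outputs = [simple_unit(inputs, w, t) for (w, t) in units]
    print(outputs)  # [1, 0]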





