Computational theory of mind



In philosophy of mind, the computational theory of mind (CTM), also known as computationalism, is a family of views that hold that the human mind is an information processing system and that cognition and consciousness together are a form of computation. Warren McCulloch and Walter Pitts (1943) were the first to suggest that neural activity is computational. They argued that neural computations explain cognition.[1] The theory was proposed in its modern form by Hilary Putnam in 1967, and developed by his PhD student, philosopher, and cognitive scientist Jerry Fodor in the 1960s, 1970s, and 1980s.[2][3] It was vigorously disputed in analytic philosophy in the 1990s due to work by Putnam himself, John Searle, and others.

The computational theory of mind holds that the mind is a computational system that is realized (i.e. physically implemented) by neural activity in the brain. The theory can be elaborated in many ways, and it varies largely based on how the term computation is understood. Computation is commonly understood in terms of Turing machines, which manipulate symbols according to a rule in combination with the internal state of the machine. The critical aspect of such a computational model is that we can abstract away from the particular physical details of the machine implementing the computation.[3] For example, the appropriate computation could be implemented either by silicon chips or by biological neural networks, so long as there is a series of outputs based on manipulations of inputs and internal states, performed according to a rule. CTM therefore holds that the mind is not simply analogous to a computer program, but that it is literally a computational system.[3]

Computational theories of mind are often said to require mental representation because 'input' into a computation comes in the form of symbols or representations of other objects. A computer cannot compute an actual object but must interpret and represent the object in some form and then compute the representation. The computational theory of mind is related to the representational theory of mind in that they both require that mental states are representations. However, the representational theory of mind shifts the focus to the symbols being manipulated. This approach better accounts for systematicity and productivity.[3] In Fodor's original views, the computational theory of mind is also related to the language of thought. The language of thought theory allows the mind to process more complex representations with the help of semantics. (See below in semantics of mental states).
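To make the appeal to structured representations concrete, here is a minimal sketch in Python; the relation and constant names are illustrative placeholders, not drawn from any particular theory. It shows how a small stock of atomic symbols plus one combinatorial rule yields many distinct representations (productivity), and why a system that can build one combination can build its permutations (systematicity).

```python
# Toy structured representations in the spirit of a language of thought.
from itertools import product

CONSTANTS = ["john", "mary"]
RELATIONS = ["loves", "fears"]

def represent(relation, agent, patient):
    """Compose a structured representation from atomic symbols."""
    return (relation, agent, patient)

# Productivity: a small symbol stock yields many distinct "thoughts".
thoughts = [represent(r, a, p)
            for r, (a, p) in product(RELATIONS, product(CONSTANTS, CONSTANTS))]

# Systematicity: the rule that builds loves(john, mary) also builds
# loves(mary, john), so the capacity for one entails the other.
assert ("loves", "john", "mary") in thoughts
assert ("loves", "mary", "john") in thoughts
```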

Recent work has suggested that we make a distinction between the mind and cognition. Building from the tradition of McCulloch and Pitts, the computational theory of cognition (CTC) states that neural computations explain cognition.[1] The computational theory of mind asserts that not only cognition, but also phenomenal consciousness or qualia, are computational; that is to say, CTM entails CTC. CTC, by contrast, leaves open the possibility that some aspects of the mind are non-computational: phenomenal consciousness could fulfill some other, non-computational role. CTC therefore provides an important explanatory framework for understanding neural networks, while avoiding counter-arguments that center on phenomenal consciousness.

"Computer metaphor"

Computational theory of mind is not the same as the computer metaphor, comparing the mind to a modern-day digital computer.[4] Computational theory just uses some of the same principles as those found in digital computing.[4] While the computer metaphor draws an analogy between the mind as software and the brain as hardware, CTM is the claim that the mind is a computational system. More specifically, it states that a computational simulation of a mind is sufficient for the actual presence of a mind, and that a mind truly can be simulated computationally.

'Computational system' is not meant to mean a modern-day electronic computer. Rather, a computational system is a symbol manipulator that follows step-by-step rules to transform input into output. Alan Turing described this type of computer in his concept of a Turing machine.
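For concreteness, the following is a minimal sketch of such a machine in Python; the rule-table format and all names are illustrative assumptions, not a canonical formulation. Rules map a (state, symbol) pair to a new state, a symbol to write, and a head movement, which are exactly the ingredients of Turing's abstract machine.

```python
# A minimal Turing-machine sketch: a symbol manipulator whose next move
# depends only on its internal state and the symbol under the head.

def run_turing_machine(rules, tape, state="start", halt="halt"):
    """Apply rules of the form (state, symbol) -> (state', symbol', move)."""
    tape = list(tape)
    head = 0
    while state != halt:
        symbol = tape[head] if head < len(tape) else "_"  # "_" marks a blank
        state, new_symbol, move = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)  # extend the tape on demand
        head += 1 if move == "R" else -1
    return "".join(tape)

# A toy rule table: scan right, flipping 0 <-> 1, and halt at the first blank.
rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(rules, "10110_"))  # prints "01001_"
```

Nothing in the rule table mentions the medium that realizes the states, which is the sense in which CTM abstracts away from silicon chips or neurons.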

Early proponents

One of the earliest proponents of the computational theory of mind was Thomas Hobbes, who said, "by reasoning, I understand computation. And to compute is to collect the sum of many things added together at the same time, or to know the remainder when one thing has been taken from another. To reason, therefore, is the same as to add or to subtract."[5] Since Hobbes lived before the contemporary identification of computing with instantiating effective procedures, he cannot be interpreted as explicitly endorsing the computational theory of mind in the contemporary sense.

Criticism

A range of arguments have been proposed against physicalist conceptions used in computational theories of mind.

An early, though indirect, criticism of the computational theory of mind comes from philosopher John Searle. In his thought experiment known as the Chinese room, Searle attempts to refute the claims that artificially intelligent agents can be said to have intentionality and understanding, and that these systems, because they can be said to be minds themselves, are sufficient for the study of the human mind.[6] Searle asks us to imagine a man in a room with no way of communicating with anyone or anything outside of the room, except for a piece of paper with symbols written on it that is passed under the door. With the paper, the man is to use a series of provided rule books to return paper containing different symbols. Unknown to the man in the room, these symbols are in Chinese, and this process generates a conversation that a Chinese speaker outside of the room can actually understand. Searle contends that the man in the room does not understand the Chinese conversation. Yet this is essentially what the computational theory of mind presents us with: a model in which the mind simply decodes symbols and outputs more symbols. Searle argues that this is not real understanding or intentionality. The argument was originally written as a repudiation of the idea that computers work like minds.

Searle has further raised questions about what exactly constitutes a computation:

the wall behind my back is right now implementing the WordStar program, because there is some pattern of molecule movements that is isomorphic with the formal structure of WordStar. But if the wall is implementing WordStar, if it is a big enough wall it is implementing any program, including any program implemented in the brain.[7]

Objections like Searle's might be called insufficiency objections. They claim that computational theories of mind fail because computation is insufficient to account for some capacity of the mind. Arguments from qualia, such as Frank Jackson's knowledge argument, can be understood as objections to computational theories of mind in this way—though they take aim at physicalist conceptions of the mind in general, and not computational theories specifically.[citation needed]

There are also objections which are directly tailored for computational theories of mind.

Putnam himself (see in particular Representation and Reality and the first part of Renewing Philosophy) became a prominent critic of computationalism for a variety of reasons, including ones related to Searle's Chinese room arguments, questions of world-word reference relations, and thoughts about the mind-body problem. Regarding functionalism in particular, Putnam has claimed, along lines similar to, but more general than, Searle's arguments, that the question of whether the human mind can implement computational states is not relevant to the question of the nature of mind, because "every ordinary open system realizes every abstract finite automaton."[8] Computationalists have responded by aiming to develop criteria describing what exactly counts as an implementation.[9][10][11]

Roger Penrose has proposed that the human mind does not use a knowably sound calculation procedure to understand and discover mathematical intricacies. Drawing on Gödel's incompleteness theorems, he argues that this would mean a normal Turing-complete computer would be unable to ascertain certain mathematical truths that human minds can.[12]

Pancomputationalism

CTM raises a question that remains a subject of debate: what does it take for a physical system (such as a mind, or an artificial computer) to perform computations? A very straightforward account is based on a simple mapping between abstract mathematical computations and physical systems: a system performs computation C if and only if there is a mapping between a sequence of states individuated by C and a sequence of states individuated by a physical description of the system.[13][8]

Putnam (1988) and Searle (1992) argue that this simple mapping account (SMA) trivializes the empirical import of computational descriptions.[8][14] As Putnam put it, "everything is a Probabilistic Automaton under some Description".[15] On such an account, even rocks, walls, and buckets of water, contrary to appearances, are computing systems. Gualtiero Piccinini identifies several different versions of pancomputationalism.[16]
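A toy sketch of the worry, with invented state labels: given any run of pairwise distinct physical states, one can always define a mapping under which, by the lights of SMA, the system implements a given automaton's run.

```python
# Trivialization under the simple mapping account (all labels invented).

# A run of some abstract automaton C: its sequence of computational states.
automaton_run = ["s0", "s1", "s0", "s2"]

# An arbitrary physical trace, e.g. successive microstates of a wall,
# individuated finely enough to be pairwise distinct.
physical_trace = ["w17", "w92", "w03", "w44"]

# The mapping SMA demands: pair the i-th physical state with the i-th
# computational state. Many-to-one is allowed, so repeats in the run are fine.
mapping = dict(zip(physical_trace, automaton_run))

# Under this mapping, the wall's trace counts as the automaton's run.
assert [mapping[w] for w in physical_trace] == automaton_run
```

Since such a mapping can be constructed for virtually any physical trace, SMA on its own makes computational descriptions nearly vacuous; the restricted accounts discussed below are attempts to repair this.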

In response to the trivialization criticism, and to restrict SMA, philosophers of mind have offered different accounts of computational systems. These typically include the causal, semantic, syntactic, and mechanistic accounts.[17] The semantic account restricts computation to systems whose states represent something; the syntactic account imposes a syntactic restriction instead of a semantic one.[17] The mechanistic account was first introduced by Gualtiero Piccinini in 2007.[18]

Notable theorists

  • Daniel Dennett proposed the multiple drafts model, in which consciousness seems linear but is actually blurry and gappy, distributed over space and time in the brain. Consciousness is the computation itself; there is no extra step in which one becomes conscious of the computation.
  • Jerry Fodor argues that mental states, such as beliefs and desires, are relations between individuals and mental representations. He maintains that these representations can only be correctly explained in terms of a language of thought (LOT) in the mind. Further, this language of thought itself is codified in the brain, not just a useful explanatory tool. Fodor adheres to a species of functionalism, maintaining that thinking and other mental processes consist primarily of computations operating on the syntax of the representations that make up the language of thought. In later work (Concepts and The Elm and the Expert), Fodor has refined and even questioned some of his original computationalist views, and adopted LOT2, a highly modified version of LOT.
  • David Marr proposed that cognitive processes have three levels of description: the computational level, which describes the computational problem solved by the cognitive process; the algorithmic level, which presents the algorithm used for computing the problem postulated at the computational level; and the implementational level, which describes the physical implementation of the algorithm postulated at the algorithmic level in the brain. (A toy code illustration of these levels follows this list.)
  • Ulric Neisser coined the term cognitive psychology in his book with that title published in 1967. Neisser characterizes people as dynamic information-processing systems whose mental operations might be described in computational terms.
  • Steven Pinker described the language instinct as an evolved, built-in capacity to learn language (though not writing). His 1997 book How the Mind Works sought to popularize the computational theory of mind for wide audiences.
  • Hilary Putnam proposed functionalism to describe consciousness, asserting that it is the computation that equates to consciousness, regardless of whether the computation is operating in a brain or in a computer.
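Marr's three-way distinction, referenced above, can be made concrete with a small sketch; the choice of sorting as the example problem, and all the code, are illustrative assumptions rather than anything from Marr.

```python
# Toy illustration of Marr's levels of description.
# Computational level: WHAT is computed -- here, sorting a sequence.
# Algorithmic level: HOW -- two different algorithms solve the same problem.
# Implementational level: the physical machinery executing this Python,
# which the code itself abstracts away from.

def insertion_sort(xs):
    """One algorithmic-level solution: build the output by repeated insertion."""
    out = []
    for x in xs:
        i = len(out)
        while i > 0 and out[i - 1] > x:
            i -= 1
        out.insert(i, x)
    return out

def merge_sort(xs):
    """A different algorithmic-level solution to the same computational problem."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

data = [3, 1, 4, 1, 5, 9, 2, 6]
# Same computational-level description, different algorithmic realizations.
assert insertion_sort(data) == merge_sort(data) == sorted(data)
```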

References

  1. Piccinini, Gualtiero & Bahar, Sonya (2012). "Neural Computation and the Computational Theory of Cognition". Cognitive Science. https://onlinelibrary.wiley.com/doi/epdf/10.1111/cogs.12012
  2. Putnam, Hilary, 1961. "Brains and Behavior", originally read as part of the program of the American Association for the Advancement of Science, Section L (History and Philosophy of Science), December 27, 1961, reprinted in Block (1983), and also along with other papers on the topic in Putnam, Mathematics, Matter and Method (1979)
  3. Horst, Steven (2005). "The Computational Theory of Mind". In The Stanford Encyclopedia of Philosophy.
  4. Pinker, Steven (2002). The Blank Slate. New York: Penguin.
  5. Hobbes, Thomas. De Corpore.
  6. Searle, J.R. (1980), "Minds, brains, and programs", The Behavioral and Brain Sciences 3 (3): 417–457, doi:10.1017/S0140525X00005756, http://cogprints.org/7150/1/10.1.1.83.5248.pdf 
  7. Searle, J.R. (1992), The Rediscovery of the Mind 
  8. Putnam, H. (1988). Representation and Reality. Cambridge, Massachusetts: MIT Press. ISBN 978-0-262-66074-7. OCLC 951364040.
  9. Chalmers, D.J. (1996), "Does a rock implement every finite-state automaton?", Synthese 108 (3): 309–333, doi:10.1007/BF00413692, http://cogprints.ecs.soton.ac.uk/archive/00000226/00/199708001.html, retrieved 2009-05-27 
  10. Edelman, Shimon (2008), "On the Nature of Minds, or: Truth and Consequences", Journal of Experimental and Theoretical AI 20 (3): 181–196, doi:10.1080/09528130802319086, http://kybele.psych.cornell.edu/~edelman/Edelman-JETAI.pdf, retrieved 2009-06-12 
  11. Blackmon, James (2012). "Searle's Wall". Erkenntnis 78: 109–117. doi:10.1007/s10670-012-9405-4. 
  12. Penrose, Roger (1994). "Mathematical Intelligence". In Jean Khalfa, ed., What is Intelligence?, chapter 5, pp. 107–136. Cambridge: Cambridge University Press.
  13. Ullian, Joseph S. (March 1971). Review of Hilary Putnam, "Minds and Machines", in Minds and Machines, edited by Alan Ross Anderson, Prentice-Hall, Englewood Cliffs, New Jersey, 1964, pp. 72–97 (reprinted from Dimensions of Mind: A Symposium, edited by Sidney Hook, New York University Press, 1960, pp. 148–179). Journal of Symbolic Logic 36 (1): 177. doi:10.2307/2271581. ISSN 0022-4812.
  14. Smythies, J. R. (November 1993). Review of The Rediscovery of the Mind, by J. R. Searle (MIT Press, 1992). Psychological Medicine 23 (4): 1043–1046. doi:10.1017/s0033291700026507. ISSN 0033-2917.
  15. "Art, Mind, and Religion". Philosophical Books 8 (3): 32. October 1967. doi:10.1111/j.1468-0149.1967.tb02995.x. ISSN 0031-8051.
  16. Piccinini, Gualtiero (2015). "The Mechanistic Account". In Physical Computation. Oxford University Press. pp. 118–151. doi:10.1093/acprof:oso/9780199658855.003.0008. ISBN 978-0-19-965885-5.
  17. Piccinini, Gualtiero (2017). "Computation in Physical Systems". In Zalta, Edward N., ed., The Stanford Encyclopedia of Philosophy (Summer 2017 ed.). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/sum2017/entries/computation-physicalsystems/
  18. Piccinini, Gualtiero (October 2007). "Computing Mechanisms". Philosophy of Science 74 (4): 501–526. doi:10.1086/522851. ISSN 0031-8248.

Further reading

  • Block, Ned, ed (1983). Readings in Philosophy of Psychology. 1. Cambridge, Massachusetts: Harvard University Press. 
  • Chalmers, David (2011). "A computational foundation for the study of cognition". Journal of Cognitive Science 12 (4): 323–357. https://philpapers.org/rec/CHAACF-2. 
  • Crane, Tim (2003). The Mechanical Mind: A Philosophical Introduction to Minds, Machines, and Mental Representation. New York, NY: Routledge. 
  • Fodor, Jerry (1975). The Language of Thought. Cambridge, Massachusetts: MIT Press. 
  • Fodor, Jerry (1995). The Elm and the Expert: Mentalese and Its Semantics. Cambridge, Massachusetts: MIT Press. 
  • Fodor, Jerry (1998). Concepts: Where Cognitive Science Went Wrong. Oxford and New York: Oxford University Press. 
  • Fodor, Jerry (2000). The Mind Doesn't Work That Way: The Scope and Limits of Computational Psychology. Cambridge, MA: MIT Press. 
  • Fodor, Jerry (2010). LOT2: The Language of Thought Revisited. Oxford and New York: Oxford University Press. 
  • Harnad, Stevan (1994). "Computation Is Just Interpretable Symbol Manipulation: Cognition Isn't". Minds and Machines 4 (4): 379–390. doi:10.1007/bf00974165. 
  • Marr, David (1981). Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. Cambridge, Massachusetts: MIT Press. 
  • Piccinini, Gualtiero (2015). Physical Computation: A Mechanistic Account. Oxford University Press. 
  • Pinker, Steven (1997). How the Mind Works. Norton. ISBN 978-0393045352. 
  • Putnam, Hilary (1979). Philosophical Papers: Mathematics, Matter, and Method. 1. Cambridge, Massachusetts: MIT Press. 
  • Putnam, Hilary (1995). Renewing Philosophy. Cambridge, Massachusetts: Harvard University Press. 
  • Pylyshyn, Zenon (1984). Computation and Cognition. Cambridge, Massachusetts: MIT Press. 
  • Searle, John (1992). The Rediscovery of the Mind. Cambridge, Massachusetts: MIT Press. 
  • Zalta, Edward N., ed. "The Computational Theory of Mind". Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/computational-mind/. 


Licensed under CC BY-SA 3.0 | Source: https://handwiki.org/wiki/Computational_theory_of_mind