| Author | Noam Chomsky |
| --- | --- |
| Language | English |
| Subject | Natural language syntax |
| Publisher | Mouton & Co. |
| Publication date | February 1957 |
| Media type | |
| Pages | 117 |
| Preceded by | The Logical Structure of Linguistic Theory (unpublished mimeographed or microfilm version) |
| Followed by | Aspects of the Theory of Syntax |
Syntactic Structures is an important work in linguistics by American linguist Noam Chomsky, originally published in 1957. A short monograph of about a hundred pages, it is recognized as one of the most significant and influential linguistic studies of the 20th century.[1][2] It contains the now-famous sentence "Colorless green ideas sleep furiously",[3] which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning, thus arguing for the independence of syntax (the study of sentence structures) from semantics (the study of meaning).[4][note 1]
Based on lecture notes he had prepared for his students at the Massachusetts Institute of Technology in the mid-1950s,[note 2] Syntactic Structures was Chomsky's first book on linguistics and reflected the contemporary developments in early generative grammar. In it, Chomsky introduced his idea of a transformational generative grammar, succinctly synthesizing and integrating the concepts of transformation (pioneered by his mentor Zellig Harris, but used in a precise and integrative way by Chomsky), morphophonemic rules (introduced by Leonard Bloomfield) and an item-and-process style of grammar description (developed by Charles Hockett).[5][6][7] Here, Chomsky's approach to syntax is fully formal (based on symbols and rules). At its base, Chomsky uses phrase structure rules,[note 3] which break down sentences into smaller parts. These are combined with a new kind of rule which Chomsky called "transformations". This procedure gives rise to different sentence structures.[8] Chomsky stated that this limited set of rules "generates"[9][note 4] all and only the grammatical sentences of a given language, which are infinite in number (not too dissimilar to a notion introduced earlier by Danish linguist Louis Hjelmslev[5]).[10][11] Although not explicitly stated in the book itself, this approach was later interpreted as valuing language's innate place in the mind over language as learned behavior.[note 5][12][note 6][13]
Written when Chomsky was still an unknown scholar,[note 7] Syntactic Structures had a major impact on the study of knowledge, mind and mental processes, becoming an influential work in the formation of the field of cognitive science.[14] It also significantly influenced research on computers and the brain.[note 8] The importance of Syntactic Structures lies in Chomsky's advocacy of a biological perspective on language at a time when it was unusual, and in the context of formal linguistics where it was unexpected.[12] The book led to Chomsky's eventual recognition as one of the founders of what is now known as biolinguistics.[15][16] Some specialists have questioned Chomsky's theory, believing it is wrong to describe language as an ideal system. They also say it gives less value to the gathering and testing of data.[note 9] Nevertheless, Syntactic Structures is credited with changing the course of linguistics in general, and American linguistics in particular, in the second half of the 20th century.
Chomsky's interest in language started at an early age. When he was twelve, he studied Hebrew grammar under his father.[note 10] He also studied Arabic in his first year at the University of Pennsylvania.[note 11] In 1947, he met Zellig Harris, the founder of the university's linguistics department. Harris was an established linguist who did research in the way laid out by American linguist Leonard Bloomfield.[17] He let Chomsky proofread a copy of his book Methods in Structural Linguistics (1951).[note 12] This is how Chomsky came to know a formal theory of linguistics. He soon decided to major in the subject.[18][note 13]
For his thesis, Chomsky set out to apply Harris's methods to Hebrew. Following Harris's advice, he studied logic, philosophy, and mathematics.[19] He found Harris's views on language to be much like Nelson Goodman's work on philosophical systems.[note 14] Chomsky was also influenced by the works of W. V. O. Quine[note 15] and Rudolf Carnap.[note 16][note 17] Quine showed that one cannot completely verify the meaning of a statement through observations.[20] Carnap had developed a formal theory of language. It used symbols and rules that did not refer to meaning.[21]
From then on, Chomsky tried to build a grammar of Hebrew. Such a grammar would generate the phonetic or sound forms of sentences. To this end, he organized Harris's methods in a different way.[note 18] To describe sentence forms and structures, he came up with a set of recursive rules. These are rules that refer back to themselves. He also found that there were many different ways of presenting the grammar. He tried to develop a method to measure how simple a grammar is.[note 19] For this, he looked for "generalizations" among the possible sets of grammatical rules.[note 20] Chomsky completed his undergraduate thesis The Morphophonemics of Modern Hebrew in 1949. He then published a revised and expanded version of it as his master's thesis in 1951.
In 1951, Chomsky became a Junior Fellow at Harvard University.[22] There, he tried to build a fully formal linguistic theory.[note 21] It was a clear break with the existing tradition of language study.[23] In 1953, Chomsky published his first paper as a scholar.[24] In it, he tried to adapt the symbol-based language of logic to describe the syntax of a human language. During his fellowship, Chomsky organized all his ideas into a huge manuscript, around 1,000 typewritten pages long. He gave it the title The Logical Structure of Linguistic Theory (LSLT).[25]
In 1955, Chomsky found a job at MIT. He worked there as a linguist in the mechanical translation project.[26] The same year he submitted his doctoral dissertation to the University of Pennsylvania. The university granted him a Ph.D. for his thesis Transformational Analysis. In fact, it was just the ninth chapter of LSLT.[27]
At the time of its publication, Syntactic Structures presented the state of the art of Zellig Harris's formal model of language analysis, which is called transformational generative grammar.[5] It can also be said to present Chomsky's version or Chomsky's theory, because there is some original input on a more technical level. The central concepts of the model, however, follow from Louis Hjelmslev's book Prolegomena to a Theory of Language, published in Danish in 1943 and translated into English by Francis J. Whitfield in 1953.[5][6][7][28] Hjelmslev's book sets up an algebraic tool for linguistic analysis consisting of terminals and inventories of all the different types of linguistic units, similar to the terminal and nonterminal symbols of formal grammars. First, it functions as a descriptive device, or as Hjelmslev explains it:
"We demand for example from the theory of language that it allow to describe correctly and exhaustively not only such given French text, but also all existing French texts, and not only these but also all possible and conceivable French texts."[29]
When this work is done to a satisfactory level, it will also become possible to predict all the grammatical sentences of a given language:
"Thanks to the linguistic knowledge thus acquired, we will be able to construct, for the same language, all conceivable or theoretically possible texts."[30]
Hjelmslev also points out that an algorithmic description of a language could generate an infinite number of products from a finite number of primitive elements:[5]
"When we compare the inventories yielded at the various stages of the deduction, their size will usually turn out to decrease as the procedure goes on. If the text is unrestricted, i.e., capable of being prolonged through constant addition of further parts … it will be possible to register an unrestricted number of sentences"[13]
These are logical consequences of the mathematical systems proposed by David Hilbert and Rudolf Carnap, which were first adopted into linguistics by Hjelmslev,[5] whose ideas are reiterated by Chomsky:
"The fundamental aim in the linguistic analysis of a language L is to separate the grammatical sequences which are the sentences of L from the ungrammatical sequences which are not sentences of L. The grammar of L will thus be a device that generates all of the grammatical sequences of L and none of the ungrammatical ones"
— Noam Chomsky, Syntactic Structures
Chomsky likewise states that a recursive device such as closed loops would allow the grammar to generate an infinite number of sentences.[31]
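The role of recursion in such a device can be sketched with a toy example (the grammar below is invented for illustration and is not taken from Hjelmslev or Chomsky): because the rules for NP and PP refer back to each other, a finite rule set licenses an unbounded set of phrases.

```python
import random

# Toy recursive grammar (illustrative only): NP and PP mention each other,
# so finitely many rules describe arbitrarily long phrases.
GRAMMAR = {
    "NP": [["Det", "N"], ["Det", "N", "PP"]],
    "PP": [["P", "NP"]],
    "Det": [["the"]],
    "N": [["cat"], ["house"], ["garden"]],
    "P": [["in"], ["near"]],
}

def expand(symbol):
    """Rewrite a symbol until only terminal words remain."""
    if symbol not in GRAMMAR:          # terminal word
        return [symbol]
    words = []
    for part in random.choice(GRAMMAR[symbol]):
        words.extend(expand(part))
    return words

print(" ".join(expand("NP")))  # e.g. "the cat in the garden near the house"
```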
Although the Bloomfieldian linguists of the early to mid-20th century were nicknamed 'American structuralists', they essentially rejected the basic tenets of structuralism: that linguistic form is explained through meaning, and that linguistics belongs to the domain of sociology.[5][6][7]
Chomsky, like Harris and other American linguists, agreed that there is no causal link from semantics to syntax.[5]
How to translate this idea into a scientific statement remained a vexing issue in American linguistics for decades.[5] Harris and Rulon Wells justified analyzing the object as part of the verb phrase on grounds of 'economy'; but this term, again, merely suggested the perceived 'easiness' of the practice.[32]
In Syntactic Structures, Chomsky changes the meaning of Hjelmslev's principle of arbitrariness, which held that the generative calculus is merely a tool for the linguist and not a structure in reality.[5][13] David Lightfoot, however, points out in his introduction to the second edition that there were few points of true interest in Syntactic Structures itself, and that the eventual interpretations of the rules or structures as 'cognitive', innate, or biological were made elsewhere, especially in the context of the debate between Chomsky and the advocates of behaviorism.[12] Decades later, however, Chomsky made the clear statement that syntactic structures, including the object as a dependent of the verb phrase, are caused by a genetic mutation in humans.[33]
In 1955, Chomsky received a doctorate in linguistics. Even so, he struggled at first to publish his theory and views on language.[34] He offered the manuscript of The Logical Structure of Linguistic Theory (LSLT) for publication, but MIT's Technology Press refused to publish it. He also saw a paper promptly rejected by the academic linguistics journal WORD.[note 22] So he remained an outsider to the field of linguistics. His reviews and articles at the time were mostly published in non-linguistic journals.[35][note 23]
Mouton & Co. was a Dutch publishing house based in The Hague. It had gained an academic reputation by publishing works on Slavic studies since 1954.[36] In particular, it had published works by the linguists Nicolaas Van Wijk and Roman Jakobson. Soon it started a new series called Janua Linguarum, or the "Gate of Languages."[37] It was intended to be a series of "small monographs" on general linguistics.[note 24] The first volume of the Janua Linguarum series was written by Roman Jakobson and Morris Halle. It was called Fundamentals of Language, published in 1956.[38] Chomsky had already met Jakobson, a professor at Harvard University, during his fellowship years. Halle was Chomsky's graduate classmate at Harvard and then a close colleague at MIT. In 1956, Chomsky and Halle collaborated on an article on phonology, published in a festschrift for Jakobson that Mouton brought out the same year.[39]
Cornelis van Schooneveld was the editor of the Janua Linguarum series at Mouton. He was a Dutch linguist and a direct student of Jakobson.[40] He was looking for monographs to publish in his series, and he visited Chomsky at MIT in 1956. With Morris Halle's (and possibly Jakobson's) mediation,[36] Chomsky showed van Schooneveld his notes for his introductory linguistics course for undergraduate students. Van Schooneveld took an interest in them and offered to publish an expanded version of them at Mouton, to which Chomsky agreed.[note 2]
Chomsky then prepared a manuscript of a suitable size (no longer than 120 pages)[note 25] to fit the series. After revising an earlier manuscript, Chomsky sent a final version to van Schooneveld in the first week of August 1956.[note 26] The editor had Chomsky change the title to Syntactic Structures for commercial reasons.[note 27] The book was also pre-ordered in large numbers by MIT. This gave Mouton further incentive to publish it, and the monograph finally appeared in the second week of February 1957.
Soon after the book's first publication, Bernard Bloch, editor of the prestigious journal Language, gave linguist Robert Benjamin Lees, a colleague of Chomsky's at MIT, the opportunity to write a review of the book. Lees's very positive[note 28] essay-length review appeared in the July–September 1957 issue of Language.[41] This early but influential review made Syntactic Structures visible on the linguistic research landscape. Shortly thereafter the book created a putative "revolution" in the discipline.[note 29] Later, some linguists began to question whether this was really a revolutionary breakthrough.[42] A critical and elaborate account is given in Chomskyan (R)evolutions.[43] Although Frederick Newmeyer states that "the publication of Syntactic structures has had profound effects, both intellectually for the study of language and sociologically for the field of linguistics",[44][45] John R. Searle, three decades after his original review, wrote that "Judged by the objectives stated in the original manifestoes, the revolution has not succeeded. Something else may have succeeded, or may eventually succeed, but the goals of the original revolution have been altered and in a sense abandoned."[46] As for LSLT, it would be 17 more years before it saw publication.[47]
Syntactic Structures was the fourth book in the Janua Linguarum series. It was the series's bestselling book, reprinted 13 times by 1978.[48] In 1962, a Russian translation by Konstantin Ivanovich Babisky, titled Синтакси́ческие структу́ры (Sintaksychyeskiye Struktury), was published in Moscow.[49] In 1963, Yasuo Isamu wrote a Japanese translation of the book, named Bunpō no kōzō (文法の構造).[50] In 1969, a French translation by Michel Braudeau, titled Structures Syntaxiques, was published by Éditions du Seuil in Paris.[51] In 1973, Mouton published a German translation by Klaus-Peter Lange, titled Strukturen der Syntax.[52] The book has also been translated into Korean,[53] Spanish,[54] Italian,[55] Czech,[56] Serbo-Croatian[57] and Swedish.[58]
In Syntactic Structures, Chomsky tries to construct a "formalized theory of linguistic structure". He places emphasis on "rigorous formulations" and "precisely constructed models".[59] In the first chapter of the book, he gives a definition of human language syntax. He then talks about the goals of syntactic study. For Chomsky, a linguist's goal is to build a grammar of a language. He defines grammar as a device which produces all the sentences of the language under study. Secondly, a linguist must find the abstract concepts beneath grammars to develop a general method. This method would help select the best possible device or grammar for any language given its corpus. Finally, a linguistic theory must give a satisfactory description of all the levels of language analysis. Examples of these levels include sounds, words and sentence structures.[60]
The second chapter is titled "The Independence of Grammar". In it, Chomsky states that a language is "a set ... of sentences, each finite in length and constructed out of a finite set of elements". A linguist should separate the "grammatical sequences" or sentences of a language from the "ungrammatical sequences".[9] By a "grammatical" sentence Chomsky means a sentence that is intuitively "acceptable to a native speaker".[9] It is a sentence pronounced with a "normal sentence intonation". It is also "recall[ed] much more quickly" and "learn[ed] much more easily".[61]
Chomsky then examines the basis of "grammaticality" in more detail. He identifies three criteria that do not determine whether a sentence is grammatical. First, a grammatical sentence need not be included in a corpus. Secondly, it need not be meaningful. Finally, it does not have to be statistically probable. Chomsky illustrates all three points with the nonsensical sentence "Colorless green ideas sleep furiously."[3] He writes that the sentence is instinctively "grammatical" to a native English speaker, even though it was not included in any corpus known at the time and is neither meaningful nor statistically probable.
Chomsky concludes that "grammar is autonomous and independent of meaning." He adds that "probabilistic models give no particular insight into some of the basic problems of syntactic structure."[4]
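The argument can be illustrated with a small sketch (the toy corpus and the part-of-speech pattern below are invented for illustration, not taken from the book): neither word order is attested in the corpus, yet only one of them matches a pattern that a grammar would license.

```python
# Invented toy data; the only point is that corpus attestation and
# grammaticality can come apart.
POS = {"colorless": "Adj", "green": "Adj", "ideas": "N",
       "sleep": "V", "furiously": "Adv"}
CORPUS = "the cat sleeps . the dog barks loudly ."

def attested(words):
    """Is the word sequence found anywhere in the toy corpus?"""
    return " ".join(words) in CORPUS

def fits_pattern(words):
    """Does the sequence match the schematic pattern Adj* N V Adv?"""
    tags = [POS[w] for w in words]
    while tags and tags[0] == "Adj":
        tags = tags[1:]
    return tags == ["N", "V", "Adv"]

for sentence in ("colorless green ideas sleep furiously",
                 "furiously sleep ideas green colorless"):
    words = sentence.split()
    print(sentence, "| attested:", attested(words),
          "| fits pattern:", fits_pattern(words))
```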
British linguist Marcus Tomalin stated that a version of "Colorless green ideas sleep furiously" had been suggested decades earlier by Rudolf Carnap.[62][63] The German philosopher had offered the pseudo-sentence "Piroten karulieren elatisch" in 1934.[64] According to American linguist Reese Heitner, Carnap's sentence showed the autonomy of both syntactic and phonological structures.[note 30]
In the third chapter titled "An Elementary Linguistic Theory", Chomsky tries to determine what sort of device or model gives an adequate account of a given set of "grammatical" sentences.[65] Chomsky hypothesizes that this device has to be finite instead of infinite. He then considers finite state grammar, a communication theoretic model[note 31] which treats language as a Markov process.[66] Then in the fourth chapter titled "Phrase Structure", he discusses phrase structure grammar, a model based on immediate constituent analysis.[67] In the fifth chapter titled "Limitations of Phrase Structure Description", he claims to show that both these models are inadequate for the purpose of linguistic description. As a solution, he introduces transformational generative grammar (TGG), "a more powerful model ... that might remedy these inadequacies."[10]
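The contrast that drives this argument can be sketched as follows (the nested "if ... then ..." grammar and the transition table are simplified illustrations, not the book's own examples): nested dependencies pattern roughly like aⁿbⁿ, which a recursive rewrite rule captures directly, while a memoryless word-by-word (Markov) process cannot, since it keeps no record of how many dependencies it has opened.

```python
import random

def nested(n):
    """Recursive rule: S -> "if" S "then it pours", base case "it rains"."""
    if n == 0:
        return ["it", "rains"]
    return ["if"] + nested(n - 1) + ["then", "it", "pours"]

def markov(length):
    """Memoryless generator: each next word depends only on the previous word."""
    transitions = {"if": ["it"], "it": ["rains", "pours"],
                   "rains": ["then", "if"], "pours": ["then", "if"],
                   "then": ["it", "if"]}
    word, out = "if", ["if"]
    for _ in range(length):
        word = random.choice(transitions[word])
        out.append(word)
    return out

print(" ".join(nested(2)))   # every "if" is paired with a later "then"
print(" ".join(markov(12)))  # "if"s and "then"s need not balance
```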
Chomsky's transformational grammar has three parts: phrase structure rules, transformational rules and morphophonemic rules.[68] The phrase structure rules are used for expanding lexical categories and for substitutions. These yield a string of morphemes. A transformational rule "operates on a given string ... with a given constituent structure and converts it into a new string with a new derived constituent structure."[8] It "may rearrange strings or may add or delete morphemes."[69] Transformational rules are of two kinds: obligatory and optional. Obligatory transformations applied to the "terminal strings" of the grammar produce the "kernel of the language".[68] Kernel sentences are simple, active, declarative and affirmative sentences. To produce passive, negative, interrogative or complex sentences, one or more optional transformational rules must be applied in a particular order to the kernel sentences. At the final stage of the grammar, morphophonemic rules convert a string of words into a string of phonemes.[69] Chomsky then applies this idea of transformational rules to the English auxiliary verb system.[70]
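A minimal sketch of the optional passive transformation in roughly the form described above (simplified notation, not the book's exact formulation): a kernel string analyzed as NP1 - Aux - V - NP2 is rewritten as NP2 - Aux + be + en - V - by + NP1, after which morphophonemic rules spell out the words.

```python
# Rough sketch of a passive-style transformation over an analyzed kernel
# string (simplified notation; not the book's exact rule).
def passive(np1, aux, verb, np2):
    """Rewrite NP1 - Aux - V - NP2 as NP2 - Aux be V+en - by NP1."""
    return [np2, aux, "be", verb + "+en", "by", np1]

# kernel sentence "the man opened the door" (Aux here is just past tense)
kernel = ("the man", "PAST", "open", "the door")
print(passive(*kernel))
# -> ['the door', 'PAST', 'be', 'open+en', 'by', 'the man']
# after morphophonemic spell-out: "the door was opened by the man"
```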
In Syntactic Structures, the term "transformation" was borrowed from the works of Zellig Harris, Chomsky's initial mentor. Harris used the term to describe equivalence relations between sentences of a language. By contrast, Chomsky used it to describe a formal rule applied to the underlying structures of sentences.[71]
Chomsky also borrowed the term "generative" from a previous work of mathematician Emil Post.[note 32] Post wanted to "mechanically [derive] inferences from an initial axiomatic sentence".[72] Chomsky applied Post's work on logical inference to describe sets of strings (sequences of letters or sounds) of a human language. When he says that a finite set of rules "generates" (i.e. "recursively enumerates"[73]) the potentially infinite set of sentences of a particular human language, he means that the rules provide an explicit, structural description of those sentences.[note 33]
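What "recursively enumerate" amounts to can be sketched with a breadth-first expansion of a toy grammar (the grammar is invented for illustration): the procedure lists, one after another, every terminal string the rules license and nothing else, even though the set of such strings is unbounded.

```python
from collections import deque

# Toy grammar (illustrative only); the PP rule makes the sentence set infinite.
RULES = {
    "S": [["NP", "VP"]],
    "NP": [["the", "cat"], ["the", "cat", "PP"]],
    "PP": [["near", "NP"]],
    "VP": [["sleeps"]],
}

def enumerate_sentences(limit):
    """Yield the first `limit` terminal strings derivable from S."""
    queue, produced = deque([["S"]]), 0
    while queue and produced < limit:
        string = queue.popleft()
        i = next((k for k, sym in enumerate(string) if sym in RULES), None)
        if i is None:                      # no nonterminals left: a sentence
            yield " ".join(string)
            produced += 1
            continue
        for expansion in RULES[string[i]]:
            queue.append(string[:i] + expansion + string[i + 1:])

for sentence in enumerate_sentences(3):
    print(sentence)
```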
In the sixth chapter titled "On the Goals of Linguistic Theory", Chomsky writes that his "fundamental concern" is "the problem of justification of grammars".[10] He draws parallels between the theory of language and theories in physical sciences. He compares a finite corpus of utterances of a particular language to "observations". He likens grammatical rules to "laws" which are stated in terms of "hypothetical constructs" such as phonemes, phrases, etc.[10] According to Chomsky, the criteria for the "justification of grammars" are "external conditions of adequacy", the "condition of generality" and "simplicity". To choose the best possible grammar for a given corpus of a given language, Chomsky shows his preference for the "evaluation procedure" (which uses the aforementioned criteria). He rejects the "discovery procedure"[note 34] (employed in structural linguistics and supposed to automatically and mechanically produce the correct grammar of a language from a corpus[note 35]). He also dismisses the "decision procedure" (supposed to automatically choose the best grammar for a language from a set of competing grammars).[74] Chomsky thus shows preference for "explanatory depth" with some "empirical inadequacies" over the pursuit of very detailed empirical coverage of all data.[note 36]
In the seventh chapter titled "Some Transformations in English", Chomsky applies his newly proposed transformation-based approach to some aspects of English. He treats at length the formation of English negative and passive sentences, yes-no and wh- interrogative sentences, etc. He claims in the end that transformational analysis can describe "a wide variety of ... distinct phenomena" in English grammar in a "simple", "natural" and "orderly" way.[note 37]
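The derivations in this chapter build on Chomsky's auxiliary analysis, whose obligatory affix-movement step is commonly known as "affix hopping". The sketch below uses simplified notation and a hand-written spell-out table (both are illustrative, not the book's formalism): in the underlying string, each affix precedes the verbal element it ends up attached to, and a single rule moves every affix onto the item that follows it.

```python
# Simplified "affix hopping": attach each affix to the element after it,
# then spell out the resulting pairs. Notation and table are illustrative.
AFFIXES = {"S", "en", "ing"}            # "S" stands in for present tense here

SPELL_OUT = {
    ("have", "S"): "has", ("be", "en"): "been", ("read", "ing"): "reading",
    ("read", "S"): "reads", ("be", "ing"): "being", ("have", "en"): "had",
}

def affix_hop(elements):
    """Move each affix onto the following element and spell the pair out."""
    words, i = [], 0
    while i < len(elements):
        if elements[i] in AFFIXES and i + 1 < len(elements):
            words.append(SPELL_OUT[(elements[i + 1], elements[i])])
            i += 2
        else:
            words.append(elements[i])
            i += 1
    return words

# underlying string for "the man has been reading":
underlying = ["the", "man", "S", "have", "en", "be", "ing", "read"]
print(" ".join(affix_hop(underlying)))   # -> "the man has been reading"
```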
In the eighth chapter titled "The explanatory power of linguistic theory", Chomsky writes that a linguistic theory cannot content itself with just generating valid grammatical sentences. It also has to account for other structural phenomena at different levels of linguistic representation. At a certain linguistic level, two items can be understood as having different meanings yet be structurally indistinguishable within that level. This is called a "constructional homonymity" [sic]. The ambiguity can be resolved by establishing a higher level of linguistic analysis, at which the two items can be clearly shown to have two different structural interpretations. In this way, constructional homonymities at the phonemic level can be resolved by establishing the level of morphology, and so forth. One of the motivations for establishing a distinct, higher level of linguistic analysis is, then, to explain the structural ambiguity due to constructional homonymities at a lower level. On the other hand, each linguistic level also captures some structural similarities within the level that are not explained at lower levels. Chomsky uses this argument as well to motivate the establishment of distinct levels of linguistic analysis.[75]
Chomsky then shows that a grammar which analyzes sentences only up to the phrase structure level contains many constructional homonymities, and the resulting ambiguities need to be explained at a higher level. He then shows how his newly invented "transformational level" can naturally and successfully function as that higher level. He further claims that any phrase structure grammar which cannot explain these ambiguities as successfully as transformational grammar does must be considered "inadequate".[76]
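An illustrative constructional homonymity (a standard textbook example, not necessarily the book's own): two different constituent structures flatten to the very same word string, so the ambiguity is invisible at the level of the string itself and must be stated at a higher level.

```python
# Two constituent structures, one surface string (illustrative example).
def flatten(structure):
    """Recover the surface word string from a nested constituent structure."""
    if isinstance(structure, str):
        return [structure]
    words = []
    for part in structure:
        words.extend(flatten(part))
    return words

parse1 = [["old", "men"], "and", "women"]     # reading: only the men are old
parse2 = ["old", ["men", "and", "women"]]     # reading: both groups are old

for parse in (parse1, parse2):
    print(" ".join(flatten(parse)))           # both print: "old men and women"
```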
In the ninth chapter titled "Syntax and Semantics", Chomsky reminds the reader that his analysis so far has been "completely formal and non-semantic."[77] He then offers many counterexamples to refute some common linguistic assertions about grammar's reliance on meaning. He concludes that the correspondence between meaning and grammatical form is "imperfect", "inexact" and "vague." Consequently, it is "relatively useless" to use meaning "as a basis for grammatical description".[78] To support his point, Chomsky considers a similar relation between semantics and phonology. He shows that building a theory of phonemic distinctness based on meaning would entail a "complex", "exhaustive" and "laborious investigation" of an "immense" and "vast" corpus.[79] By contrast, phonemic distinctness can be easily explained in a "straightforward" way and in "completely non-semantic terms" with the help of "pair tests".[79] Chomsky also claims that a strictly formal, non-semantic framework of syntactic theory might ultimately be useful to support a parallel independent semantic theory.[note 38]
Randy Allen Harris, a specialist in the rhetoric of science, writes that Syntactic Structures "appeals calmly and insistently to a new conception" of linguistic science. He finds the book "lucid, convincing, syntactically daring, the calm voice of reason ... [speaking] directly to the imagination and ambition of the entire field." It also bridged the "rhetorical gulf" to make the message of The Logical Structure of Linguistic Theory (a highly abstract, mathematically dense, and "forbiddingly technical" work) more palatable to the wider field of linguists.[80] In a more detailed examination of the book, Harris finds Chomsky's argumentation in Syntactic Structures "multilayered and compelling". Chomsky not only makes a logical appeal (i.e. logos) to a highly formalized model of language, but also appeals explicitly and tacitly to the ethos of science.[81]
In particular, Chomsky's analysis of the complex English auxiliary verb system in Syntactic Structures had great rhetorical effect. It combined simple phrase structure rules with a simple transformational rule. This treatment was based entirely on formal simplicity. Various linguists have described it as "beautiful", "powerful", "elegant", "revealing", "insightful", "beguiling" and "ingenious".[note 39][note 40][note 41] According to American linguist Frederick Newmeyer, this particular analysis won many "supporters for Chomsky" and "immediately led to some linguists' proposing generative-transformational analysis of particular phenomena".[82] According to British linguist E. Keith Brown, "the elegance and insightfulness of this account was instantly recognized, and this was an important factor in ensuring the initial success of the transformational way of looking at syntax."[83] American linguist Mark Aronoff wrote that this "beautiful analysis and description of some very striking facts was the rhetorical weapon that drove the acceptance of [Chomsky's] theory". He added that in Chomsky's treatment of English verbs, "the convergence of theory and analysis provide a description of facts so convincing that it changed the entire field".[84]
Raymond Oenbring, who holds a doctorate in the rhetoric of science, thinks that Chomsky "overstates the novelty" of transformational rules. He "seems to take all the credit for them" even though a version of them had already been introduced by Zellig Harris in a previous work. He writes that Chomsky himself was "cautious" to "display deference" to prevailing linguistic research. His enthusiastic followers, such as Lees, were by contrast much more "confrontational". They sought to drive a "rhetorical wedge" between Chomsky's work and that of the post-Bloomfieldians (i.e. American linguists of the 1940s and 1950s), arguing that the latter did not qualify as linguistic "science".[85]
In an early review of the book, American structural linguist Charles F. Voegelin wrote that Syntactic Structures posed a fundamental challenge to the established way of doing linguistic research. He stated that it had the potential to accomplish "a Copernican revolution" within linguistics.[86] Another American linguist Martin Joos called the Chomskyan brand of linguistic theory a "heresy" within the Bloomfieldian tradition.[87] These early remarks proved to be prescient. American linguist Paul Postal commented in 1964 that most of the "syntactic conceptions prevalent in the United States" were "versions of the theory of phrase structure grammars in the sense of Chomsky".[88] By 1965, linguists were saying that Syntactic Structures had "mark[ed] an epoch",[89] had a "startling impact"[90] and created a Kuhnian "revolution".[91] British linguist John Lyons wrote in 1966 that "no work has had a greater influence upon the current linguistic theory than Chomsky's Syntactic Structures."[92] British historian of linguistics R. H. Robins wrote in 1967 that the publication of Chomsky's Syntactic Structures was "probably the most radical and important change in direction in descriptive linguistics and in linguistic theory that has taken place in recent years".[93]
Another historian of linguistics, Frederick Newmeyer, considers Syntactic Structures "revolutionary" for two reasons. Firstly, it showed that a formal yet non-empiricist theory of language was possible. Chomsky demonstrated this possibility in a practical sense by formally treating a fragment of English grammar. Secondly, it put syntax at the center of the theory of language. Syntax was recognized as the focal point of language production, in which a finite set of rules can produce an infinite number of sentences. As a result, morphology (i.e. the study of the structure and formation of words) and phonology (i.e. the study of the organization of sounds in languages) were relegated in importance.[94]
American linguist Norbert Hornstein wrote that before Syntactic Structures, linguistic research was overly preoccupied with creating hierarchies and categories of all observable language data. One of the "lasting contributions" of Syntactic Structures is that it shifted linguistic research methodology toward abstract, rationalist theory-making grounded in contact with data, which is the "common scientific practice".[95]
The generative grammar of Syntactic Structures heralded Chomsky's mentalist perspective in linguistic analysis. Shortly after its publication, in 1959, Chomsky wrote a critical review[96] of B.F. Skinner's Verbal Behavior.[97] Skinner had presented the acquisition of human language in terms of conditioned responses to outside stimuli and reinforcement. Chomsky opposed this behaviorist model. He argued that humans produce language using separate syntactic and semantic components inside the mind. He presented the generative grammar as a coherent abstract description of this underlying psycholinguistic reality.[96] Chomsky's argument had a forceful impact on psycholinguistic research. It changed the course of the discipline in the following years.[note 5]
Syntactic Structures initiated an interdisciplinary dialog between philosophers of language and linguists. American philosopher John Searle called it a "remarkable intellectual achievement" of its time. He compared the book "to the work of Keynes or Freud". He credited it with producing not only a "revolution in linguistics", but also having a "revolutionary effect" on "philosophy and psychology".[98] Chomsky and Willard Van Orman Quine, a stridently anti-mentalistic philosopher of language, debated the merits of Chomsky's linguistic theories many times.[99] Many philosophers supported Chomsky's idea that natural languages are innate and syntactically rule-governed. They also believed in the existence of rules in the human mind which bind meanings to utterances. The investigation of these rules started a new era in philosophical semantics.[note 42][note 43]
With its formal and logical treatment of language, Syntactic Structures also brought linguistics and the new field of computer science closer together. Computer scientist Donald Knuth (winner of the Turing Award) recounted that he read Syntactic Structures in 1961 and was influenced by it.[note 44] Chomsky's "Three models" paper (Chomsky 1956), published a year before Syntactic Structures and containing many of its ideas, was crucial to the development of the theory of formal languages within computer science.[note 45]
In 2011, a group of French neuroscientists conducted research to test whether actual brain mechanisms work in the way that Chomsky suggested in Syntactic Structures. The results suggested that specific regions of the brain handle syntactic information in an abstract way, independently of other brain regions that handle semantic information. Moreover, the brain analyzes not just mere strings of words, but hierarchical structures of constituents. These observations validated the theoretical claims of Chomsky in Syntactic Structures.[100]
In 2015, neuroscientists at New York University conducted experiments to test whether the human brain uses "hierarchical structure building" for processing languages. They measured the magnetic and electric activities in the brains of participants. The results showed that "[human] brains distinctly tracked three components of the phrases they heard." This "[reflected] a hierarchy in our neural processing of linguistic structures: words, phrases, and then sentences—at the same time." These results bore out Chomsky's hypothesis in Syntactic Structures of an "internal grammar mechanism" inside the brain.[101]
In his 1964 presidential address to the Linguistic Society of America, American linguist Charles Hockett considered Syntactic Structures one of "only four major breakthroughs in modern linguistics".[102][note 46] But he rapidly turned into a fierce critic of Chomskyan linguistics. By 1966, Hockett rejected "[Chomsky's] frame of reference in almost every detail".[103] In his 1968 book The State of the Art, Hockett writes that Chomsky's main fallacy is that he treats language as a formal, well-defined, stable system and proceeds from this idealized abstraction. Hockett believes such an idealization is not possible. He claims that there is no empirical evidence that our language faculty is, in reality, a well-defined underlying system. The sources that give rise to the language faculty in humans, e.g. physical genetic transmission and cultural transmission, are themselves poorly defined.[note 47] Hockett also opposed Chomsky's hypothesis that syntax is completely independent of the study of meaning.[104]
Contrary to Hockett, British linguist Geoffrey Sampson thought that Chomsky's assumptions about a well-defined grammaticality are "[justified] in practice." In his view, this idealization brought syntax "within the purview of scientific description", and he considers it a "great positive contribution to the discipline".[105] However, he maintains that Chomsky's linguistics is overly "intuition-based". For him, it relies too much on native speakers' subjective introspective judgments about their own language. Consequently, language data empirically observed by impersonal third parties are given less importance.[106]
According to Sampson, Syntactic Structures largely owes its good fortune of becoming the dominant theoretical paradigm in the following years to the charisma of Chomsky's intellect. Sampson writes that there are many references in Syntactic Structures to Chomsky's own The Logical Structure of Linguistic Theory (LSLT) in matters regarding the formal underpinnings of Chomsky's approach, but LSLT was not widely available in print for decades. Nevertheless, Sampson's argument runs, Syntactic Structures, albeit "sketchy", derived its "aura of respectability" from LSLT lurking in the background. In turn, the acceptance of Chomsky's future works rested on the success of Syntactic Structures.[27] In the view of British-American linguist Geoffrey K. Pullum, Syntactic Structures boldly claims that "it is impossible, not just difficult" for finite-state devices to generate all grammatical sentences of English, and then alludes to LSLT for the "rigorous proof" of this. But in reality, LSLT does not contain a valid, convincing proof dismissing finite-state devices.[107]
Pullum also remarks that the "originality" of Syntactic Structures is "highly overstated". For him, it "does not properly credit the earlier literature on which it draws".[107] He shows in detail how the approach in Syntactic Structures goes directly back to the work of the mathematical logician Emil Post on formalizing proof. But "few linguists are aware of this, because Post's papers are not cited."[107] Pullum adds that the use of formal axiomatic systems to generate probable sentences in language in a top-down manner was first proposed by Zellig Harris in 1947, ten years before the publication of Syntactic Structures. This is downplayed in Syntactic Structures.[107]
In 1982, Pullum and another British linguist Gerald Gazdar argued that Chomsky's criticisms of context-free phrase structure grammar in Syntactic Structures are either mathematically flawed or based on incorrect assessments of the empirical data. They stated that a purely phrase structure treatment of grammar can explain linguistic phenomena better than one that uses transformations.[108][note 48]
In 2000, University of Minnesota's Center for Cognitive Sciences compiled a list of the 100 most influential works in cognitive science from the 20th century. In total, 305 scholarly works and one movie were nominated via the internet. Syntactic Structures was ranked number one on this list, marking it as the most influential work of cognitive science of the century.[note 49]
Syntactic Structures was included in The 100 Most Influential Books Ever Written, a book on intellectual history by British literary critic and biographer Martin Seymour-Smith published in 1998.[109]
Syntactic Structures was also featured in a list of 100 best English language non-fiction books since 1923 picked by the American weekly magazine Time.[2]