3 From the Origins of Government and Binding to the Current State of Minimalism1
ARTEMIS ALEXIADOU1,2 AND TERJE LOHNDAL3,4
Humboldt Universität zu Berlin
Leibniz‐Zentrum Allgemeine Sprachwissenschaft (ZAS)
NTNU Norwegian University of Science and Technology
UiT The Arctic University of Norway
3.1 Setting the Scene
Generative grammar is an approach to the study of language which is explicit, mentalistic, and based on the claim that the ability to acquire language is innately specified.2 The approach is concerned with language as a psychologically real object, whose representations can be studied scientifically. In the words of Chomsky (1975, p. 160): “Linguistics is simply that part of psychology that is concerned with one specific class of steady states, the cognitive structures that are employed in speaking and understanding.” Put differently, “[w]e have grammars in our heads” (Smith and Allott 2016, p. 128). We humans come prewired with the ability to create these grammars: We are born with a unique ability to acquire language.
Throughout its history, three questions have been at the center of this approach. They are given in (1).3
(1) a. What constitutes knowledge of language?
    b. How is knowledge of language acquired?
    c. How is knowledge of language put to use? (Chomsky 1986)
The first question seeks to establish the basis for our linguistic ability. Linguists often speak of this in terms of knowledge, but this is not the kind of knowledge that many philosophers will have in mind. Chomsky (1982, p. 128) says the following:
As I am using the term, knowledge may be unconscious and not accessible to consciousness. It may be “implicit” or “tacit.” No amount of introspection could tell us that we know, or cognize, or use certain rules or principles of grammar, or that use of language involves mental representations formed by these rules and principles. We have no privileged access to such rules and representations.4
Furthermore, generative scholars are interested in how children are able to acquire these representations for each variety or language. Lastly, an important question is how we humans utilize these representations in language use.
Since its inception, generative grammar has made at least three fundamental contributions to our understanding of language: (i) viewing grammars as formal/mathematical objects; (ii) viewing linguistics as psychology and biology, that is, as the study of the emergence and structure of the mental architecture underlying language; and (iii) Universal Grammar, the proposal that there is innate mental structure that is specific to language and that enables children to acquire any language. These results have partly come about through the focus on descriptive adequacy, descriptions of the intrinsic competence of a speaker, and on explanatory adequacy, an account of how a child acquires this intrinsic competence:5
To the extent that a linguistic theory succeeds in selecting a descriptively adequate grammar on the basis of primary linguistic data, we can say that it meets the condition of explanatory adequacy. That is, to this extent, it offers an explanation for the intuition of the native speaker on the basis of an empirical hypothesis concerning the innate predisposition of the child to develop a certain kind of theory to deal with the evidence presented to him. (Chomsky 1965, pp. 25–26; his italics)
This enabled the study of grammars that humans have internalized, unlike, say, studying a finite corpus.
In this chapter, we will outline some of the recent history leading up to contemporary generative grammar. We will first provide some context for the emergence of Principles and Parameters, before we present the basic gist of the Principles and Parameters approach. Then we introduce the first model that was proposed, namely Government and Binding. This is followed by a discussion of the second model, the Minimalist Program, which remains the current model to this day, before we try to outline some of the current trends that shape the field of generative grammar. Lastly, we summarize and conclude the chapter.
3.2 Some Context: The Emerging Idea of Principles and Parameters
A core aspect of generative grammar in its early days was a computational system in the human mind that contained phrase structure rules for building hierarchical structures and more complex operations that were able to modify these phrase structures. The latter were known as transformations, and transformations crucially operated on structures, not, say, sentences as in Harris (1951). This gave rise to the name Transformational Grammar, which is synonymous with generative grammar. Transformations made the theory much more powerful, as they allowed an infinite number of grammars (cf. Lasnik 2000, p. 114; Lasnik and Lohndal 2013, pp. 27–28), raising serious questions regarding learnability: How can a child select the correct target grammar from all available grammars? In this section, we will summarize some of the most important context leading up to the Principles and Parameters approach, which we will present in Section 3.3. For reasons of space, the present section will have to set aside many details, but see Freidin and Lasnik (2011) and Lasnik and Lohndal (2013) for a more detailed exposition.
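To make the first of these two components concrete, here is a minimal sketch in Python (our own illustration, not the chapter's formalism): a handful of toy phrase structure rules that rewrite categories into daughters and thereby build hierarchical structures. The categories, words, and tree encoding are hypothetical simplifications; note that a single recursive rule already generates an unbounded set of structures. Transformations, which operate on such structures, are sketched below in connection with (3) and (4).

```python
import random

# Toy phrase structure rules (our own illustration). Each category
# rewrites to one of several sequences of categories and/or words.
# "NP -> the N PP" together with "PP -> near NP" is recursive, so even
# this tiny grammar generates unboundedly many hierarchical structures.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "PP"]],
    "PP": [["near", "NP"]],
    "VP": [["V", "NP"]],
    "N":  [["problem"], ["student"], ["board"]],
    "V":  [["solved"], ["saw"]],
}

def expand(category):
    """Rewrite a category into a tree (label, daughters); words stay strings."""
    if category not in RULES:
        return category  # terminal word
    daughters = random.choice(RULES[category])
    return (category, [expand(d) for d in daughters])

def leaves(tree):
    """Read the word string off a hierarchical structure."""
    if isinstance(tree, str):
        return [tree]
    return [word for daughter in tree[1] for word in leaves(daughter)]

tree = expand("S")
print(tree)                    # e.g. ('S', [('NP', ['the', ('N', ['student'])]), ...])
print(" ".join(leaves(tree)))  # e.g. "the student saw the board near the problem"
```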
The grammatical architecture proposed in Chomsky (1965) looks as in (2) (Chomsky 1965, pp. 135–136, cf. also Lasnik and Lohndal 2013, p. 36).
(2) [Diagram: phrase structure rules and the lexicon generate Deep Structures; transformations map Deep Structures onto Surface Structures; Deep Structure feeds semantic interpretation, and Surface Structure feeds phonological/phonetic interpretation.]
To give one example, consider simple yes/no questions. In Chomsky (1955, 1957), (3a) and (3b) had the same initial phrase structure (called a phrase marker at the time).
(3) a. Ellie will solve the problem.
    b. Will Ellie solve the problem?
Transformations take the structure of (3a) and transform it into the structure of (3b). The details are not important; readers can consult Lasnik (2000) for a lucid exposition. A remarkable success of this approach, as Lasnik (2005, p. 69) emphasizes, is that it enabled a unified analysis of (3) and (4).
(4) a. Ellie solved the problem.
    b. Did Ellie solve the problem?
Native speakers can sense a relationship between (3b) and (4b), but prior to Chomsky's analysis, there was no account of this. In Chomsky (1965), the technicalities were different, but the intuitions were the same: A common underlying Deep Structure as the basis for both declaratives and interrogatives, and then transformations that altered that structure into a Surface Structure, followed by morphophonological operations that provide the appropriate forms for phonetic interpretation.
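As promised above, here is a companion sketch of the transformational part (again our own toy, not Chomsky's actual machinery): (3) and (4) are built from one underlying phrase marker of the form S → NP Aux VP, where Aux is the overt auxiliary will in (3) and an abstract tense morpheme PAST in (4). A single inversion transformation fronts Aux in both cases, and a toy morphophonological step realizes stranded PAST as did, roughly in the spirit of do-support; the tree encoding, the PAST symbol, and the one-entry lexicon are all hypothetical simplifications.

```python
# A toy rendering (our own simplification, not Chomsky's actual rules).
# One underlying phrase marker, S -> NP Aux VP, underlies both (3) and
# (4); Aux is the auxiliary "will" in (3) and an abstract PAST in (4).

def phrase_marker(aux):
    """The shared underlying structure for (3) and (4)."""
    return ("S", [("NP", ["Ellie"]),
                  ("Aux", [aux]),
                  ("VP", [("V", ["solve"]),
                          ("NP", ["the", "problem"])])])

def invert(tree):
    """Subject-Aux inversion: a transformation from structures to
    structures, fronting Aux past the subject NP."""
    subj, aux, vp = tree[1]
    return ("S", [aux, subj, vp])

def leaves(tree):
    """Read the word string off a tree."""
    if isinstance(tree, str):
        return [tree]
    return [w for d in tree[1] for w in leaves(d)]

PAST_FORM = {"solve": "solved"}  # one-entry toy lexicon

def spell_out(words):
    """Toy morphophonology: PAST suffixes onto an adjacent verb;
    stranded PAST is rescued by 'do' (do-support, in spirit only)."""
    out, i = [], 0
    while i < len(words):
        if words[i] == "PAST":
            nxt = words[i + 1] if i + 1 < len(words) else None
            if nxt in PAST_FORM:
                out.append(PAST_FORM[nxt]); i += 2   # affix lands on the verb
            else:
                out.append("did"); i += 1            # no adjacent verb: do-support
        else:
            out.append(words[i]); i += 1
    return " ".join(out)

for aux in ("will", "PAST"):
    declarative = phrase_marker(aux)
    print(spell_out(leaves(declarative)))          # (3a) / (4a)
    print(spell_out(leaves(invert(declarative))))  # (3b) / (4b)
# -> Ellie will solve the problem / will Ellie solve the problem
# -> Ellie solved the problem / did Ellie solve the problem
```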
Transformations are an essential and powerful part of this architecture. Because of this, work conducted in the late 1960s and 1970s suggested a range of constraints to limit the power of transformations and consequently the range of possible grammars. An example of this is the work by Ross (1967), which proposed constraints on long‐distance dependencies (Ross labeled the domains that block such dependencies islands; see Boeckx 2013, den Dikken and Lahne 2013, and Müller, Chapter