
Part I Historical Development of Linguistics

      ARTEMIS ALEXIADOU1,2 AND TERJE LOHNDAL3,4

      1 Humboldt-Universität zu Berlin
      2 Leibniz-Zentrum Allgemeine Sprachwissenschaft (ZAS)
      3 NTNU Norwegian University of Science and Technology
      4 UiT The Arctic University of Norway

      (1) a. What constitutes knowledge of language?
          b. How is knowledge of language acquired?
          c. How is knowledge of language put to use? (Chomsky 1986)

      The first question seeks to establish the basis for our linguistic ability. Linguists often speak of this in terms of knowledge, but this is not the kind of knowledge that many philosophers have in mind (cf. Chomsky 1982, p. 128). Chomsky (1965, pp. 25–26) formulates the relevant notion of explanatory adequacy as follows:

      To the extent that a linguistic theory succeeds in selecting a descriptively adequate grammar on the basis of primary linguistic data, we can say that it meets the condition of explanatory adequacy. That is, to this extent, it offers an explanation for the intuition of the native speaker on the basis of an empirical hypothesis concerning the innate predisposition of the child to develop a certain kind of theory to deal with the evidence presented to him. (Chomsky 1965, pp. 25–26; his italics)

      This made it possible to study the grammars that humans have internalized, rather than, say, a finite corpus.

      In this chapter, we will outline some of the recent history leading up to contemporary generative grammar. We will first provide some context for the emergence of Principles and Parameters, before presenting the basic gist of the Principles and Parameters approach. Then we introduce the first model that was proposed, namely Government and Binding. This is followed by a discussion of the second and, to date, current model, the Minimalist Program, before we outline some of the current trends that shape the field of generative grammar. Lastly, we summarize and conclude the chapter.

      A core aspect of generative grammar in its early days was a computational system in the human mind that contained phrase structure rules for building hierarchical structures, together with more complex operations that could modify these phrase structures. The latter were known as transformations, and transformations crucially operated on structures, not, say, sentences as in Harris (1951). This gave rise to the name Transformational Grammar, which is synonymous with generative grammar. Transformations made the theory much more powerful, as they allowed an infinite number of possible grammars (cf. Lasnik 2000, p. 114; Lasnik and Lohndal 2013, pp. 27–28), raising serious questions regarding learnability: How can a child select the correct target grammar from among all the available grammars? In this section, we will summarize some of the most important context leading up to the Principles and Parameters approach, which we will present in section 3. For reasons of space, the present section will have to set aside many details, but see Freidin and Lasnik (2011) and Lasnik and Lohndal (2013) for a more detailed exposition.
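      To make this concrete, consider the following sketch, given here purely for illustration (the rule inventory, the category labels, and the Python notation are simplifying assumptions for exposition, not part of any of the proposals discussed in this chapter). Phrase structure rules assign a hierarchical constituent structure to a sentence, and a transformation is then stated over that structure, referring to constituents such as NP and Aux rather than to positions in the word string.

```python
# A minimal sketch (simplified rules and labels chosen only for exposition):
# phrase structure rules such as S -> NP Aux VP, VP -> V NP, NP -> Det N
# assign a hierarchical constituent structure, and a toy transformation
# then manipulates that structure rather than the word string.

from typing import List, Tuple, Union

Tree = Union[str, Tuple[str, List["Tree"]]]   # a word, or (category, daughters)

def node(label: str, *daughters: Tree) -> Tree:
    return (label, list(daughters))

# Hierarchical structure for "Ellie will solve the problem".
declarative: Tree = node(
    "S",
    node("NP", "Ellie"),
    node("Aux", "will"),
    node("VP", node("V", "solve"),
               node("NP", node("Det", "the"), node("N", "problem"))),
)

def terminals(t: Tree) -> str:
    """Read the word string off a hierarchical structure."""
    if isinstance(t, str):
        return t
    _, daughters = t
    return " ".join(terminals(d) for d in daughters)

def invert(t: Tree) -> Tree:
    """A toy subject-aux inversion: the operation is stated over constituents
    (swap the NP and Aux daughters of S), not over positions in the string."""
    label, daughters = t
    assert label == "S" and daughters[0][0] == "NP" and daughters[1][0] == "Aux"
    return (label, [daughters[1], daughters[0], *daughters[2:]])

print(terminals(declarative))          # Ellie will solve the problem
print(terminals(invert(declarative)))  # will Ellie solve the problem
```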

      The grammatical architecture proposed in Chomsky (1965) looks as in (2) (Chomsky 1965, pp. 135–136, cf. also Lasnik and Lohndal 2013, p. 36).

      (2) [diagram of the grammatical architecture proposed in Chomsky (1965)]

      (3) a. Ellie will solve the problem.
          b. Will Ellie solve the problem?

      Transformations take the structure of (3a) and transform it into the structure of (3b). The details are not important here; readers can consult Lasnik (2000) for a lucid exposition. A remarkable success of this approach, as Lasnik (2005, p. 69) emphasizes, is that it enabled a unified analysis of (3) and (4).

      (4) a. Ellie solved the problem.
          b. Did Ellie solve the problem?

      Native speakers can sense a relationship between (3b) and (4b), but prior to Chomsky's analysis there was no account of this. In Chomsky (1965), the technicalities were different, but the intuition was the same: a common underlying Deep Structure serves as the basis for both declaratives and interrogatives, transformations then alter that structure into a Surface Structure, and morphophonological operations finally provide the appropriate forms for phonetic interpretation.
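      The intuition can be illustrated with a second, deliberately simplified sketch (again an expository assumption rather than the actual formalism: the clause is flattened into a list of labelled elements, and the spell-out rule for do is a crude stand-in for the morphophonological component). A single underlying representation feeds both the declarative and the interrogative, and a form of do surfaces only when the question transformation has separated the tense element from the verb.

```python
# A toy sketch: one underlying representation feeds both the declarative and
# the interrogative. For brevity the clause is flattened into a list of
# labelled elements (the actual theory uses hierarchical structures).

deep_will = [("NP", "Ellie"), ("Aux", "will"), ("V", "solve"),
             ("NP", "the problem")]                        # underlies (3a)/(3b)
deep_past = [("NP", "Ellie"), ("Tense", "PAST"), ("V", "solve"),
             ("NP", "the problem")]                        # underlies (4a)/(4b)

def question_transformation(clause):
    """Front the Aux/Tense element over the subject NP (subject-aux inversion)."""
    subject, aux, *rest = clause
    return [aux, subject, *rest]

def spell_out(clause):
    """Crude morphophonology: realize Tense on an adjacent verb, or as a form
    of 'do' when the transformation has separated Tense from the verb."""
    words = []
    for i, (label, value) in enumerate(clause):
        if label == "Tense":
            if i + 1 < len(clause) and clause[i + 1][0] == "V":
                continue                             # Tense attaches to the verb
            words.append("did" if value == "PAST" else value)   # do-support
        elif label == "V":
            tensed = i > 0 and clause[i - 1][0] == "Tense"
            words.append(value + "d" if tensed else value)      # "solved"
        else:
            words.append(value)
    return " ".join(words)

# Capitalization and punctuation set aside:
print(spell_out(deep_will))                             # Ellie will solve the problem
print(spell_out(question_transformation(deep_will)))    # will Ellie solve the problem
print(spell_out(deep_past))                             # Ellie solved the problem
print(spell_out(question_transformation(deep_past)))    # did Ellie solve the problem
```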

      Transformations are an essential and powerful part of this architecture. Because of this, work conducted in the late 1960s and 1970s suggested a range of constraints to limit the power of transformations and, consequently, the range of possible grammars. An example of this is the work by Ross (1967), which proposed constraints on long-distance dependencies (Ross labeled the domains that block such dependencies islands; see Boeckx 2013, den Dikken and Lahne 2013, and Müller, Chapter

