[Note: this page no longer fully and accurately reflects my thinking about the field; for a more recent statement, see the proposal to the new MLA Discussion Group on Cognitive Approaches to Literature. 10/10/98. For an update on the epistemological position, see Cognitive Constructivism.]
We are privileged to be living in the middle of a cognitive revolution, led by linguists, cognitive scientists, neurobiologists, and evolutionary psychologists. Their concerted efforts are beginning to formulate a vision of human cognition increasingly removed from the practice and ideology of the humanities: we are falling asleep at the spinning wheel while the shuttles of other disciplines are busily reweaving our subject matter. This situation cannot be sustained indefinitely; at some point, the humanities will have to engage with the emerging scientific understanding of the human.
Work in linguistics in the early part of this century provided a significant inspiration for literary studies. Ferdinand de Saussure provided the basis for a theory of linguistic meaning that remains a stable point of reference, and Roman Jakobson's work on binary oppositions in phonetics gave Claude Lévi-Strauss the impulse to develop his broadly influential structuralist approach to culture. This interdisciplinary cooperation was fruitfully sustained for several decades.
The more recent work, however, which can be dated back to Chomsky's (1959) critical review of B.F. Skinner's Verbal Behavior, has gone largely unnoticed, and has had only a peripheral effect on the discipline. Yet this work has led to dramatically new ways of understanding language and the mind. Chomsky refers to the changes over the past forty years as "the second cognitive revolution", the first being the materialist approach to nature--inspired by the construction of automata--led by Descartes, which Locke, Hobbes, and Hume later attempted to extend to the mental. The second cognitive revolution is characterized by the computational approach, though it is becoming clear that the analogy of the computer is of limited relevance.
What Chomsky drew attention to is that children do not simply imitate adult speech: they are creative, and produce sentences they have never heard. This is possible because they use generative rules; however, they have not heard enough sentences to deduce those rules from scratch (the "poverty of stimulus" argument). Chomsky concluded that there must be an innate grammar--in Steven Pinker's term, a 'language instinct'--mediated by a specific area of the brain. Since adoption and immigration have shown that any child can learn any language, this 'language organ' is as characteristic of human beings as a heart or a hand--a claim that is beginning to be substantiated by neurological evidence. Plato appears to have been essentially right, and Locke's tabula rasa too much of a simplification: learning is a kind of recollecting, and what is being recollected is the adaptive history of mankind. "It seems that the brain is like other known biological systems", Chomsky (1996) summarizes: "modular, constituted of highly specialized subsystems that have their particular character and domains of operation, interacting in all sorts of ways" (20).
The syntactic approach is sometimes criticized for speaking as if the mind were not situated in the world--in a word, for disregarding semantics. Semantics raises the question of categorization, a central task of cognition, which may be framed as a question of the relation between words and things. Recent fruitful work in semantics may be traced back to Wittgenstein's (1953) realization that the assumption of such a relation cannot be taken for granted. Rather than finding Aristotle's "necessary and sufficient" conditions for class membership (and thus linguistic meaning), Wittgenstein found a network of family resemblances. This thread was picked up by Rosch (1973, 1981), whose innovative work on categorization suggests that categories are based on prototypes, or typical members, rather than on membership conditions. Prototypicality theory is further developed by Lakoff (1987), in the direction of conceptual domains.
The syntactic and the semantic approaches to linguistics have thus by different routes, if not independently, arrived at the notion of cognitive domains. There is, however, vigorous debate about the question of domain specificity. Are domains culturally (experientially) constructed on the basis of a domain-general cognitive faculty, or is there a distinct faculty for each domain? At the input level, there are clearly distinct faculties: the eye, the ear, the vomeronasal organ, and the taste buds are specialized information-gathering and processing devices, with no other physiological purpose. Fodor (1983) argues convincingly that perceptual processes--what we might think of as the path from the surface organs to the site of phenomenological awareness--are carried out by specialized mechanisms or 'modules' that have a range of ideal characteristics: "domain-specific, innately specified, hardwired, autonomous" (36), and "informationally encapsulated" (64)--that is, they do not take input from conceptual processes. Fodor includes language comprehension among his 'input' modules; it comprises a phonological parser that analyzes speech into phonemes (Shaywitz 99), and a Saussurean lexical matrix for accessing meaning. He denies that "central" conceptual processes are modular; however, the line is awkwardly drawn: why should language be considered any more of an input module than, say, our intuitive notions of contact mechanics?
The evidence indicates that complex acts of classification and entailment take place, prior to conscious awareness, in a range of distinct domains (Tooby and Cosmides (1992), 97-100). The work of Atran (1990), Spelke (1988), Leslie (1988) and others documents the existence of cognitive domains in rigid-object mechanics, biology, and psychology. Only some of these researchers (e.g. Baron-Cohen (1995)) assume that the faculty is associated with a specific neural architecture. Sperber (1994) argues that evolution is likely to bring about cognitive modules that solve specific adaptive problems; there are no computational or organic solutions to 'problems in general'.
A related argument may be found in hermeneutics; Gadamer (1975) points out that assumptions are not obstacles to interpretation, but enable it. Similarly, evolved mental structures are not constraints on cognition, but its condition of possibility. It would clearly be absurd, for instance, to claim that the phonological parser, which operates automatically and without conscious awareness on speech input, constitutes a biological constraint that we would be better without.
Saussure's view that language is a system of negative difference is thus erroneous; language, on the contrary, is bristling with differences that stem from underlying psychological and conceptual structures. The initial attraction of the Saussurean view is that it appears superficially to be empowering: if there are no positive definitions, the creativity of language is endless. However, as research in several independent disciplines has shown, initial assumptions are vital for any act of cognition. The postmodern fallacy that less structure means more freedom relies on a folk-psychological concept of constraint: if one examines linguistic creativity, it becomes immediately clear that it is made possible only by building on pan-human cognitive structures and assumptions.
On the other hand, the postmodern resistance to a total theory, a grand récit, finds a correlate in the emerging theory of mind. One consequence of the second cognitive revolution is that the field of knowledge is marked into territories that are not necessarily compatible: since concepts and words do not represent the world in itself--a Kantian insight that still stands--the ontology of the various domains cannot be assumed to be reducible to one, or even ranked according to their degree of reality or truthfulness (cf. Hernadi (1995)). Various cognitive faculties have evolved to define their own domains and deal with their particular problems, and there is no super-faculty with direct or non-representational access to reality. What the postmodern approach misses is precisely the inadequacy of its intuitive concepts: what is required is to question the view of matter and biology as only determining rather than also crucially enabling systems.
The four dimensions of the cognitive revolution--the linguistic (leading to notions of conceptual domains), the computational (leading to notions of cognitive structures), the adaptationist (leading to notions of which problems the structures are designed to solve), and the neurological (leading to notions of embodied structures)--provide ample evidence that cognition is highly structured. What are the consequences of this for the study of literature?
Of particular interest in the study of literature is an understanding of the differences between what Sperber (1994) calls the 'proper' domain of a cognitive faculty (the domain it evolved to deal with) and the characteristics of the actual domain it is acting on. The work of Lakoff (1987) and Turner (1991) on metaphor as a transfer of the characteristics of one domain onto another can be vastly extended, and throws significant light on social and historical events. If we do not have a clear concept of the proper domain, the metaphorical implications--what Lakoff refers to as "preconceptual structures"--will be unclear.
The link with literature is thus most obvious in the case of linguistics; however, linguistics is enabled by cognitive structures (cognitive science) that belong to a proper domain (evolutionary psychology). In order to understand literature and metaphor, one needs a knowledge of linguistic and conceptual domains, which are enabled by cognitive structures that belong to a proper domain. The neurological aspect is interesting as a test case: literature should make the brain light up in many places, drawing on a variety of conceptual domains and cognitive structures.
The misuse of cognitive faculty theory would be to assume that faculties uniquely and linearly determine and explain human behavior. The whole of human history represents the proven minimum of possible human behavior; we can have no knowledge of what the maximum is. In fact, we can be confident that there is no such maximum, since the combinatorial possibilities even of the few faculties we currently know of are endless, and their response to new domains cannot be predicted; add to that all the faculties we do not yet know of.
November 10, 1996
Cognitive Approaches to Literature