The Emergent Lexicon

Joan L. Bybee
Professor and Chair of Linguistics
University of New Mexico
1. Emergence

In the traditional view, the lexicon is a storage area for all and only the content words or morphemes of a language. The lexicon is relatively static compared to the grammar, which contains all the moving parts of sentence generation: in the metaphor of a dictionary, lexical items are just passive items on a list which wait to be recruited into syntactic structures. In this conception also, memory for linguistic material is thought to be lodged primarily in the lexicon, while the grammar is not so much a matter of memory as it is of abstract structure.

Rather than arguing that a lexicon of this type does not exist, I am going to argue that if such a lexicon does exist, it is because it is emergent from the storage of linguistic experience, which is of a very different nature than the traditional conception of a lexicon would suggest. The point of this paper, then, is to explore the properties of stored linguistic experience. The data examined provide support for the proposals that much of linguistic knowledge is procedural knowledge, that chunks of linguistic experience much larger than the analytic units of morphemes or even words are the usual units of storage and processing, that there is no real separation of lexicon from grammar, and that phonological alternations whose domain is larger than a word can indicate the size of processing units.

This paper obviously owes a great deal to the 1987 paper of Paul Hopper's to which the title alludes: "Emergent Grammar". In fact, my basic point is the same as the one Hopper made in that paper: that the 'knowledge' underlying the fluent use of language is not grammar in the sense of abstract structure, but is rather a large store of categorized and sorted previous utterances which form the basis of the production and comprehension of new utterances. The only difference between Hopper's approach and mine is that his goal was the explication of the view of grammar that arises from such a theory, and mine will be the view of the lexicon that is entailed by this theory. Since we would both argue that grammar and lexicon are not separable, our papers are really about the same set of issues.

2. The lexicon reflects linguistic experience

I have argued in various places (Bybee 1985, 1988, 1998a and 1998b), especially in connection with morphology, that lexical storage is highly affected by language use. We know that highly frequent morphological formations, such as irregular nouns and verbs, tend to resist leveling and maintain their irregularities over time. I have called the mechanism behind this tendency 'lexical strength'. This notion corresponds to the psycholinguistic notion that high frequency items have a higher level of resting activation; being easier to access, they are less likely to be replaced by regular formations.

In other avenues of research, I have looked into the role of frequency of use in the phonological and semantic reduction of words and phrases. In a number of cases, documentation on sound change in progress shows a significant frequency effect: high frequency words undergo sound change at a faster rate than low frequency words, as in t/d-deletion in American English (Bybee 1998b), [ð]-deletion in New Mexican Spanish, and many other examples (see Phillips 1984). Furthermore, all of these changes are phonetically gradual.
My account of the frequency effect in the diffusion of sound change is as follows: Sound change has its source in the reduction and retiming of articulatory gestures that affect words or phrases as they are used in context. The memory representation of the phonetic shape of a word is a categorization of the tokens of use that have been experienced and thus represents a range of variation (Miller 1994). Thus each use of a word or phrase has an effect on the stored representation. Words that are used more often in contexts where reduction and change are favored (familiar speech situations, high frequency phrases) will undergo a gradual shift in phonetic representation at a faster rate than other words.

Both of the frequency effects I have just described imply that it is actual tokens of use that are stored in memory, and not smaller units such as bound morphemes, which do not occur as tokens of use. I have argued in Bybee (1985) that the internal structure of words is derivable from sets of connections made between words that have related parts. Affixes and roots or stems have no separate representation, but exist only as relations of similarity among words. (See Figure 1.)

Recurrent patterns such as those shown in Figure 1 are the emergent generalizations or schemas that can be used to produce new combinations. It has been shown for morphology that one important determinant of productivity is the type frequency of a pattern: that is, the greater the number of distinct stems a pattern applies to, the greater is the likelihood that it will apply to new items (MacWhinney 1978, Bybee 1985, 1995, Lobben 1991). Thus some schemas will be very strong and easily accessible for applying to a great many items, and others will be much less so.

The model described here, which I will call the Network Model, is highly redundant, since the same string of features, that is, the same morpheme or word, can occur in many different combinations. Redundant representation in this model does not entail that any valid generalizations are being missed: it is, of course, an empirical question what type of generalizations native speakers make, but all of these can be captured in schemas which can be formulated in varying degrees of abstraction. Moreover, redundant representation does not entail that all the potential words of a language exist in storage. Not all words have the same status: those that are used frequently have very strong representations, or even possibly multiple representations, but those that are used infrequently, or are potential but have not been used, may have no representation.

[Figure 1 is not available in this web-page version.]

3. Storage and processing of larger units

In the following I will extend the model of morphology that I have just reviewed to larger units, arguing that evidence for it is also found in the behavior of phrases and even syntactic constructions. My proposal is that memory for language consists of a large store of processing units of varying sizes (from word to phrase or even clause) with varying degrees of strength, productivity and connection with other processing units. Moreover, I view linguistic knowledge as largely procedural knowledge built up through exposure to production and perception and deeply embedded in the motor, physical and perceptual domains (Anderson 1993, Boyland 1997).
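Before turning to the evidence, the proposal can be made concrete with a small sketch (a toy illustration under simplifying assumptions of my own; the class, the chunk limit and the threshold are invented for the example and are not part of the model itself). Every token of use strengthens each chunk it contains, so that recurrent words and phrases of any size emerge as strong stored units, while merely potential combinations remain weak or unrepresented:

```python
from collections import defaultdict

class UsageStore:
    """Toy usage-based store: units of any size (words or phrases)
    gain lexical strength with each token of use."""

    def __init__(self):
        self.strength = defaultdict(float)   # unit -> lexical strength

    def hear(self, utterance, max_chunk=3):
        """Register one token of use: every contiguous chunk of the
        utterance, up to max_chunk words, is strengthened a little."""
        words = utterance.split()
        for size in range(1, max_chunk + 1):
            for i in range(len(words) - size + 1):
                self.strength[" ".join(words[i:i + size])] += 1.0

    def units(self, threshold=3.0):
        """Units strong enough to count as stored, whatever their size."""
        return {u for u, s in self.strength.items() if s >= threshold}

store = UsageStore()
for _ in range(5):
    store.hear("i don't know")     # a frequent phrase of experience
store.hear("i don't care")         # a rare one
# Frequent chunks such as "i don't" and "i don't know" emerge as
# stored units; "i don't care" stays below threshold unless it recurs.
print(sorted(store.units()))
```

On such a view, a unit like I don't know owes its stored status to nothing more than its recurrence in experience.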
The propositional knowledge we often associate with linguistic form (the content of the traditional lexicon) is emergent from the categorization of our procedural knowledge of language. To argue for this view, I will consider some cases in which evidence exists for memory storage of strings consisting of more than one word, in particular, idioms, high frequency phrases and grammaticizing constructions.

3.1. Idioms

It is not controversial to claim that idioms are instances of multi-word sequences that are stored in memory. It is interesting, however, to observe the similarities of idioms to multi-morphemic units that are lexically stored, and to further observe that the network model just presented solves some of the problems in the analysis of idioms.

First observe that idioms often contain conservative lexical items and conservative grammatical usage. Examples of words that appear only in idioms are familiar from the literature: bated occurs only in wait with bated breath, dint occurs only in by dint of, hale occurs only in hale and hearty. Each of these phrases preserves a word that has been lost elsewhere from the language. This preservation is fitting evidence that the whole phrase is a storage item in memory.

Such cases are parallel to the morphological cases in which otherwise obsolete affixes are locked into certain formations, as in the old -er plural marker which occurs as the r in children. In inflectional forms, such items occur only in very high frequency formations; in derivation, extreme high frequency does not seem to be a prerequisite for the preservation of otherwise non-productive or obsolete affixes: in the words handsome, steadfast and piecemeal, the second half was formerly a derivational affix. Similarly, idioms do not have to be of especially high frequency to preserve archaic words. A check of Francis and Kučera (1982) revealed no occurrences of bated, dint or hale, suggesting that the phrases they occur in are not frequently used.

Idioms and set phrases can also preserve obsolete grammar. Observe the SOV word order in the phrase with this ring I thee wed, or, as Hopper points out, the older sense of the indefinite article to mean 'one and the same' in birds of a feather or its use to mean 'one' in a penny saved is a penny earned. Thus evidence that idioms, like irregular morphological formations, are stored as units is the fact that they can preserve otherwise obsolete lexical items as well as archaic grammar.

Just because idioms are remembered as wholes does not mean that their component parts and the semantic contribution they make are not recognized. Nunberg, Sag and Wasow (1994) have pointed out that even though a phrase such as pull strings, as in John was able to pull strings to get the job, has a meaning that is different from the literal combination of its parts, speakers recognize the two words in the phrase as the same as those occurring in other combinations, and even recognize the semantic contribution of these words. Nunberg et al. point out that strings can be thought of metaphorically as connections with certain (influential) people and pull can refer to the exploitation or the exertion of pressure on these strings, a verb-object relation. Such an idiom (or 'idiomatically combining expression', as Nunberg et al. call it) is not frozen and unanalyzed, but connected both lexically and grammatically to other expressions in English.
Such connections can be diagrammed as in Figure 2.

[Figure 2 is not available in this web-page version.]

3.2. Frequent phrases

It is also well known that high frequency phrases change phonetically and semantically in a way that suggests storage as a single unit. In a recent study of the reduction of don't in English conversation, Bybee and Scheibman (to appear) and Scheibman (1997) found that the reduction of the [o] vowel of don't to schwa occurs only in the contexts in which don't most frequently occurs, i.e. after I and before certain verbs. Scheibman also shows that the use of I don't know in conversation serves certain pragmatic purposes and does not literally mean I - don't - know. Both the phonetic change and the functional change suggest that I don't know (reduced to [aɪɾə̃o], i.e. I dunno) is a storage and processing unit. Similarly, I don't think and why don't you are other phrases using don't that are phonetically reduced and have a meaning that is not the literal combination of their parts.

This means that don't, which people do still identify as a word, occurs in at least three combinations in storage, the phrases I don't know, I don't think, and why don't you (verb). But we can't stop there, because don't also occurs with its vowel reduced to schwa in other phrases, such as I don't mean, I don't care, and I don't have, suggesting that these phrases are also storage and processing units. In fact, since don't in our corpus only occurs in the reduced form with schwa after I, it might be that I don't is also a processing unit, as other combinations of pronoun with auxiliary undoubtedly are, for instance, I'm, I'd, he's, she's, we've and other contracted forms (Krug 1998). In fact, in a count of 414 tokens of I in a conversational corpus, the two most frequent items to follow I were 'm and don't, each one accounting for more than 10% of the tokens. The efficiency of processing I'm and I don't as units justifies their representation in storage, but it does not rule out the possibility that the sequences I am and I don't can be compiled from individual units. In fact, such on-line concatenation probably occurs in the production of these units in their full, unreduced forms. This means that I, don't, am, I don't, I'm, I don't know and many other phrases are stored in memory and are accessible in production and comprehension. (See Figure 3.)

3.3. Grammaticizing constructions

In its reduction and change in meaning, the phrase I don't know resembles a grammaticizing construction: it is undergoing both the phonological reduction and the functional change that are typical of grammaticization. Boyland (1997) argues that grammaticization is the automatization of a processing unit. The frequency increase characteristic of grammaticizing constructions and the changes conditioned by frequency point to grammaticizing constructions as storage and processing units. Indeed, one fact that was often overlooked earlier in grammaticization studies is that grammaticization takes place in particular constructions. On a synchronic level it is a fact that grammatical morphemes are always associated with particular constructions. There is no reason to suppose that grammatical morphemes have any independent existence, since they always occur in and derive their meaning from a specific construction.
Thus grammaticization occurs when a new construction or a specific instance of an old construction becomes a processing and storage unit.

[Figure 3 is not available in this web-page version.]

For instance, it is often said that English has a go-future, or, from a diachronic perspective, that a movement verb has become a future marker in English. But in fact, it is not go alone that has grammaticized. As is often pointed out, the construction involves the Present Progressive of go plus the goal-marker to, and, just as important, a subject and a main verb. Thus be going to or gonna (with the meaning of future) exists only in the construction seen in (1):
(1) [Subject + BE] gonna [Verb]
This sequence is not just a construction, but is a processing unit that produces a clause when specific lexical material is supplied for the more open slots in the construction.

4. Constructions as processing units

Croft (1995) analyzes almost 2000 previously-coded intonation units from Chafe's pear stories and finds that in 97% of the cases intonation units are grammatical units, the most frequent of these being simple clauses and the next most frequent noun phrases. Croft proposes the Intonation Unit Storage Hypothesis:

    The constructions that are stored or precompiled are the grammatical units that (normally) occur in a single intonation unit. (p. 872)

He goes on to say that "singly embedded NPs/PPs are almost certainly stored / precompiled constructions. Syntactically, they are relatively simple structures, almost never break across IUs and are fairly frequent" (872). Croft's usage-based approach makes the very plausible suggestion that "the units employed for spoken communication are basically the units stored as constructions in the mind" (872-3). (See also Ono and Thompson 1995 for a similar claim.)

Similarly, Hopper (1987) and Langacker (1987:35-36) note the highly formulaic nature of actual speech and the frequent occurrence of certain phrases such as the problem is..., you take..., a little bit, living in a fantasy world, one question after another. Hopper argues that the systematicity of linguistic systems is due to the "lateral associations of real utterances". That is, real pieces of speech are stored, sorted, and categorized for both phonological and functional similarity, but "they do not, however, merge into the kind of uniform grammar which would lead one to posit a uniform mental representation to subtend them" (p. 147).

Constructions are not, of course, set phrases, but rather abstractions that range over many specific phrases. The slots in a construction are of varying degrees of generality, and the different parts of constructions are productive to varying degrees. For example, the way-construction, as in we made our way home (Kemmer 1994, Goldberg 1995), has four parts, each having a different range of possible lexical or grammatical material:
(2) [verb] [possessive pronoun] way ([directional adverb])
Each of these positions may contain the following material (examples from Kemmer 1994):
1. The verb is restricted to one which signifies motion, manner, path creation, or means: went his way, swiggled his way, made our way, cut their way.
2. The possessive pronoun is a closed grammatical class.
3. way is a unique lexical item which is not replaceable.
4. The adverbial is usually a prepositional phrase, although it may include other elements: tooting its way through London, singing its way down from the heights.

In fact, most constructions contain some specific lexical or grammatical material. Here are some of the constructions Ono and Thompson (1995) extracted from seven 5-10 minute conversations (a small schematic sketch of such partially filled constructions follows the list in (3)):
(3)  construction                        example
     like to Verbal Expression           I'd like to have
     to have NP V-ed                     to have my lungs replaced
     NP replaced with NP                 replaced with asbestos
     NP or something                     asbestos or something
     NP do a N on NP                     they did a post mortem on her
     NP say CLAUSE                       the doctor said there wasn't any
     get off ONE's NP (rear body part)   get off my ass
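The following sketch is purely illustrative (the slot labels and the membership lists are my own stand-ins, not an analysis of English): it renders a partially filled construction as a template mixing fixed material with open slots of restricted membership, modeled on the way-construction in (2):

```python
# Toy rendering of a construction as a template of fixed items and
# constrained open slots. The category lists are invented stand-ins
# for the much richer classes discussed in the text.

MOTION_MANNER_VERBS = {"made", "went", "cut", "elbowed", "wended"}
POSSESSIVES = {"my", "your", "his", "her", "its", "our", "their"}

def way_construction(verb, possessive, directional=None):
    """Fill the slots of: [verb] [possessive pronoun] way ([directional])."""
    if verb not in MOTION_MANNER_VERBS:
        raise ValueError(f"{verb!r} does not fit the verb slot")
    if possessive not in POSSESSIVES:
        raise ValueError(f"{possessive!r} does not fit the possessive slot")
    parts = [verb, possessive, "way"]    # 'way' is fixed lexical material
    if directional is not None:
        parts.append(directional)        # the directional slot is optional
    return " ".join(parts)

print(way_construction("made", "our", "home"))    # made our way home
print(way_construction("elbowed", "his", "through the crowd"))
```

The mix of one fixed word, one closed grammatical class, one semantically restricted open class and one optional phrase mirrors the varying degrees of generality of the four positions listed above.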
The point to note in these examples is that very specific lexical and grammatical material is essential to the construction. In addition, note that certain tokens of constructions with specific lexical material are clearly stored units, due to their frequency. For instance, the prototype make one's way could be a stored unit, in addition to wend one's way, which must be stored as a unit because wend does not occur in other constructions.

Ono and Thompson also propose that the following very generalized constructional schemas exist for producing clauses in English:
(4)  NP V NP
     NP V NP PP
     NP V NP PP PP
Such highly generalized constructional schemas would arise from the representations of much more specific stored units. In my own view, the evidence for the more specific schemas is much stronger than for the very abstract ones proposed in (4).

The fact that lexical and grammatical material is embedded in the construction in which it occurs explains how lexical and grammatical splits occur. For instance, what was originally a single verb have is now used in many different constructions and phrases, indicating that it has multiple representations. Phonological evidence suggests that the more lexical uses of have in phrases have split from the auxiliary have of the Perfect construction, since the latter contracts with the subject (I've done it) while the former does not (*I've ten dollars).

These facts all point to the need to investigate the hypothesis that production involves the accessing, concatenation, and overlapping of distinct stretches of speech that have been formed into processing units by repetition. A possible further argument for this hypothesis arises from another observation made by Croft (1995) and others who have studied intonation units. Croft observes that consecutive parallel structures are in different intonation units; that is, conjoined clauses, conjoined verb phrases and noun phrases in apposition are almost always in different intonation units. A possible interpretation of this finding would follow from the proposal that constructions are processing units; in that case, each use of the same construction would require beginning a new processing unit. Embedded clauses, then, would occur in the same construction as the main clause, since no intonation break necessarily occurs between the main and subordinate clause. Clearly, the relations among intonation units, processing units and constructions present a potentially fruitful area of investigation.

5. The intertwining of 'lexicon' and 'grammar'

Many arguments have been developed over the last few decades for a strong interdependence between lexicon and grammar. Langacker (1987) takes the position that there is no discrete cut-off point between lexicon and grammar, as do many other linguists. The basic observation is that lexical items, in particular verbs, must contain a lot of information about the syntactic contexts in which they occur (e.g. Levin 1993, Goldberg 1995). From the other perspective, it is also the case that syntactic constructions often contain very specific lexical material (as shown above) or semantically well-defined classes of lexical items (as shown in Goldberg [1995] and other works on construction grammar). Rather than reviewing these arguments here, I would like to add two other pieces of evidence that support the hypothesis that lexicon and grammar are not separate--psycholinguistic and sociolinguistic evidence of syntactic priming and phonological evidence of alternations that apply across word boundaries.

5.1. Priming of constructions

A well-known effect in lexical access is the priming effect. In word recognition tasks, the second instance of the same word is accessed much more quickly by a subject than the first instance. The general explanation for this phenomenon is that words are activated as they are used and their activation remains high for a short period after use, making them easier to access during that period.
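The dynamics behind this explanation can be made concrete in a short sketch (a toy model; the constants and the decay rate are arbitrary choices of mine, meant only to display the logic): each access boosts an item's activation, the boost decays with time, and access is faster while activation remains high:

```python
import math

class Item:
    """Toy activation dynamics for a word or construction: use boosts
    activation, which decays back toward a resting level determined by
    overall frequency (all constants are arbitrary)."""

    def __init__(self, name, frequency=1.0):
        self.name = name
        self.resting = math.log(1.0 + frequency)  # frequent items rest higher
        self.boost = 0.0

    def access(self):
        self.boost += 1.0                 # recent use raises activation...

    def tick(self):
        self.boost *= 0.5                 # ...but the boost decays quickly

    def access_time(self):
        return 1.0 / (self.resting + self.boost + 0.1)

passive = Item("agentless passive", frequency=2.0)
print(f"first use:  {passive.access_time():.2f}")
passive.access()                          # processed once, in speech or hearing
print(f"just after: {passive.access_time():.2f}")   # primed: faster access
for _ in range(5):
    passive.tick()                        # time passes, activation decays
print(f"later:      {passive.access_time():.2f}")   # back near baseline
```

The same dynamics apply whether the item is a word or a construction, which is exactly the point of the evidence reviewed next.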
An important argument for viewing words and constructions as stored in memory and accessed in similar ways is the fact that constructions also show the priming effect (Branigan, Pickering, Liversedge, Stewart and Urbach 1995). Bock (1986) presented subjects with one of the sentences in (5), which they were asked to repeat. Then they were shown pictures that could be described using either a prepositional-object or a double-object construction. For example, the picture might show a girl handing a paintbrush to a man. Bock found that subjects tended to produce a sentence with the same structure as the prime.

(5)  a. The rock star sold some cocaine to an undercover agent.
     b. The rock star sold an undercover agent some cocaine.

Since other experiments have shown that priming occurs between comprehension and production, Branigan et al. conclude that both production and comprehension access the syntactic information associated with the construction, thereby activating it, much as a lexical item is activated in comprehension and production.

Interestingly, this sort of priming effect has been demonstrated to occur in natural discourse as well. Many sociolinguistic studies of variation have turned up the interesting fact that one of the best predictors of the use of a construction is its use in immediately preceding discourse. For instance, Poplack and Tagliamonte (1996) find that in Nigerian Pidgin English, where all tense/aspect forms are optional, the strongest predictor of the occurrence of a tense/aspect marker with a verb is an immediately preceding verb with the same marker.

Weiner and Labov (1983) studied the use of the agentless passive in American English conversation, looking at all the sociolinguistic and pragmatic factors that might influence its use. Their data showed that one of the most powerful factors in the choice of the agentless passive was its use in one of the five preceding clauses, and this factor was independent of pragmatic factors such as given versus new information. These findings are consistent with the idea that access to a construction in comprehension or production activates that construction and makes it available for subsequent use.

Tannen (1989, among others) discusses the pervasive use of repetition in spontaneous conversation. She sees an advantage in the use of repetition for both production and comprehension. She argues that "repetition enables a speaker to produce language in a more efficient, less energy-draining way. It facilitates the production of more language, more fluently" (p. 48). She does not mention the psycholinguistic factor behind this facilitation, but clearly it is priming. It is simply easier to access words, phrases and constructions that have recently been accessed. Her argument from the point of view of the hearer is similar: repetitions and variations (inserting words into just-used constructions) "facilitate comprehension by providing semantically less dense discourse" (p. 49). Again, this facilitation is due to the priming effect: recently activated material is easier to access again.

5.2. Phonological effects within constructions

It is well known that most phonological or morpho-phonological alternations occur within words. In particular, most frozen or non-productive alternations (such as Vowel Shift, as in divine~divinity, or Velar Softening, as in critic~criticize) occur at the word level.
That is why the few cases of frozen alternations that appear to be conditioned across word boundaries have attracted so much attention. Much research has been directed toward the syntax/phonology interface with an eye to discovering which syntactic configurations condition phonological alternations. Of course, some alternations of this sort are rather easy to describe, for instance, the alternation in the indefinite article in English, while others, such as the reduction of don't or contraction in English, are much more difficult.

Another way of viewing the alternations that appear to take place across word boundaries is to suppose that the units in which such alternations occur are not composed of separate lexical units, but rather are themselves unitary with respect to memory storage (Bybee 1998a). That is, as I mentioned earlier, the reduction of don't occurs only in fixed high frequency phrases involving don't. The persistence of the alternation in a/an suggests that this small grammatical morpheme is not an independent lexical unit, but rather a part of a construction: a/an (Adj) Noun. Many stored units contain the indefinite article (an apple, an hour, an instant, a moment, a day, a friend), and a more schematic construction may also exist which includes the phonological information about the variants of a/an.

Others have also argued that the non-automatic rules of phonology that appear to refer to syntax occur only in precompiled phrases (Hayes 1990, Zwicky 1987). As evidence we note that by far the most common cases of allegedly syntax-sensitive alternations involve particular grammatical morphemes. Since grammatical morphemes exist only in constructions, these alternations exist within constructions, that is, storage units, just as word-level alternations exist within storage units. Probably the most famous case of alternations conditioned across word boundaries is the complex set of alternations in French liaison (Tranel 1981). All of these alternations involve grammatical morphemes in specific constructions: (6) demonstrates liaison codified in the orthography in questions; (7) shows that the final consonant of certain prepositions appears when the following word begins with a vowel; (8) shows the appearance of [z] in articles and the plural marker for nouns and adjectives when the following word is vowel-initial. The liaison contexts which involve classes of lexical items affect only small closed classes of adjectives (those that can occur pre-nominally) and a few adverbs (see [9]).
(6)  chante-t-il? [ʃɑ̃til] 'does he sing?'
     cf. il chante [il ʃɑ̃t] 'he sings'
     allons-y [alɔ̃zi] 'let's go'
     cf. allons [alɔ̃] '(we) go'

(7)  prepositions:
     dans un mois [dɑ̃zɛ̃mwa] 'in a month'
     cf. dans trois mois [dɑ̃tʁwamwa] 'within three months'
     pendant un mois [pɑ̃dɑ̃tɛ̃mwa] 'for a month'
     cf. pendant trois mois [pɑ̃dɑ̃tʁwamwa] 'for three months'

(8)  plural of articles, adjectives and nouns:
     les enfants [lɛzɑ̃fɑ̃] 'the children'
     les petits animaux [lɛptizanimo] 'the little animals'

(9)  small closed classes, such as the pre-nominal adjectives (petit, grande, etc.) and adverbs: assez 'enough', trop 'too', etc.
It is my view, then, that fixed 'lexical' alternations occur only within storage and processing units and could not be maintained if they were indeed applying across boundaries between processing units. Also, the more phonetic, variable changes that occur within phrases, such as the reduction of don't discussed above, or the palatalization in phrases such as did you, occur only in frequent phrases. Both types of phonological alternations that appear to be conditioned across traditional word boundaries actually provide evidence for the size and nature of processing units, and point to constructions rather than words as the minimal unit for storage and processing. The traditional studies of such cases, under the rubric of the 'phonology-syntax interface', have assumed that the proposed syntactic structure is correct and the phonology is in some way aberrant, not following established syntactic conventions in some cases. However, I would propose that the phonology is providing a more direct indication of the true nature of the processing units, and thus the constituents, involved.

6. Native-like selection and native-like fluency

Pawley and Syder (1983) discuss the fact that the rules of grammar generate many more utterances than would be considered idiomatic or native-like (see also Pawley 1986, Langacker 1987:35-36). For example, they suggest that instead of I want to marry you, one could grammatically utter any of the following, though the effect would be decidedly non-idiomatic:
(10)  I wish to be wedded to you.
      I desire you to become married to me.
      Your marrying me is desired by me.
Rather than exploiting the full range of grammatical possibilities, we have certain standardized or conventionalized ways of saying things, not just for specific contexts, such as telling time, where we say it's twenty to/til six rather than it's six less twenty, but in almost every context and almost every utterance we produce.

We all know that in learning a language, learning the vocabulary and rules of grammar is not sufficient preparation for actually using the language for everyday activities. To sound like a native, one must learn a large set of stock phrases. Pawley and Syder argue that actual language production consists of accessing many clauses already, or at least largely, pre-formed. They estimate that the stock of such phrases in the native speaker may range into the hundreds of thousands.

Being able to access such automated sequences is essential for fluency in both the native and the non-native speaker. Actual language use seems to rely on memory much more than on abstract analysis. A significant implication for language acquisition is that along with categorization and generalization as mechanisms for acquisition, there is also an important role for imitation. One must learn, by example, in context, what are the customary ways of formulating one's ideas, requests, questions, and so on. Moreover, the process of automating such sequences requires repetition, the active repetition of production.

None of these points comes as a surprise to anyone who has ever observed a child acquiring a language, nor to anyone who has ever tried to learn or teach a language. However, they are points that have up to now, with the notable exception of the work of Pawley and Syder which I have just cited, been regarded as having practical but not theoretical consequences. Recognizing their general theoretical significance may result in a greater focus on their practical consequences. Clearly, the question for both general theory and acquisition theory is how the particular conventionalized phrases and clauses give rise to the abstract generalizations that allow the production of more novel utterances.

7. Conclusions

Clearly, all the words, phrases and utterances of a person's experience are not separately stored in memory! The brain is a powerful categorization device for the efficient sorting and storing of the pieces of our experience, including the units of language use. One type of efficiency is achieved by storing and processing larger chunks rather than smaller ones. It is apparently easier to access, produce and comprehend a precompiled chunk than to assemble it part by part for production (see Anderson 1993, Boyland 1997). However, this task would be beyond even our powerful brains were we not also able to categorize these chunks and generalize over them.

Linguistic knowledge is not just propositional or representational knowledge. A large portion of the stored knowledge that makes language possible is procedural knowledge. Stored chunks are procedural chunks, embedded in context not just cognitively and socially, but also embedded physically in the production and comprehension systems along whose paths they run, and also physically in the articulatory gestures and the manual gestures that are co-produced with them. Evidence for the procedural nature of linguistic chunks is the fact that they are affected by frequency of use.
If linguistic knowledge were abstract, propositional knowledge, frequency would not be important.

Our understanding of both language structure and language use is enhanced by the recognition that memory for language is highly affected by language use. The memory representation of language consists of units that can constitute utterances or intonation units, that is, not just words, but also phrases and constructions. The smaller units familiar from structural analysis--stem morphemes, grammatical morphemes--are not independent units, but rather emerge from these larger stored units via a network of connections among them.
References
Anderson, John R. 1993. Rules of the mind. Hillsdale, NJ: Erlbaum.
Bock, J. K. 1986. Syntactic persistence in language production. Cognitive Psychology 18.355-387.
Boyland, Joyce Tang. 1997. Morphosyntactic change in progress: A psycholinguistic approach. Berkeley: University of California doctoral dissertation.
Branigan, Holly P., Martin J. Pickering, Simon P. Liversedge, Andrew J. Stewart and Thomas P. Urbach. 1995. Syntactic priming: investigating the mental representation of language. Journal of Psycholinguistic Research 24.489-506.
Bybee, Joan L. 1985. Morphology: A study of the relation between meaning and form. Philadelphia: Benjamins.
-----. 1988. Morphology as lexical organization. Theoretical morphology, ed. by M. Hammond and M. Noonan, 119-141. San Diego: Academic Press.
-----. 1995. Regular morphology and the lexicon. Language and Cognitive Processes 10(5).425-455.
-----. 1998a. Lexicalization of sound change and alternating environments. Laboratory Phonology V: Language acquisition and the lexicon, ed. by M. Broe and J. Pierrehumbert. Cambridge: Cambridge University Press.
-----. 1998b. The phonology of the lexicon: Evidence from lexical diffusion. Usage-based models of language, ed. by M. Barlow and S. Kemmer. Stanford: CSLI.
-----, and Joanne Scheibman. To appear. The effect of usage on degrees of constituency: the reduction of don't in English. Constituency and Discourse, ed. by S. Cumming. Amsterdam: John Benjamins.
Croft, William. 1995. Intonation units and grammatical structure. Linguistics 33.839-882.
Francis, W. Nelson and Henry Kučera. 1982. Frequency analysis of English usage. Boston: Houghton Mifflin.
Goldberg, Adele. 1995. Constructions: A construction grammar approach to argument structure. Chicago: University of Chicago Press.
Hayes, Bruce. 1990. Precompiled phrasal phonology. The phonology-syntax connection, ed. by Sharon Inkelas and Draga Zec, 85-108. Chicago: University of Chicago Press.
Hopper, Paul J. 1987. Emergent grammar. BLS 13.139-157.
Kemmer, Suzanne. 1994. Pattern crystallization in syntactic change. Paper presented at the Symposium on Synchronic and Diachronic Aspects of Grammaticalization, Sophienberg Slot, Rungsted, Denmark, October 9-11, 1994.
Krug, Manfred. 1998. String frequency: a cognitive motivating factor in coalescence, language processing and linguistic change. MS.
Langacker, Ronald. 1987. Foundations of cognitive grammar, Vol. 1: Theoretical prerequisites. Stanford: Stanford University Press.
Levin, Beth. 1993. English verb classes and alternations. Chicago: University of Chicago Press.
Lobben, Marit. 1991. Pluralization of Hausa nouns, viewed from psycholinguistic experiments and child language data. M.Phil. thesis, University of Oslo.
Losiewicz, B. L. 1992. The effect of frequency on linguistic morphology. Austin: University of Texas dissertation.
MacDonald, Maryellen C. 1997. Lexical representations and sentence processing: An introduction. Language and Cognitive Processes 12.121-136.
MacWhinney, B. 1978. The acquisition of morphophonology. Monographs of the Society for Research in Child Development 43, no. 1.
Miller, Joanne. 1994. The internal structure of phonetic categories: A progress report. Cognition 50.271-285.
Nunberg, Geoffrey, Ivan A. Sag, and Thomas Wasow. 1994. Idioms. Language 70.491-538.
Ono, Tsuyoshi and Sandra A. Thompson. 1995. What can conversation tell us about syntax? Alternative Linguistics, ed. by Philip W. Davis, 213-271. Amsterdam and Philadelphia: John Benjamins.
Pawley, Andrew. 1986. Lexicalization. Languages and linguistics: The interdependence of theory, data and application, ed. by Deborah Tannen and James E. Alatis, 98-120 (Georgetown University Round Table on Languages and Linguistics 1985). Washington, DC: Georgetown University Press.
-----, and Frances Hodgetts Syder. 1983. Two puzzles for linguistic theory: Nativelike selection and nativelike fluency. Language and Communication, ed. by Jack C. Richards and Richard W. Schmidt, 191-225. London and New York: Longman.
Phillips, Betty S. 1984. Word frequency and the actuation of sound change. Language 60.320-342.
Poplack, Shana and Sali Tagliamonte. 1996. Nothing in context: variation, grammaticization and past time marking in Nigerian Pidgin English. Changing meanings, changing functions: papers relating to grammaticalization in contact languages, ed. by P. Baker, 71-94. Westminster, U.K.: University of Westminster Press.
Scheibman, Joanne. 1997. I dunno but ... A usage-based account of the phonological reduction of don't in conversation. MS.
Tannen, Deborah. 1989. Talking voices: repetition, dialogue, and imagery in conversational discourse. Cambridge: Cambridge University Press.
Tranel, Bernard. 1981. Concreteness in generative phonology: evidence from French. Berkeley and Los Angeles: University of California Press.
Weiner, E. Judith and William Labov. 1983. Constraints on the agentless passive. Journal of Linguistics 19.29-58.
Zwicky, Arnold. 1987. French prepositions: no peeking. Phonology Yearbook 4.211-227.