In Defense of Contextual Vocabulary Acquisition: How to Do Things with Words in Context

William J. Rapaport
Department of Computer Science & Engineering, Department of Philosophy, and Center for Cognitive Science
State University of New York at Buffalo, Buffalo, NY 14260
http://www.cse.buffalo.edu/~rapaport/CVA/

Contextual Vocabulary Acquisition
CVA = active, deliberate acquisition of a meaning for a word in a text by reasoning from "context"
"textual context" = the surrounding words; the co-text

"context" = "wide context" = internalized co-text:
the reader's interpretive mental model of the textual co-text, integrated via belief revision (sketched below):
  infer new beliefs from the internalized co-text + prior knowledge
  remove beliefs inconsistent with the reader's prior knowledge
"prior knowledge" includes language knowledge and previous hypotheses about the word's meaning,
but does not include external sources (dictionary, humans).
So the "context" for CVA is in the reader's mind, not in the text.
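To make the belief-revision step concrete, here is a minimal Python sketch (my own illustration, not the SNePS machinery the project actually uses); the function and its `contradicts` / `is_word_hypothesis` parameters are assumptions made only for this example.

```python
# Minimal sketch of "integration via belief revision" (illustrative only).
# New beliefs inferred from the text are added; on a contradiction, hypotheses
# about the unknown word are retracted before prior knowledge is.

def integrate(beliefs, new_belief, contradicts, is_word_hypothesis):
    """Add new_belief to the belief set, resolving contradictions."""
    conflicts = {b for b in beliefs if contradicts(b, new_belief)}
    if any(not is_word_hypothesis(b) for b in conflicts):
        # The new belief clashes with prior knowledge: reject it.
        return set(beliefs)
    # Otherwise retract any conflicting word-meaning hypotheses and adopt it.
    return (set(beliefs) - conflicts) | {new_belief}
```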

Overview

CVA project:
1. a computational theory of how to figure out (compute) a meaning for an unfamiliar word from wide context
2. convert the algorithms to a teachable curriculum
   given the # of words a high-school grad knows (~45K) & the # of years to learn them (~18) = ~2.5K words/year,
   but only ~10% are taught in 12 school years (see the quick check below)

Current status:
Have a theory
Have a computational implementation
Know that people do incidental CVA
Possibly the best explanation of how we learn vocabulary
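A quick back-of-the-envelope check of the word-count figures under item 2 above (a sketch using only the slide's own estimates):

```python
# Back-of-the-envelope check of the slide's word-count estimates.
vocab_known = 45_000            # words a typical high-school graduate knows
years_to_learn = 18             # years available to learn them
print(vocab_known / years_to_learn)        # ~2,500 words learned per year

taught_fraction = 0.10          # only ~10% are taught in 12 school years
taught = vocab_known * taught_fraction
print(taught, vocab_known - taught)        # ~4,500 taught; ~40,500 acquired some other way (e.g., CVA)
```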

2 groups of researchers say CVA can't be done (well).
This talk: why they're wrong.

What Does "Brachet" Mean?
(From Malory's Morte D'Arthur [page # in brackets])
1. There came a white hart running into the hall with a white brachet next to him, and thirty couples of black hounds came running after them. [66]
2. As the hart went by the sideboard, the white brachet bit him. [66]
3. The knight arose, took up the brachet and rode away with the brachet. [66]
4. A lady came in and cried aloud to King Arthur, "Sire, the brachet is mine." [66]
10. There was the white brachet which bayed at him fast. [72]
18. The hart lay dead; a brachet was biting on his throat, and other hounds came behind. [86]

Computational CVA
Based on Karen Ehrlich's CS Ph.D. dissertation (1995)
Implemented in the SNePS KRRA system
KB: SNePS representation of the reader's prior knowledge
I/P: SNePS representation of the word & its co-text
Processing: inferences drawn / belief revision during text input; simulates reading

N & V definition algorithms deductively search this belief-revised, integrated KB (the context) for definitional information O/P: Definition frame slots (features): classes, structure, actions, properties, etc. fillers (values): info gleaned from context (= integrated KB) A hart runs into King Arthurs hall. A white brachet is next to the hart. The brachet bites the harts buttock. The knight picks up the brachet. The knight carries the brachet. The lady says that she wants the brachet. The brachet bays at Sir Tor. [background knowledge: only hunting dogs bay]

--> (defineNoun "brachet")
Definition of brachet:
  Class Inclusions: hound, dog
  Possible Actions: bite buttock, bay, hunt
  Possible Properties: valuable, small, white
I.e., a brachet is a hound (a kind of dog) that can bite, bay, and hunt, and that may be valuable, small, and white.

General Comments
System's behavior ≈ human protocols
System's definition ≈ OED's definition: "A brachet is a kind of hound which hunts by scent"
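To make the slot-and-filler idea concrete, here is a toy Python sketch of a defineNoun-style search over an integrated KB. It is my own illustration, not the actual SNePS implementation; the predicate names (isa, does, has_property) and the tuple representation are assumptions made purely for this example.

```python
# Toy sketch of a defineNoun-style search (illustrative only; the real system
# deductively searches a SNePS knowledge base, not Python tuples).
from collections import defaultdict

# A tiny "integrated KB": beliefs from the co-text plus prior knowledge.
kb = [
    ("isa", "brachet", "hound"),             # inferred: it bays, & only hunting dogs bay
    ("isa", "hound", "dog"),                 # prior knowledge
    ("does", "brachet", "bite buttock"),
    ("does", "brachet", "bay"),
    ("does", "brachet", "hunt"),
    ("has_property", "brachet", "white"),
    ("has_property", "brachet", "small"),    # it can be picked up and carried
    ("has_property", "brachet", "valuable"), # the lady wants it back
]

SLOTS = {"isa": "Class Inclusions", "does": "Possible Actions",
         "has_property": "Possible Properties"}

def define_noun(word, kb):
    """Collect a definition frame (slots -> fillers) for `word` from the KB."""
    frame = defaultdict(list)
    for pred, subj, obj in kb:
        if subj == word:
            frame[SLOTS[pred]].append(obj)
    # Follow class inclusions one level up (brachet -> hound -> dog).
    for cls in list(frame["Class Inclusions"]):
        frame["Class Inclusions"] += [o for p, s, o in kb if p == "isa" and s == cls]
    return dict(frame)

print(define_noun("brachet", kb))
# {'Class Inclusions': ['hound', 'dog'],
#  'Possible Actions': ['bite buttock', 'bay', 'hunt'],
#  'Possible Properties': ['white', 'small', 'valuable']}
```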

I. Are All Contexts Created Equal?
Beck, McKeown, & McCaslin (1983), "Vocabulary Development: Not All Contexts Are Created Equal," Elementary School Journal 83(3): 177-181.
"it is not true that every context is an appropriate or effective instructional means for vocabulary development"

Role of Prior Knowledge
Beck et al.: co-text can give clues to the word's meaning.
But "clue" is relative: clues need other info to be seen as clues.
Implication A1: textual clues need to be supplemented with other information to compute a meaning.

Supplemental info = the reader's prior knowledge:
  has to be accessible to the reader
  will be idiosyncratic
Co-text doesn't suffice; prior knowledge is needed.

Do Words Have Unique, Correct Meanings?
Beck et al. (& others) assume:
A2: A word has a unique meaning.
A3: A word has a correct meaning.
Contra "unique": a word's meaning varies with:
  the co-text
  the reader's (or readers') prior knowledge
  the time of reading
"Correct" is a red herring (in any case, it's fishy):

Possibly, words have author-intended meanings, but these need not be determined by context (textual or wide).
Misunderstandings are universally unavoidable.
Perfect understanding / a dictionary definition is not needed:
  understanding adequate for passage comprehension suffices
  the reader can always revise the definition hypothesis

Beck et al.'s Categories of Textual Contexts
What kinds of co-texts are helpful?
But keep in mind that we have different goals:
  Beck et al.: use co-text to teach correct word meanings
  CCVA: use context to compute a word meaning for understanding

Beck et al.'s Textual Context Categories
Top-Level Kinds of Co-Text
  Pedagogical co-texts: artificially constructed, designed for teaching
  Natural co-texts: not intended to convey the meaning of a word
    4 kinds (actually, a continuum)

Beck et al.'s Textual Context Categories
1. Misdirective (Natural) Co-Texts
"seem to direct student to incorrect meaning for a word"
Sole example: "Sandra had won the dance contest and the audience's cheers brought her to the stage for an encore. 'Every step she takes is so perfect and graceful,' Ginny said grudgingly, as she watched Sandra dance."
[[grudgingly]] =? admiringly
Is this a natural context? Is this all there is to it?
A4: Co-texts have a fixed, usually small size.
  But a larger co-text might add information.
  Prior knowledge can widen the context.
  And "grudgingly" is an adverb!
A5: All words are equally easy to learn.
  But N is easier than V, and V is easier than Adj/Adv! (Granger, Gentner, Gleitman)
A6: Only 1 co-text can be used.
  But later co-texts can assist in refining the meaning.

Beck et al.'s Textual Context Categories
2. Nondirective (Natural) Co-Texts
"of no assistance in directing the reader toward any particular meaning for a word"
Sole example is for an adjective: "Dan heard the door open and wondered who had arrived. He couldn't make out the voices. Then he recognized the lumbering footsteps on the stairs and knew it was Aunt Grace."
But:
  Is it natural?
  What about a larger co-text?
  It's an adjective!
  Of no assistance? (see next slide)

Syntactic Manipulation
Misdirective & nondirective contexts can yield correct information!
Cf. algebraic manipulation (brings x into focus): 2x + 1 = 7, so x = (7 - 1)/2 = 6/2 = 3
Syntactic manipulation (brings the hard word into focus):
  "Every step she takes is so perfect and graceful," Ginny said grudgingly.
  Grudgingly is the way that Ginny said "..."
  So, grudgingly is a way of saying something;
  in particular, grudgingly is a way of (apparently) praising someone's performance.
  "he recognized the lumbering footsteps on the stairs"
  Lumbering is a property of footsteps on stairs.
Generates an initial hypothesis for later refinement.
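The "bring the hard word into focus" move can even be mimicked mechanically. Below is a toy Python sketch (my own illustration, not part of the CVA system); the function name and regex pattern are assumptions made only for this example.

```python
# Toy illustration of syntactic manipulation: rewrite a sentence of the form
# '"<quote>," <Speaker> said <adverb>.' so that the unknown adverb becomes the
# focus, analogously to solving 2x + 1 = 7 for x. (Illustrative only.)
import re

def focus_adverb(sentence, adverb):
    """Rewrite the sentence so `adverb` is what the sentence is about."""
    m = re.search(rf'"(.*)"\s*,?\s*(\w+)\s+said\s+{adverb}', sentence)
    if not m:
        return None
    quote, speaker = m.group(1).rstrip(','), m.group(2)
    return f'{adverb.capitalize()} is the way that {speaker} said "{quote}".'

s = '"Every step she takes is so perfect and graceful," Ginny said grudgingly.'
print(focus_adverb(s, "grudgingly"))
# Grudgingly is the way that Ginny said "Every step she takes is so perfect and graceful".
```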

Beck et al.'s Textual Context Categories
3. General (Natural) Co-Texts
"provide enough information for reader to place word in a general category"
Sole example is for an adjective: "Joe and Stan arrived at the party at 7:00. By 9:30 the evening seemed to drag for Stan. But Joe really seemed to be having a good time at the party. 'I wish I could be as gregarious as he is,' thought Stan."
Same problems, but:
  the adjective is contrasted with Stan's attitude
  contrasts are good (so are parallel constructions)

Beck et al.'s Textual Context Categories
4. Directive (Natural) Co-Texts
"seem likely to lead the student to a specific, correct meaning for a word"
Sole example is for a noun: "When the cat pounced on the dog, he leapt up, yelping, and knocked over a shelf of books. The animals ran past Wendy, tripping her. She cried out and fell to the floor. As the noise and confusion mounted, Mother hollered upstairs, 'What's all the commotion?'"
  Natural? Long! And a noun!
  Note that the sole example of a directive context is a noun, suggesting that it might be the word that makes a context directive.

Beck et al.'s Experiment
Ss were given passages from basal readers (reading textbooks).
Researchers categorized the co-texts & blacked out words.
Ss were asked to fill in the blanks with the missing words or reasonable synonyms.
Results confirm the 4 co-text types.
Independently of the results, there are methodological questions:
  Are basal readers natural contexts?
  How large were the co-texts?
  Was there instruction in how to do CVA?
  A7: CVA comes naturally, so it needs no training.
  A8: Fill-in-the-blank tasks are a form of CVA.
    No, they're not! (see next slide)

Beck et al.'s Experiment: CVA, Neologisms, & Fill-in-the-Blank
Serious methodological problem for all of us:
replacing the word with a blank or a neologism misleads Ss into hunting for the "correct" missing/hidden word, which is not CVA!

Our (imperfect) solution:
  use a plausible-sounding neologism
  tell the S it's like a foreign word with no English equivalent, hence the need for a descriptive phrase

Beck et al.'s Conclusion
"less skilled readers receive little benefit from CVA"
A9: CVA can only help in learning correct meanings.
But: CVA uses the same techniques as general reading comprehension:
  careful, slow reading
  careful analysis of the text
  directed search for information useful for computing a meaning
  application of relevant prior knowledge
  application of reasoning for the purpose of extracting information from the text
So CVA, if properly taught & practiced, can improve general reading comprehension.

II. Are Context Clues Unreliable Predictors of Word Meanings?
Schatz & Baldwin (1986), "Context Clues Are Unreliable Predictors of Word Meanings," Reading Research Quarterly 21(4): 439-453.
"context does not usually provide clues to the meanings of low-frequency words"
"context clues inhibit the correct prediction of word meanings just as often as they facilitate them"

S&B's Argument
A10: CVA is not an efficient mechanism for inferring word meanings.
Because: co-text can't help you figure out the correct meaning of an unfamiliar word.
  (the uniqueness & correctness assumptions again!)
But, we argue: wide context can help you figure out a meaning for an unfamiliar word.
So, context (& CVA) are efficient mechanisms for inferring (better: computing) word meanings.

S&B: context clues should help readers to infer meanings of words without the need for readers to interrupt the reading act(*) with diversions to external sources (*) true for incidental CVA (*) not for deliberate CVA External sources are no solution anyway: Dictionary definitions are just more co-texts! (Schwartz 1988) CVA is base case of recursion, one of whose recursive clauses is: Look it up in a dictionary S&Bs Experiments 25 natural passages from novels words chosen (the only cited examples): Adj/Adv N

S&B's Experiments
25 natural passages from novels
Words chosen (the only cited examples):
  Adj/Adv: ~67%
  N:       ~27%
  V:        ~6%
But:
  What are the actual %s?
  Which lexical categories were hardest?
  How do facilitative/confounding co-texts correlate with lexical category?
  There should have been a representative sample of 4 co-text categories x 3 or 4 lexical categories.

S&B's Experiments: CVA vs. Word-Sense Disambiguation
2 experiments: Ss chose the correct meaning from a list of 5 possible meanings.
  This is WSD, not CVA!
  WSD = multiple choice; CVA = essay question
3rd experiment: real CVA, but interested only in "full denotative meanings or accurate synonyms"
  cf. assumption A3 about correct meanings!

S&B's Experiments: Space & Time Limits
Space limits on the size of the co-text?
  S&B: 3 sentences
  CCVA: start small, work outward
Time limits on the size of the co-text?
  S&B: all students finished in the allotted time
  CCVA: no time limits

S&B's Experiments: Teaching CVA
S&B did not control for Ss' knowledge of how to use context clues.
CCVA: deliberate CVA is a skill:
  it needs to be taught, modeled, & practiced
  there is other (later) evidence that such training works, including in critical-thinking education

S&Bs 3 Questions (answered in the negative) 1. Do context clues occur with sufficient frequency to justify them as a major element of reading instruction? a) Context clues do occur, & teaching them is justified, if augmented by readers prior knowledge & knowledge of CVA skills. 2. Does context usually provide accurate clues to denotations & connotations of low-frequency words? a) CVA can provide clues to revisable hypotheses about unfamiliar words meaning

3. Are difficult words in natural [co-texts] usually amenable to such analysis?
   a) Such words are always amenable to yielding at least some information about their meaning.

Our CVA Theory
1. Every co-text can give some clue to a word's meaning.
2. Co-text clues must be supplemented by the reader's prior knowledge.
   a) The value of a co-text depends on the reader's prior knowledge & ability to integrate them.
3. CVA ≠ fill-in-the-blank; CVA ≠ WSD.
4. Co-text size has no arbitrary limits.
5. May need lots of co-texts before CVA can asymptotically approach a stable meaning.

Our CVA Theory (continued)
6. A word does not have a unique meaning.
7. A word does not have a correct meaning.
   a) A word's "correct" (intended) meaning does not need to be known in order for the reader to understand it in context.
   b) Even familiar/well-known words can acquire new meanings in new contexts.
   c) Neologisms usually are learned from context.
8. Some words are easier to compute meanings for than others (N < V < Adj/Adv).
9. CVA is an efficient method for computing word meanings.
10. CVA can improve general reading comprehension.

Our CVA Theory (continued)
11. CVA can (and should) be taught, using a curriculum based on our algorithms!
