Abstract

Word Meanings as Messy Groups of Usages, and How to Integrate Them into Sentence Meaning Representations, Katrin Erk (University of Texas at Austin)

Textbook examples of lexical ambiguity often have clearly distinct senses, as in the mouth of a person versus the mouth of a river. But in manual word meaning annotation we see many tricky cases that seem to match multiple dictionary senses, or to stretch existing senses. Context-based computational models can describe word meaning in a graded and flexible fashion that seems to fit these tricky cases well: as soft clusters of word usages at different degrees of similarity. These models, which are learned from large amounts of text data, can be viewed as a kind of compressed corpus of speakers and can be probed for the regularities they observe. One interesting thing they pick up on is an influence of larger narrative frames on word meaning in context.
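To make the "soft clusters of usages" idea concrete, here is a minimal sketch, not from the talk itself: it embeds a handful of usages of an illustrative target word with an off-the-shelf contextual encoder and soft-clusters them with a Gaussian mixture, so that each usage receives graded membership in every cluster rather than a single hard sense label. The model name (bert-base-uncased), the example sentences, the target word "mouth", and the cluster count are all assumptions chosen for illustration.

```python
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.mixture import GaussianMixture

# Off-the-shelf contextual encoder (an assumption; any contextual model would do).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

# A few illustrative usages of one target word. A real study would use many
# usages; four are only to keep the sketch short.
usages = [
    "She covered her mouth in surprise.",
    "He burned his mouth on the hot soup.",
    "The village lies at the mouth of the river.",
    "Sediment builds up at the mouth of the delta.",
]
target = "mouth"

def usage_vector(sentence, target_word):
    """Contextual embedding of the target word: mean of its subword vectors."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, dim)
    # Locate the target word's subword span by matching its token ids.
    target_ids = tokenizer(target_word, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(target_ids) + 1):
        if ids[i:i + len(target_ids)] == target_ids:
            span = list(range(i, i + len(target_ids)))
            return hidden[span].mean(dim=0).numpy()
    raise ValueError(f"{target_word!r} not found in {sentence!r}")

# Soft clustering: each usage gets a probability of belonging to each cluster,
# rather than being assigned to exactly one sense.
X = np.stack([usage_vector(s, target) for s in usages])
gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0).fit(X)
for sentence, probs in zip(usages, gmm.predict_proba(X)):
    print(f"{probs.round(2)}  {sentence}")
```

The graded memberships are the point of the sketch: a usage that sits between clusters simply gets intermediate probabilities, which is how the tricky annotation cases above would show up.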

If word meanings come in messy soft clusters that are additionally influenced by contextual modulation, then how can we integrate word meaning representations into sentence meaning representations, which are often stated as logical forms (or logic-like symbolic forms)? I will discuss a first step in this direction, which links word representations to weighted constraints that model interacting context effects on word meaning.
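The abstract does not spell out the constraint model, so the following is only a generic sketch of what "weighted constraints that model interacting context effects" can look like, in a simple log-linear form; the constraint names, weights, and numbers are invented for illustration and are not the talk's actual model.

```python
import numpy as np

# Candidate usage clusters ("senses") for an ambiguous word, here "mouth".
clusters = ["body-part", "river-opening"]

# Each constraint scores how well a cluster fits the current sentence context.
# The numbers are hand-set stand-ins for quantities a real model would compute,
# e.g. similarity to a cluster centroid, predicate compatibility, narrative fit.
constraint_scores = {
    "similarity_to_centroid": np.array([0.9, 0.3]),   # contextual-vector similarity
    "predicate_compatibility": np.array([0.8, 0.1]),  # "burned his mouth" prefers body-part
    "narrative_frame": np.array([0.6, 0.2]),          # surrounding story is about a meal
}

# Weights say how much each interacting context effect matters.
weights = {
    "similarity_to_centroid": 1.0,
    "predicate_compatibility": 2.0,
    "narrative_frame": 0.5,
}

def cluster_distribution(scores, weights):
    """Combine weighted constraints into a distribution over clusters (softmax)."""
    total = sum(weights[name] * scores[name] for name in scores)
    exp = np.exp(total - total.max())
    return exp / exp.sum()

for name, p in zip(clusters, cluster_distribution(constraint_scores, weights)):
    print(f"{name}: {p:.2f}")
```

Because the constraints are weighted rather than hard, several context effects can pull in different directions and still yield a graded outcome over clusters, which is the kind of behavior the messy soft clusters call for when they are linked to a logical form.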