I am a professor in the Linguistics and Computer Science departments at the University of Texas at Austin, and part of UT Austin NLP and computational linguistics. My area is computational linguistics, specifically computational semantics.
I am interested in polysemous words: words that have multiple related meanings. For example, you can leave the gym (for now), leave the country (for good), leave a tip, leave your backpack at the train station, leave your pearls to your niece, and the wine can leave a stain. How many different senses of "leave" are there, and by what criteria? How can we characterize the properties and connotations of the different uses? And how does the context in an individual sentence influence and subtly modulate the meaning of the word? I use word embeddings (a.k.a. distributional models, semantic spaces) and contextualized word embeddings to explore word meaning. These are meaning representations that are automatically computed from text data, and they capture subtle regularities and patterns in how people use words. By studying the computational models, I can study those regularities and patterns.
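The core idea behind embeddings as meaning representations can be illustrated with a tiny sketch: each word is a vector, and closeness in the space (usually cosine similarity) approximates relatedness in meaning. The vectors below are made up purely for illustration; real embeddings have hundreds of dimensions and are learned from large corpora.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: the standard measure of closeness in a semantic space."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 4-dimensional vectors standing in for learned embeddings.
# The specific numbers are invented; only the comparison matters.
vectors = {
    "leave":  np.array([0.9, 0.1, 0.4, 0.2]),
    "depart": np.array([0.8, 0.2, 0.5, 0.1]),
    "stain":  np.array([0.1, 0.9, 0.2, 0.7]),
}

print(cosine(vectors["leave"], vectors["depart"]))  # relatively high
print(cosine(vectors["leave"], vectors["stain"]))   # relatively low
```

A contextualized model goes one step further: instead of one vector per word, it produces a different vector for "leave" in each sentence, which is what makes these models useful for studying how context modulates word meaning.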
I would also like to know how flexible, nuanced and graded representations of word meaning (such as the approximations that we get from word embeddings) can be integrated with sentence meaning. What does it mean, from a theoretical perspective, to adopt such a view of word meaning? What does it mean, from a theoretical perspective, to integrate such word meanings with sentence meaning? And what formalisms can be used to express such a combination?
I am also interested in narrative schemas and scripts, typical narratives that show up over and over again. Narrative schemas can be learned from text data, and large language models pick up a lot of narrative schema knowledge. Narrative schemas can be used for inferences. They are also relevant in the context of lexical semantics, as we can ask what kinds of narratives tend to show up in the surroundings of a word, and what that has to say about the word's meaning and connotations.
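One simple way that narrative schemas can be learned from text, in the style of event-chain induction, is to count how often pairs of events share a protagonist: events that frequently co-occur around the same participant are evidence for a common schema. The sketch below uses invented event chains (real work would extract them from parsed corpora) and plain co-occurrence counts rather than an association measure like PMI.

```python
from collections import Counter
from itertools import combinations

# Toy "chains": each is the sequence of (verb, role) events that one
# protagonist participates in. This data is invented for illustration.
chains = [
    [("arrest", "obj"), ("charge", "obj"), ("convict", "obj")],
    [("arrest", "obj"), ("charge", "obj"), ("acquit", "obj")],
    [("arrest", "obj"), ("convict", "obj"), ("sentence", "obj")],
]

# Count event pairs that share a protagonist; frequent pairs suggest
# a schema (here, something like a criminal-trial narrative).
pair_counts = Counter()
for chain in chains:
    for a, b in combinations(sorted(set(chain)), 2):
        pair_counts[(a, b)] += 1

for pair, n in pair_counts.most_common(3):
    print(pair, n)
```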
A current focus is integrating flexible, graded representations of word meaning with representations of sentence meaning, in particular logic-based representations. How can the two be combined at a technical level? And what does such a combination mean for our conception of word meaning and sentence meaning?
Natural Language Learning reading group (join us!)
Fall 2023: by appointment; please email me.
Katrin Erk, Linguistics Department, 4.734 Patton Hall (RLP)
305 E 23rd ST B5100, Austin, TX, USA 78712
email: katrin DOT erk AT utexas DOT edu