The 34th UW/MS Symposium in Computational Linguistics

Microsoft Research and University of Washington

Time: 3:30–5:00 PM, Friday, 11/14/2014

Location: Miller 301 at UW (search for “Miller” on the UW campus map)

Come take advantage of this opportunity to connect with the computational
linguistics community at Microsoft and the University of Washington. This
regular gathering brings together computational linguists from both
institutions to discuss topics in the field in a friendly, informal
atmosphere. We will have two talks (see below), followed by informal
mingling.


Natural Language Semantics by Combining Logical and Distributional Methods using Probabilistic Logic

Raymond J. Mooney,  UT Austin / Microsoft Research

Traditional logical approaches to semantics and newer distributional or
vector space approaches have complementary strengths and weaknesses. We have
developed methods that integrate logical and distributional models by using
a CCG-based parser to produce a detailed logical form for each sentence,
and combining the result with soft inference rules, derived from
distributional semantics, that connect the meanings of component words
and phrases. For recognizing textual entailment (RTE), we use Markov Logic
Networks (MLNs) to combine these representations, and we present results on
standard corpora emphasizing the advantages of combining the logical
structure of sentences with statistical knowledge mined from large corpora.

Raymond J. Mooney is a Professor in the Department of Computer Science at
the University of Texas at Austin, but currently on leave at Microsoft
Research. He received his Ph.D. in 1988 from the University of Illinois at
Urbana/Champaign. He is an author of over 150 published research papers,
primarily in the areas of machine learning and natural language processing.
He was the President of the International Machine Learning Society from
2008-2011, program co-chair for AAAI 2006, general chair for HLT-EMNLP
2005, and co-chair for ICML 1990. He is a Fellow of the American
Association for Artificial Intelligence and the Association for Computing
Machinery, and the recipient of best paper awards from AAAI-96, KDD-04,
ICML-05 and ACL-07.


Combined Distributional and Logical Semantics

Mike Lewis, UW CSE / Allen Institute for AI

I will describe a new approach to semantics, which combines the benefits of
formal semantics and distributional approaches. Formal semantics offers an
elegant account of composition and logical operators, but typically shows
low recall due to inadequate models of lexical semantics. Conversely,
distributional semantics has been successful in describing the meanings of
content words, but it is unclear how to effectively represent composition
and function words in a vector space. I will introduce a model which
closely follows formal semantics, except that content words are represented
with distributional cluster-identifiers. I will show that it is capable of
complex multi-sentence first-order inferences while improving
performance on a question-answering task. I will then describe a
semi-supervised extension for building a richer lexical semantics.
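The key move in the abstract, replacing content words with distributional cluster identifiers inside an otherwise standard logical form, can be sketched as follows. This is an illustrative toy, not the speaker's model: the cluster assignments are hand-picked here, whereas in practice they would come from clustering distributional vectors.

```python
# Hand-picked cluster identifiers (assumed for illustration): synonymous
# content words map to the same symbol, so entailment between these simple
# atomic formulas reduces to ordinary symbol equality.
CLUSTERS = {
    "author": "rel:42", "writer": "rel:42",   # same relation cluster
    "book": "ent:7", "novel": "ent:7",        # same entity cluster
}

def to_logical_form(pred, arg1, arg2):
    """Keep the logical structure; swap content words for cluster IDs.
    Words outside the cluster map (e.g. proper names) pass through."""
    return (CLUSTERS.get(pred, pred),
            CLUSTERS.get(arg1, arg1),
            CLUSTERS.get(arg2, arg2))

def entails(premise, hypothesis):
    """For single atomic formulas, cluster-identifier predicates make
    entailment checking a plain equality test."""
    return to_logical_form(*premise) == to_logical_form(*hypothesis)

print(entails(("author", "shakespeare", "book"),
              ("writer", "shakespeare", "novel")))  # True: same clusters
```

Composition and function words are handled by the formal-semantics machinery as usual; only the content-word symbols are distributional.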

Mike Lewis is a postdoc at the University of Washington and Allen AI,
working with Luke Zettlemoyer and Oren Etzioni. Previously, he completed a
PhD at the University of Edinburgh, supervised by Mark Steedman, and has a
Masters degree from Oxford University. He is interested in wide coverage
semantic and syntactic parsing, particularly in methods combining
Combinatory Categorial Grammar with unsupervised or semi-supervised learning.

