Come connect with the computational linguistics community at Microsoft and the University of Washington. This symposium is a regular opportunity for computational linguists at both institutions to discuss topics in the field in a friendly, informal atmosphere (more info at http://depts.washington.edu/uwcl/msuw/symposium.html). We will have two talks (see below), followed by informal mingling.
Making Reading More Effective: Technologies to Help Information Seekers
Lucy Vanderwende and Sumit Basu, Microsoft Research
Despite the best efforts of video-based courseware, those of us who deal with information on a daily basis won't be freed from the task of reading anytime soon. In this talk, we will describe several efforts directed towards making reading more effective: when users are reading to learn, how can we help them best use their time for this task? We are investigating this area along three axes: improving mastery, improving coverage, and improving engagement. We will focus today on helping the user gain mastery via the "mastery loop," in which the user is assessed on their knowledge and then directed to focus attention where their knowledge is weakest. In particular, we have been working on technologies to automate both question generation and the grading of students' answers. Our goal is to provide these tools for arbitrary topics in order to support both traditional educational scenarios and lifelong learning. We will also give a brief overview of some recent and future work on the other two axes: improving coverage (knowing what you've read and what to read next) and improving engagement (getting people to read more).
Related paper: Lee Becker, Sumit Basu, and Lucy Vanderwende. "Mind the Gap: Learning to Choose Gaps for Question Generation." In Proceedings of NAACL 2012.
A Probabilistic Model of Language Acquisition from Utterance and Meaning
Tom Kwiatkowski, UW CSE
In learning their first language, children must learn the meanings of words and the syntactic mechanism by which these word meanings can be composed to give the meanings of sentences and phrases. The cognitive theory of language acquisition holds that both of these are learnt through the process of mapping the words heard in an utterance onto some contextually afforded interpretation of what that utterance may mean.
In this talk I present a probabilistic model of language acquisition that uses a psycholinguistically plausible learning algorithm to learn word meanings and syntax from child-directed utterances annotated with logical approximations of the context in which they appear. I show that the approach is able to learn an accurate parsing model for the target language even in the face of ambiguous training data. Furthermore, I show that both word meanings and syntactic rules are learnt in a manner that correlates with observations of language learning in children, overcoming criticisms of previous statistical models of language acquisition.
Tom Kwiatkowski is a post-doctoral researcher at the University of Washington, working on building computational systems capable of natural language understanding. He received a PhD from the University of Edinburgh in 2012 for his thesis, "Probabilistic Grammar Induction from Sentences and Structured Meanings."
The University of Washington is committed to providing access, equal opportunity, and reasonable accommodations in its services, programs, activities, education, and employment for individuals with disabilities.
To request disability accommodations, please contact the Office of the ADA Coordinator in advance. 545-6450 (voice); 543-6452 (TDD); email@example.com (e-mail).