Stochastic Attribute-Value Grammars
by Rob Malouf, Miles Osborne
Publisher: ESSLLI 2001
Number of pages: 159
This one-week course will provide an introduction to the maximum entropy principle and the construction of maximum entropy models for natural language processing. Through a combination of lectures and, as local computing facilities permit, hands-on lab exercises, students will investigate the implementation of maximum entropy models for attribute-value grammars, including such topics as ambiguity identification, feature selection, and model training and evaluation.
by Doug Arnold, et al. - Blackwell Pub
This introductory book looks at all aspects of Machine Translation: covering questions of what it is like to use a modern Machine Translation system, through questions about how it is done, to questions of evaluating systems, and more.
by Jon Barwise, John Etchemendy - Center for the Study of Language
The book covers the boolean connectives, formal proof techniques, quantifiers, basic set theory, induction, proofs of soundness and completeness for propositional and predicate logic, and an accessible sketch of Gödel's first incompleteness theorem.
by Roger Levy - University of California, San Diego
A book on using probabilistic models in scientific work on language, ranging from experimental data analysis to corpus work to cognitive modeling. The intended audience is graduate students in linguistics, psychology, and computer science.
by Shuly Wintner - ESSLLI
This text is a mild introduction to Formal Language Theory for students with little or no background in formal systems. The motivation is Natural Language Processing, and the presentation is geared towards NLP applications, with extensive examples.