An Introduction to Stochastic Attribute-Value Grammars
by Rob Malouf, Miles Osborne
Publisher: ESSLLI 2001
Number of pages: 159
This one-week course will provide an introduction to the maximum entropy principle and the construction of maximum entropy models for natural language processing. Through a combination of lectures and, as local computing facilities permit, hands-on lab exercises, students will investigate the implementation of maximum entropy models for attribute-value grammars, including such topics as ambiguity identification, feature selection, and model training and evaluation.
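The maximum entropy models the course covers are log-linear models whose weights are trained so that model feature expectations match those observed in the data. As a rough illustration (not from the course materials; the feature functions, toy data, and training loop below are all invented for this sketch), a minimal maximum entropy classifier trained by gradient ascent might look like:

```python
# Toy maximum entropy (log-linear) classifier trained by gradient ascent.
# Illustrative sketch only; the feature functions and data are invented.
import math

LABELS = ["N", "V"]

def features(x, y):
    """Binary feature functions f_i(x, y): fire when a word property
    co-occurs with a tag. Here x is a word, y is a POS-like tag."""
    return [
        1.0 if x.endswith("ing") and y == "V" else 0.0,
        1.0 if x.endswith("s") and y == "N" else 0.0,
        1.0 if y == "N" else 0.0,  # label bias feature
    ]

def prob(weights, x, y):
    """p(y|x) = exp(w . f(x, y)) / Z(x)"""
    scores = {lab: math.exp(sum(w * f for w, f in zip(weights, features(x, lab))))
              for lab in LABELS}
    return scores[y] / sum(scores.values())

def train(data, steps=200, lr=0.5):
    weights = [0.0] * 3
    for _ in range(steps):
        grad = [0.0] * 3
        for x, y in data:
            obs = features(x, y)
            for i in range(len(weights)):
                # observed feature count minus model expectation
                grad[i] += obs[i] - sum(prob(weights, x, lab) * features(x, lab)[i]
                                        for lab in LABELS)
        weights = [w + lr * g for w, g in zip(weights, grad)]
    return weights

data = [("running", "V"), ("cats", "N"), ("dogs", "N"), ("eating", "V")]
w = train(data)
print(prob(w, "walking", "V"))  # high, since "-ing" was seen with V
```

Because the log-likelihood of a log-linear model is concave, this simple gradient ascent converges; real implementations for attribute-value grammars use far richer feature sets and faster optimizers, but the expectation-matching update is the same idea.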
by Daniël de Kok, Harm Brouwer
We cover many of the techniques that computational linguists use to analyze the structure of human language and to transform it into a form that computers can work with. We chose Haskell as the main programming language for this book.
by Igor Bolshakov, Alexander Gelbukh
The book focuses on the basic set of ideas and facts from the fundamental science necessary for the creation of intelligent language processing tools, without going deeply into the details of specific algorithms or toy systems.
by Shuly Wintner - ESSLLI
This text is a mild introduction to Formal Language Theory for students with little or no background in formal systems. The motivation is Natural Language Processing, and the presentation is geared towards NLP applications, with extensive examples.
by A. Aliseda, R. van Glabbeek, D. Westerstahl - CSLI
This book presents recent research at the interface of logic, language, and computation, with applications to artificial intelligence and machine learning. It contains contributions to the logical and computational analysis of natural language.