An Introduction to Stochastic Attribute-Value Grammars
by Rob Malouf, Miles Osborne
Publisher: ESSLLI 2001
Number of pages: 159
This one-week course will provide an introduction to the maximum entropy principle and the construction of maximum entropy models for natural language processing. Through a combination of lectures and, as local computing facilities permit, hands-on lab exercises, students will investigate the implementation of maximum entropy models for attribute-value grammars, including such topics as ambiguity identification, feature selection, and model training and evaluation.
by Michael A. Covington - Prentice-Hall
Designed to bridge the gap for those who know Prolog but have no background in linguistics, this book concentrates on turning theories into practical techniques. Coverage includes template and keyword systems, definite clause grammars, and more.
by Edward Stabler - UCLA
What kind of computational device could use a system like a human language? This text explores the computational properties of devices that could compute morphological and syntactic analyses, and recognize semantic relations among sentences.
by F. C. N. Pereira, S. M. Shieber - Center for the Study of Language
A concise introduction to logic programming and the logic-programming language Prolog, both as vehicles for understanding elementary computational linguistics and as tools for implementing the basic components of natural-language-processing systems.
by A. L. Berger, S. A. Della Pietra, V. J. Della Pietra - Association for Computational Linguistics
The authors describe a method for statistical modeling based on maximum entropy. They present a maximum-likelihood approach for automatically constructing maximum entropy models and describe how to implement this approach efficiently.
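The core of such a maximum-likelihood construction can be illustrated with a small sketch: a conditional maximum entropy model p(y|x) ∝ exp(Σᵢ wᵢfᵢ(x, y)), trained by gradient ascent on the log-likelihood, where the gradient is the observed feature counts minus the model's expected feature counts. This is a toy illustration, not the authors' implementation; the features, data, and learning rate below are hypothetical, and practical systems use improved iterative scaling or second-order optimizers rather than plain gradient ascent.

```python
import math

# Toy conditional maximum entropy model with two classes and two
# hypothetical binary features of an (x, y) pair.
CLASSES = [0, 1]

def features(x, y):
    # f1 fires when the label matches the input; f2 fires when y == 1.
    return [1.0 if x == y else 0.0, 1.0 if y == 1 else 0.0]

def prob(x, y, w):
    # p(y|x) = exp(w . f(x, y)) / sum_c exp(w . f(x, c))
    scores = {c: math.exp(sum(wi * fi for wi, fi in zip(w, features(x, c))))
              for c in CLASSES}
    return scores[y] / sum(scores.values())

# Hypothetical training pairs (x, y).
data = [(0, 0), (0, 0), (1, 1), (1, 0)]

# Maximum-likelihood training by gradient ascent: the gradient of the
# log-likelihood is (observed counts) - (expected counts under the model).
w = [0.0, 0.0]
for _ in range(200):
    grad = [0.0, 0.0]
    for x, y in data:
        fy = features(x, y)
        for i in range(len(w)):
            grad[i] += fy[i]
            grad[i] -= sum(prob(x, c, w) * features(x, c)[i] for c in CLASSES)
    w = [wi + 0.1 * gi for wi, gi in zip(w, grad)]
```

After training, the model reproduces the empirical conditional distributions: it strongly prefers y = 0 when x = 0 (all such training pairs have y = 0), while remaining near-uniform for x = 1, where the training data is split.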