Probabilistic Models in the Study of Language
by Roger Levy
Publisher: University of California, San Diego 2012
Number of pages: 274
A textbook on the topic of using probabilistic models in scientific work on language ranging from experimental data analysis to corpus work to cognitive modeling. The intended audience is graduate students in linguistics, psychology, cognitive science, and computer science who are interested in using probabilistic models to study language.
Prolog and Natural-Language Analysis
by F. C. N. Pereira, S. M. Shieber - Center for the Study of Language and Information
A concise introduction to logic programming and the logic-programming language Prolog both as vehicles for understanding elementary computational linguistics and as tools for implementing the basic components of natural-language-processing systems.
Machine Translation: An Introductory Guide
by Doug Arnold, et al. - Blackwell Pub
This introductory book looks at all aspects of Machine Translation: what it is like to use a modern Machine Translation system, how such systems work, how they are evaluated, and more.
Natural Language Processing with Python
by Steven Bird, Ewan Klein, Edward Loper - O'Reilly Media
This book offers a highly accessible introduction to natural language processing, the field that supports a variety of language technologies. With it, you'll learn how to write Python programs that work with large collections of unstructured text.
by Rob Malouf, Miles Osborne - ESSLLI
This text provides an introduction to the maximum entropy principle and the construction of maximum entropy models for natural language processing. The authors investigate the implementation of maximum entropy models for attribute-value grammars.