Learning random walk models for inducing word dependency distributions

Many NLP tasks rely on accurately estimating word dependency probabilities P(w1|w2), where the words w1 and w2 have a particular relationship (such as verb-object). Because of the sparseness of counts of such dependencies, smoothing and the ability to use multiple sources of knowledge are important challenges. For example, if the probability P(N|V) of noun N being the subject of verb V is high, and a verb V' takes similar objects to V and is synonymous to V, then we want to conclude that P(N|V') should also be reasonably high, even when those words did not cooccur in the training data. To capture these higher-order relationships, we propose a Markov chain model whose stationary distribution is used to give word probability estimates. Unlike the manually defined random walks used in some link analysis algorithms, we show how to automatically learn a rich set of parameters for the Markov chain's transition probabilities. We apply this model to the task of prepositional phrase attachment...
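
To make the abstract's idea concrete, the following is a minimal sketch, not the authors' implementation: it estimates P(w1|w2) from the stationary distribution of a random walk over words that mixes several "link type" transition matrices (e.g., observed verb-object counts, synonymy), conditioning on w2 via a personalized-PageRank-style restart. All names, mixture weights, and toy data are hypothetical, and the restart formulation is a stand-in for the paper's exact chain; the paper learns the link-type weights from data, while here they are fixed by hand.

import numpy as np

# Vocabulary and two hypothetical link types over it.
vocab = ["eat", "devour", "pizza", "salad"]
idx = {w: i for i, w in enumerate(vocab)}

def row_normalize(m):
    """Turn a nonnegative matrix into a row-stochastic transition matrix."""
    sums = m.sum(axis=1, keepdims=True)
    sums[sums == 0] = 1.0  # leave all-zero rows (no outgoing links) untouched
    return m / sums

# Link type 1: empirical verb -> object co-occurrence counts (toy numbers).
cooc = np.zeros((4, 4))
cooc[idx["eat"], idx["pizza"]] = 5
cooc[idx["eat"], idx["salad"]] = 3

# Link type 2: synonymy between verbs (symmetric toy links).
syn = np.zeros((4, 4))
syn[idx["eat"], idx["devour"]] = 1
syn[idx["devour"], idx["eat"]] = 1

links = [row_normalize(cooc), row_normalize(syn)]

# Per-link-type mixture weights; the paper *learns* these,
# here we simply fix plausible values for illustration.
weights = np.array([0.7, 0.3])

T = sum(w * m for w, m in zip(weights, links))
# Small self-loop so every row is stochastic even with no outgoing links.
T = row_normalize(T + 0.01 * np.eye(len(vocab)))

def stationary_from(start_word, restart=0.3, iters=100):
    """Stationary distribution of a walk that restarts at start_word
    with probability `restart` at each step (personalized-PageRank style)."""
    e = np.zeros(len(vocab))
    e[idx[start_word]] = 1.0
    p = e.copy()
    for _ in range(iters):
        p = restart * e + (1 - restart) * (p @ T)
    return p

# "devour" never cooccurred with "pizza", but the walk reaches it
# through the synonymy link to "eat", so P(pizza | devour) > 0.
p = stationary_from("devour")
print({w: round(float(p[idx[w]]), 3) for w in vocab})

The smoothing effect is visible in the output: the conditional estimate for an unseen verb-object pair is nonzero because probability mass flows through intermediate link types, which is exactly the higher-order inference the abstract describes.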
Added: 17 Nov 2009
Updated: 17 Nov 2009
Type: Conference
Year: 2004
Where: ICML
Authors: Kristina Toutanova, Christopher D. Manning, Andrew Y. Ng