NAACL 2003

Factored Language Models and Generalized Parallel Backoff

We introduce factored language models (FLMs) and generalized parallel backoff (GPB). An FLM represents words as bundles of features (e.g., morphological classes, stems, data-driven clusters), and induces a probability model covering sequences of bundles rather than just words. GPB extends standard backoff to general conditional probability tables where variables may be of heterogeneous types, where no obvious natural (temporal) backoff order exists, and where multiple dynamic backoff strategies are allowed. These methodologies were implemented during the JHU 2002 workshop as extensions to the SRI language modeling toolkit. This paper provides initial perplexity results on both CallHome Arabic and on Penn Treebank Wall Street Journal articles. Notably, FLMs with GPB can produce bigrams with significantly lower perplexity, sometimes lower than highly-optimized baseline trigrams. In a multi-pass speech recognition context, where bigrams are used to create first-pass bigram l...
Jeff Bilmes, Katrin Kirchhoff
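The core idea in the abstract can be sketched in a few lines: when the full conditioning context is unseen or rare, back off along *several* parent-dropping paths in parallel and combine the results, rather than following one fixed temporal order. The toy corpus, the (word, stem) feature bundles, the count threshold, and the max-combination rule below are illustrative assumptions for a minimal sketch; the paper's actual implementation extends the SRILM toolkit and supports many more combination strategies.

```python
from collections import Counter

# Hypothetical toy corpus: each token is a feature bundle (word, stem).
corpus = [("walks", "walk"), ("walked", "walk"), ("walking", "walk"),
          ("talks", "talk"), ("talked", "talk")]

# Counts for P(word | prev_word, prev_stem) and its two backed-off parents.
trigram, ctx_full = Counter(), Counter()   # full context (prev_word, prev_stem)
bi_word, ctx_word = Counter(), Counter()   # drop prev_stem
bi_stem, ctx_stem = Counter(), Counter()   # drop prev_word
uni, total = Counter(), 0

for (pw, ps), (w, _) in zip(corpus, corpus[1:]):
    trigram[(pw, ps, w)] += 1; ctx_full[(pw, ps)] += 1
    bi_word[(pw, w)] += 1;     ctx_word[pw] += 1
    bi_stem[(ps, w)] += 1;     ctx_stem[ps] += 1
    uni[w] += 1;               total += 1

def p_gpb(w, pw, ps, threshold=1):
    """Generalized parallel backoff, max-combination variant (a sketch).

    If the full-context count meets the threshold, use the direct estimate.
    Otherwise back off along BOTH parent-dropping paths in parallel and
    take the maximum; fall through to the unigram if both paths are empty.
    Estimates are left unnormalized for clarity; a real implementation
    would renormalize (and discount) so probabilities sum to one.
    """
    if trigram[(pw, ps, w)] >= threshold:
        return trigram[(pw, ps, w)] / ctx_full[(pw, ps)]
    p1 = bi_word[(pw, w)] / ctx_word[pw] if ctx_word[pw] else 0.0
    p2 = bi_stem[(ps, w)] / ctx_stem[ps] if ctx_stem[ps] else 0.0
    backed = max(p1, p2)
    return backed if backed > 0 else uni[w] / total
```

The full context ("walks", "walk") was observed, so `p_gpb("walked", "walks", "walk")` returns the direct estimate; for the unseen context ("walked", "walk") with target "talks", the word-parent path yields zero but the stem-parent path still contributes, which is exactly the robustness the parallel paths buy over a single fixed backoff order.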
Added 31 Oct 2010
Updated 31 Oct 2010
Type Conference
Year 2003
Where NAACL
Authors Jeff Bilmes, Katrin Kirchhoff