Attention Shifting for Parsing Speech

We present a technique that improves the efficiency of word-lattice parsing as used in speech recognition language modeling. Our technique applies a probabilistic parser iteratively, focusing on a different subset of the word-lattice on each iteration. The parser's attention is shifted toward word-lattice subsets for which few or no syntactic analyses have been posited. This attention-shifting technique yields a six-fold increase in speed (measured as the number of parser analyses evaluated) while performing equivalently when used as the first stage of a multi-stage parsing-based language model.
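
Read as an algorithm, the abstract describes a simple iterative loop. The following is a minimal sketch of that attention-shifting idea, under assumed, simplified data structures (lattice arcs as plain ids, a parse_subset callable standing in for the probabilistic word-lattice parser); it illustrates the loop only and is not the authors' implementation.

# Hedged sketch of the attention-shifting loop described in the abstract.
# Arc, Analysis, and parse_subset are hypothetical simplifications,
# not the authors' data structures or parser.

from typing import Callable, List, Set

Arc = int                      # an edge in the word lattice
Analysis = Set[Arc]            # the set of lattice arcs one parse covers


def attention_shifting_parse(
    lattice_arcs: Set[Arc],
    parse_subset: Callable[[Set[Arc]], List[Analysis]],
    max_iterations: int = 10,
) -> List[Analysis]:
    """Iteratively parse a word lattice, shifting the parser's attention
    on each pass toward arcs that no analysis has covered yet."""
    focus = set(lattice_arcs)          # first pass considers the whole lattice
    analyses: List[Analysis] = []

    for _ in range(max_iterations):
        # Run the (stand-in) probabilistic parser on the current focus only.
        analyses.extend(parse_subset(focus))

        # Arcs not yet covered by any posited analysis.
        covered: Set[Arc] = set().union(*analyses) if analyses else set()
        uncovered = lattice_arcs - covered
        if not uncovered:
            break                      # every arc has at least one analysis

        # Shift attention: the next iteration focuses on the gaps.
        focus = uncovered

    return analyses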
Keith B. Hall, Mark Johnson
Type: Conference
Year: 2004
Where: ACL
Authors: Keith B. Hall, Mark Johnson