
INTERSPEECH 2010

On the relation of Bayes risk, word error, and word posteriors in ASR

In automatic speech recognition, we are faced with a well-known inconsistency: the Bayes decision rule is usually used to minimize sentence (word sequence) error, whereas in practice we want to minimize word error, which is also the usual evaluation measure. Recently, a number of speech recognition approaches were proposed that approximate the Bayes decision rule with a word error (Levenshtein/edit distance) cost. Nevertheless, experiments show that the decisions often remain the same and that the effect on the word error rate is limited, especially at low error rates. In this work, further analytic evidence for these observations is provided. A set of conditions is presented under which the Bayes decision rule with sentence error cost and with word error cost leads to the same decisions. Furthermore, the case of word error cost is investigated and related to word posterior probabilities. The analytic results are verified experimentally on several large vocabulary speech recognition tasks.
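For orientation, the two decision rules the abstract contrasts can be sketched in standard notation; this is not the paper's own derivation, and the symbols (X for the acoustic observation, W for a word-sequence hypothesis, Lev for the Levenshtein distance, slot index n) are assumptions for illustration. The specific conditions under which the two rules coincide are the paper's contribution and are not reproduced here.

% Bayes decision rule with sentence-error (0-1) cost: the standard MAP rule
\hat{W}_{\mathrm{sent}} = \operatorname*{argmax}_{W} \; p(W \mid X)

% Bayes decision rule with word-error (Levenshtein) cost:
% minimize the expected edit distance under the sentence posterior
\hat{W}_{\mathrm{word}} = \operatorname*{argmin}_{W} \; \sum_{W'} p(W' \mid X)\, \mathrm{Lev}(W, W')

% Word posterior, written schematically: sum the sentence posteriors of all
% hypotheses whose word aligned to slot n is w
p_n(w \mid X) = \sum_{W:\, w_n(W) = w} p(W \mid X)

With 0-1 (sentence-error) cost, the Bayes risk rule reduces to the MAP rule above; the abstract's observation is that, under the conditions derived in the paper, the Levenshtein-cost rule also yields the same decisions, and that this cost can be related to the word posteriors p_n(w | X).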
Added: 18 May 2011
Updated: 18 May 2011
Type: Conference
Year: 2010
Where: INTERSPEECH
Authors: Ralf Schlüter, Markus Nußbaum-Thom, Hermann Ney