Maximizing global entropy reduction for active learning in speech recognition

We propose a new active learning algorithm to address the problem of selecting a limited subset of utterances for transcription from a large pool of unlabeled utterances so that the accuracy of the automatic speech recognition system can be maximized. Our algorithm differs from earlier work in that it uses a criterion that maximizes the lattice entropy reduction over the whole dataset. We introduce our criterion, show how it can be simplified and approximated, and describe the detailed algorithm to optimize it. We demonstrate the effectiveness of our new algorithm with directory assistance data collected under real usage scenarios and show that it consistently outperforms the confidence-based approach by a significant margin. Using the algorithm cuts the number of utterances that need to be transcribed by 50% to achieve the same recognition accuracy obtained with the confidence-based approach, and by 60% compared to the random sampling approach.
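A minimal Python sketch of the selection idea described above: a confidence-based baseline picks the utterances the recognizer is least sure about, while the global criterion greedily picks utterances whose transcription is expected to reduce the summed lattice entropy of the entire unlabeled pool. The scoring function expected_entropy_reduction is a hypothetical placeholder, not the paper's actual lattice-entropy formulation.

    # Illustrative sketch only: expected_entropy_reduction is a hypothetical
    # placeholder standing in for the paper's lattice-entropy-based score.
    import heapq
    import math
    from typing import Callable, Dict, List, Sequence


    def select_by_confidence(confidences: Dict[str, float], budget: int) -> List[str]:
        """Confidence-based baseline: transcribe the least-confident utterances."""
        return heapq.nsmallest(budget, confidences, key=confidences.get)


    def select_by_global_entropy_reduction(
        utterances: Sequence[str],
        expected_entropy_reduction: Callable[[str, str], float],
        budget: int,
    ) -> List[str]:
        """Greedy sketch of the global criterion: at each step pick the utterance
        whose transcription is expected to reduce the summed entropy of the
        whole unlabeled pool the most, not just its own uncertainty."""
        remaining = set(utterances)
        selected: List[str] = []
        for _ in range(min(budget, len(remaining))):
            best, best_gain = None, -math.inf
            for cand in remaining:
                # Aggregate the expected reduction over every unlabeled utterance;
                # acoustically similar utterances share the benefit of the new label.
                gain = sum(expected_entropy_reduction(cand, u) for u in remaining)
                if gain > best_gain:
                    best, best_gain = cand, gain
            selected.append(best)
            remaining.remove(best)
        return selected

The contrast is the point: the baseline scores each utterance in isolation, whereas the global criterion credits a candidate for the uncertainty it is expected to remove across the rest of the dataset.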
Type: Conference
Year: 2009
Where: ICASSP
Authors: Balakrishnan Varadarajan, Dong Yu, Li Deng, Alex Acero