Connectionist temporal classification: labelling unsegmented sequence data with recurrent neural networks

Many real-world sequence learning tasks require the prediction of sequences of labels from noisy, unsegmented input data. In speech recognition, for example, an acoustic signal is transcribed into words or sub-word units. Recurrent neural networks (RNNs) are powerful sequence learners that would seem well suited to such tasks. However, because they require pre-segmented training data, and post-processing to transform their outputs into label sequences, their applicability has so far been limited. This paper presents a novel method for training RNNs to label unsegmented sequences directly, thereby solving both problems. An experiment on the TIMIT speech corpus demonstrates its advantages over both a baseline HMM and a hybrid HMM-RNN.
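The abstract's key idea is that the network is trained to label unsegmented input directly: the RNN emits a per-frame symbol (including a special blank), and CTC defines a many-to-one map that collapses repeated symbols and removes blanks to yield the final label sequence. A minimal sketch of that collapsing map, with an assumed `"-"` blank symbol:

```python
BLANK = "-"  # assumed blank symbol; the paper uses an extra "no label" output

def collapse(path):
    """Collapse a frame-level path into a label sequence:
    merge consecutive repeats, then drop blanks (CTC's B mapping)."""
    out = []
    prev = None
    for symbol in path:
        if symbol != prev and symbol != BLANK:
            out.append(symbol)
        prev = symbol
    return out

# Many frame-level paths map to the same labelling:
# collapse(list("--cc-aa-t-")) and collapse(list("caat")) both give ["c", "a", "t"].
```

Note that a blank between two identical symbols keeps them distinct (`"c-c"` collapses to `["c", "c"]`), which is how repeated labels are representable at all; training then sums probability over all paths that collapse to the target sequence.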
Added 17 Nov 2009
Updated 17 Nov 2009
Type Conference
Year 2006
Where ICML
Authors Alex Graves, Faustino J. Gomez, Jürgen Schmidhuber, Santiago Fernández