ICONIP
2007

RNN with a Recurrent Output Layer for Learning of Naturalness

Abstract – The behavior of recurrent neural networks with a recurrent output layer (ROL) is described mathematically, and it is shown that using an ROL is not only advantageous but in fact crucial to obtaining satisfactory performance for the proposed naturalness learning. Conventional belief holds that employing an ROL often substantially degrades a network's performance or renders it unstable, so the ROL is rarely used. The objective of this paper is to demonstrate that there are cases where using an ROL is necessary. A concrete example models the naturalness of handwritten letters.

Keywords – recurrent output layer, RNN, ESN, naturalness learning, handwritten letters
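The ROL idea described in the abstract, an output layer whose activations are fed back into the recurrent state, can be sketched as an echo state network (ESN) with output feedback. The sketch below is illustrative only, not the authors' setup: the reservoir size, the sine teacher signal, the spectral-radius scaling, and the ridge regularizer are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_res, n_out = 50, 1                        # assumed reservoir/output sizes
W = rng.uniform(-1, 1, (n_res, n_res))      # reservoir recurrence
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1
W_fb = rng.uniform(-1, 1, (n_res, n_out))   # output-to-reservoir feedback (the ROL)

T = 300
teacher = np.sin(0.2 * np.arange(T))[:, None]    # toy target signal (assumed)

# Teacher-forced state collection: the desired output is fed back each step.
X = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(1, T):
    x = np.tanh(W @ x + W_fb @ teacher[t - 1])
    X[t] = x

# Linear readout trained by ridge regression on post-washout states.
washout = 50
A, Y = X[washout:], teacher[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ Y)

# Free-running generation: now the network's OWN output is fed back,
# which is what makes the output layer recurrent.
x = X[-1].copy()
y = teacher[-1].copy()
preds = []
for _ in range(20):
    x = np.tanh(W @ x + W_fb @ y)
    y = W_out.T @ x
    preds.append(y.copy())
preds = np.array(preds)
print(preds.shape)  # → (20, 1)
```

During training the teacher signal stands in for the fed-back output (teacher forcing); at generation time the loop is closed through `W_fb`, so any readout error re-enters the state, which is why stability of this feedback path is the central concern the abstract raises.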
Ján Dolinský, Hideyuki Takagi
Added: 29 Oct 2010
Updated: 29 Oct 2010
Type: Conference
Year: 2007
Where: ICONIP