A comparison of architectural varieties in Radial Basis Function Neural Networks

Representation of knowledge within a neural model is an active field of research concerned with the development of alternative structures, training algorithms, learning modes and applications. Radial Basis Function Neural Networks (RBFNNs) constitute an important part of neural network research, as their operating principle is to discover and exploit similarities between an input vector and a feature vector. In this paper, we compare nine architectures in terms of learning performance. The Levenberg-Marquardt (LM) technique is coded for every individual configuration, and the model augmented with a linear part is seen to achieve the lowest final mean squared error in almost all experiments. Furthermore, according to the results, this model rarely gets trapped in local minima. Overall, this paper presents clear and concise comparisons among the nine architectures, and this constitutes its major contribution.
Added 31 May 2010
Updated 31 May 2010
Type Conference
Year 2008
Where IJCNN
Authors Mehmet Önder Efe, Cosku Kasnakoglu