Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.11851/6058
Title: A Comparison of Architectural Varieties in Radial Basis Function Neural Networks
Authors: Efe, Mehmet Önder; Kasnakoğlu, Coşku
Keywords: [No Keywords]
Publisher: IEEE
Source: International Joint Conference on Neural Networks -- JUN 01-08, 2008 -- Hong Kong, PEOPLES R CHINA
Series/Report no.: IEEE International Joint Conference on Neural Networks (IJCNN)
Abstract: Representation of knowledge within a neural model is an active field of research concerned with the development of alternative structures, training algorithms, learning modes and applications. Radial Basis Function Neural Networks (RBFNNs) constitute an important part of neural networks research, as their operating principle is to discover and exploit similarities between an input vector and a feature vector. In this paper, we compare nine architectures in terms of learning performance. The Levenberg-Marquardt (LM) technique is coded for each individual configuration, and the model with a linear part augmentation is seen to perform better in terms of the final least mean squared error level in almost all experiments. Furthermore, according to the results, this model rarely gets trapped in local minima. Overall, this paper presents clear and concise figures of comparison among the nine architectures, which constitutes its major contribution.
URI: https://doi.org/10.1109/IJCNN.2008.4633768
     https://hdl.handle.net/20.500.11851/6058
ISBN: 978-1-4244-1820-6
ISSN: 2161-4393
Appears in Collections: Elektrik ve Elektronik Mühendisliği Bölümü / Department of Electrical & Electronics Engineering; Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection; WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
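The abstract describes an RBFNN whose output is augmented with a linear part, i.e. y(x) = sum_j w_j exp(-||x - c_j||^2 / (2 sigma_j^2)) + a^T x + b. The sketch below is only a minimal illustration of that architecture, not the authors' implementation: it assumes Gaussian basis functions, picks centers from the data, uses a single shared width heuristic, and fits just the output and linear-part weights by linear least squares, whereas the paper trains every configuration with a Levenberg-Marquardt routine coded per architecture.

```python
# Minimal sketch (assumption-laden, not the paper's code) of a Gaussian RBF
# network with a linear part augmentation: y(x) = w^T phi(x) + a^T x + b.
import numpy as np

def rbf_features(X, centers, widths):
    """Gaussian basis responses phi_j(x) = exp(-||x - c_j||^2 / (2 sigma_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def fit_rbf_with_linear_part(X, y, n_centers=10, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)]  # centers drawn from data (assumption)
    widths = np.full(n_centers, X.std() + 1e-8)                # shared width heuristic (assumption)
    Phi = rbf_features(X, centers, widths)
    # Augment the design matrix with the raw inputs and a bias -> linear part a^T x + b.
    A = np.hstack([Phi, X, np.ones((len(X), 1))])
    theta, *_ = np.linalg.lstsq(A, y, rcond=None)              # least squares, not LM (simplification)
    return centers, widths, theta

def predict(X, centers, widths, theta):
    A = np.hstack([rbf_features(X, centers, widths), X, np.ones((len(X), 1))])
    return A @ theta

if __name__ == "__main__":
    # Toy 1-D regression to exercise the sketch.
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 0]
    c, s, th = fit_rbf_with_linear_part(X, y, n_centers=12)
    print("training MSE:", np.mean((predict(X, c, s, th) - y) ** 2))
```

The augmented design matrix makes the role of the linear part explicit: the RBF units capture localized structure, while the a^T x + b term absorbs the globally linear trend, which is one plausible reading of why the augmented model in the paper reaches lower final mean squared error.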