Objective functions for training new hidden units in constructive neural networks

Authors Kwok, Tin Yau
Yeung, Dit Yan
Issue Date 1997
Source IEEE Transactions on Neural Networks, vol. 8, no. 5, Sep. 1997, pp. 1131-1148
Summary In this paper, we study a number of objective functions for training new hidden units in constructive algorithms for multilayer feedforward networks. The aim is to derive a class of objective functions whose computation, together with the corresponding weight updates, can be done in O(N) time, where N is the number of training patterns. Moreover, even though input weight freezing is applied during the process for computational efficiency, the convergence property of the constructive algorithms using these objective functions is still preserved. We also propose a few computational tricks that can be used to improve the optimization of the objective functions in practical situations. The relative performance of these objective functions on a set of two-dimensional regression problems is also discussed.
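To illustrate the kind of objective the abstract refers to, here is a minimal sketch of a cascade-correlation-style criterion: a candidate hidden unit is scored by the covariance between its output and the network's current residual error, and both the score and its gradient with respect to the candidate's input weights cost one O(N) pass over the data. This is an assumption for illustration only, not the specific class of objective functions derived in the paper; the function name and toy data below are hypothetical.

```python
import numpy as np

def candidate_score_and_grad(w, X, residual):
    """Covariance-style score for a candidate hidden unit (illustrative).

    w: input weights of the candidate unit, shape (d,)
    X: training inputs, shape (N, d)
    residual: current network error on the N training patterns, shape (N,)
    Returns the covariance score and the gradient of |score| w.r.t. w;
    both are computed in a single O(N) pass over the patterns.
    """
    n = len(residual)
    v = np.tanh(X @ w)                 # candidate unit outputs, O(N*d)
    vc = v - v.mean()                  # center the outputs
    ec = residual - residual.mean()    # center the residuals
    cov = vc @ ec / n                  # covariance objective, O(N)
    dv = 1.0 - v ** 2                  # tanh derivative at each pattern
    # Gradient of |cov| w.r.t. w (the centering term vanishes because
    # the centered residuals ec sum to zero), again one pass over the data.
    grad = np.sign(cov) * (X.T @ (dv * ec)) / n
    return cov, grad

# Toy usage: one gradient-ascent step on a random candidate unit.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
residual = rng.normal(size=100)
w = rng.normal(size=3)
score, grad = candidate_score_and_grad(w, X, residual)
w_new = w + 0.1 * grad
```

Because the gradient only involves sums over the N patterns, training the candidate unit (with all other weights frozen) keeps each optimization step at O(N) cost, which is the efficiency property the abstract emphasizes.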
ISSN 1045-9227
Rights © 1997 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE. This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.
Language English
Format Article
Files in this item:
tnn97b.pdf (509,092 B, Adobe PDF)