Please use this identifier to cite or link to this item: http://hdl.handle.net/1783.1/1514

Constructive algorithms for structure learning in feedforward neural networks

Authors: Kwok, Tin-Yau
Issue Date: 1996
Summary: In recent years, multi-layer feedforward neural networks have been widely used for pattern classification, function approximation, and regression problems. Methods based on the standard back-propagation learning algorithm perform gradient descent only in the weight space of a network with a fixed topology. Recently, various researchers have investigated approaches that alter the network topology as learning proceeds. In this thesis, I concentrate on constructive algorithms for structure learning in feedforward neural networks for regression problems. The basic idea of a constructive algorithm is to start with a small network and then add hidden units and weights incrementally until a satisfactory solution is found. Constructive algorithms must overcome several hurdles:
1. How should the new hidden unit be trained?
2. Can the constructive algorithm produce a network function that approximates an arbitrary target function to any desired accuracy?
3. How should the complexity of the new hidden unit be controlled?
To address these issues, I develop a number of objective functions for training new hidden units. I also examine the theoretical convergence properties of a number of constructive algorithms in which hidden units are added one by one in a greedy manner. Moreover, I study how integrating Bayesian regularization with constructive algorithms can improve network performance. The approach is promising in that the regularization parameters can be controlled automatically, in a disciplined manner, without manual tuning.
Note: Thesis (Ph.D.)--Hong Kong University of Science and Technology, 1996
Language: English
Format: Thesis
Files in this item:
File                Description    Size     Format
th_redirect.html                   341 B    HTML