Fast constructive-covering approach for neural networks

Di Wang, Narendra S. Chaudhari, Jagdish Chandra Patra

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


We propose a fast training algorithm called the Fast Constructive-Covering Approach (FCCA) for neural network construction based on geometrical expansion. Parameters are updated according to the geometrical location of the training samples in the input space, and each sample in the training set is learned only once. By doing this, FCCA avoids iterative training and is much faster than traditional training algorithms. Given an input sequence in an arbitrary order, FCCA learns 'easy' samples first, after which the 'confusing' samples become easy to learn. This sample reordering is done on the fly using geometrical concepts. We compare this method with several other methods on the well-known Iris data set.
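The abstract does not spell out the covering procedure, but the idea it describes (one pass over the samples, updates driven by geometric location, already-covered samples needing no further work) can be illustrated with a minimal constructive-covering sketch. This is a hypothetical simplification, not the authors' FCCA: covers are hyperspheres, the `default_radius` and half-distance shrinking rule are assumptions, and `train_covers`/`predict` are illustrative names.

```python
import numpy as np

def train_covers(X, y, default_radius=1.0):
    """One-pass constructive-covering sketch (hypothetical simplification
    of FCCA): each sample is visited exactly once; a sample already inside
    a same-class cover is 'easy' and needs no update, otherwise a new
    hypersphere cover is created whose radius stops short of covers of
    other classes."""
    covers = []  # list of (center, radius, label)
    for x, label in zip(X, y):
        if any(np.linalg.norm(x - c) <= r and l == label
               for c, r, l in covers):
            continue  # 'easy' sample: already covered, learned for free
        # Shrink the new cover so it excludes all opposite-class centers
        # (0.5 factor is an assumed safety margin).
        opposite = [np.linalg.norm(x - c) for c, _, l in covers if l != label]
        radius = 0.5 * min(opposite) if opposite else default_radius
        covers.append((x, radius, label))
    return covers

def predict(covers, x):
    # Nearest-cover rule: smallest signed distance to a cover boundary.
    c, r, label = min(covers, key=lambda cr: np.linalg.norm(x - cr[0]) - cr[1])
    return label
```

Under this sketch, samples that fall inside an existing same-class cover cost nothing, which mirrors the paper's point that 'easy' samples learned early make later samples cheap to absorb.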

Original language: British English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2005
Number of pages: 6
State: Published - 2005
Event: International Joint Conference on Neural Networks, IJCNN 2005 - Montreal, QC, Canada
Duration: 31 Jul 2005 - 4 Aug 2005

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks


Conference: International Joint Conference on Neural Networks, IJCNN 2005
City: Montreal, QC


