Fast constructive-covering approach for neural networks

Di Wang, Narendra S. Chaudhari, Jagdish Chandra Patra

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

Abstract

We propose a fast training algorithm called the Fast Constructive-Covering Approach (FCCA) for neural network construction based on geometrical expansion. Parameters are updated according to the geometrical location of the training samples in the input space, and each sample in the training set is learned only once. As a result, FCCA avoids iterative retraining and is much faster than traditional training algorithms. Given an input sequence in arbitrary order, FCCA learns 'easy' samples first, after which the 'confusing' samples become easy to learn. This sample reordering is done on the fly based on a geometrical criterion. A comparison of this method with several other methods on the well-known Iris data set is given.
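The paper itself is not reproduced in this record, so the following is only a rough Python sketch of the general covering idea suggested by the abstract: hyperspherical covers grown in a single pass over the data, with a sample skipped when an existing same-class cover already contains it. It is not the authors' actual FCCA procedure; the function names, the `default_radius` parameter, and the radius-shrinking rule are illustrative assumptions.

```python
import numpy as np

def covering_train(X, y, default_radius=1.0):
    """Single-pass covering sketch: one hyperspherical unit per uncovered sample.

    Radii are chosen geometrically (an assumed heuristic): a new cover is
    shrunk so it stays clear of the nearest centre carrying a different label.
    Each training sample is visited exactly once.
    """
    centres, radii, labels = [], [], []
    for x, label in zip(X, y):
        x = np.asarray(x, dtype=float)
        # 'Easy' sample: already inside a same-class cover, nothing to learn.
        if any(l == label and np.linalg.norm(x - c) <= r
               for c, r, l in zip(centres, radii, labels)):
            continue
        # Otherwise create a new cover, kept away from other-class centres.
        other = [np.linalg.norm(x - c)
                 for c, l in zip(centres, labels) if l != label]
        radius = min(default_radius, 0.5 * min(other)) if other else default_radius
        centres.append(x)
        radii.append(radius)
        labels.append(label)
    return centres, radii, labels

def covering_predict(x, centres, radii, labels):
    """Label of the cover whose surface is closest to the query point."""
    x = np.asarray(x, dtype=float)
    dists = [np.linalg.norm(x - c) - r for c, r in zip(centres, radii)]
    return labels[int(np.argmin(dists))]
```

On a data set such as Iris (the benchmark mentioned in the abstract), `covering_train(X, y)` would be called once over the training samples, after which `covering_predict` classifies new points; the single pass is what makes this family of methods fast compared with iterative gradient-based training.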

Original language: British English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2005
Pages: 2167-2172
Number of pages: 6
DOIs
State: Published - 2005
Event: International Joint Conference on Neural Networks, IJCNN 2005 - Montreal, QC, Canada
Duration: 31 Jul 2005 - 4 Aug 2005

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 4

Conference

Conference: International Joint Conference on Neural Networks, IJCNN 2005
Country/Territory: Canada
City: Montreal, QC
Period: 31/07/05 - 4/08/05
