Abstract
This paper proposes a new nonparametric regression method based on the combination of generalized regression neural networks (GRNNs), density-dependent multiple kernel bandwidths, and regularization. The presented model is generic and replaces the very large number of bandwidths with a much smaller number of trainable weights that control the regression model. It depends on sets of extracted data density features that reflect the density properties and distribution irregularities of the training data sets. We provide an efficient initialization scheme and a second-order algorithm to train the model, as well as an overfitting control mechanism based on Bayesian regularization. Numerical results show that the proposed network significantly reduces the computational demands of maintaining individual bandwidths while providing function approximation accuracy competitive with existing methods.
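As context for the abstract, the baseline being improved upon is the standard GRNN (Nadaraya-Watson) estimator, where each training sample can carry its own kernel bandwidth; the paper's contribution is to replace these per-sample bandwidths with far fewer trainable weights. The following is a minimal sketch of that baseline only, not the paper's method: a Gaussian-kernel GRNN prediction with per-sample bandwidths. The function name and array shapes are illustrative assumptions, not from the paper.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, bandwidths):
    """Baseline GRNN (Nadaraya-Watson) prediction with a Gaussian kernel.

    bandwidths: one sigma per training sample, shape (n,). These are the
    'individual bandwidths' whose cost the paper's density-feature
    parameterization is designed to avoid.
    """
    # Squared Euclidean distances, shape (m_queries, n_train).
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel weights, each column scaled by its own bandwidth.
    w = np.exp(-d2 / (2.0 * bandwidths[None, :] ** 2))
    # Weighted average of training targets (Nadaraya-Watson estimate).
    return (w @ y_train) / w.sum(axis=1)
```

With very small bandwidths the estimator interpolates the training targets, while very large bandwidths shrink every prediction toward the mean of `y_train`; tuning one bandwidth per sample is what makes the naive scheme computationally demanding.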
Original language | British English |
---|---|
Pages (from-to) | 1683-1696 |
Number of pages | 14 |
Journal | IEEE Transactions on Neural Networks |
Volume | 18 |
Issue number | 6 |
DOIs | |
State | Published - Nov 2007 |
Keywords
- Density based
- Function approximation
- Generalized regression neural network (GRNN)
- Regularization