Density-driven generalized regression neural networks (DD-GRNN) for function approximation

John Y. Goulermas, Panos Liatsis, Xiao Jun Zeng, Phil Cook

Research output: Contribution to journal › Article › peer-review

50 Scopus citations

Abstract

This paper proposes a new nonparametric regression method that combines generalized regression neural networks (GRNNs), density-dependent multiple kernel bandwidths, and regularization. The presented model is generic and replaces the very large number of individual bandwidths with a much smaller number of trainable weights that control the regression model. It depends on sets of extracted data density features, which reflect the density properties and distribution irregularities of the training data sets. We provide an efficient initialization scheme and a second-order algorithm to train the model, as well as an overfitting control mechanism based on Bayesian regularization. Numerical results show that the proposed network significantly reduces the computational demands of maintaining individual bandwidths, while providing function approximation accuracy competitive with existing methods.
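For background (this sketch is not drawn from the paper itself): a conventional GRNN is essentially the Nadaraya-Watson kernel estimator, here shown with a single global Gaussian bandwidth. The abstract's density-driven variant instead controls the kernel widths through a small set of trainable, density-dependent weights; the function and parameter names below are illustrative.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, bandwidth=0.5):
    """Single-bandwidth GRNN (Nadaraya-Watson) prediction.

    The DD-GRNN of the paper replaces the scalar `bandwidth`
    with density-dependent, trainable per-kernel widths.
    """
    # Squared Euclidean distances between every query and training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    # Gaussian kernel weights
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Prediction is the kernel-weighted average of the training targets
    return (w @ y_train) / w.sum(axis=1)
```

With a well-chosen bandwidth this recovers a smooth interpolant of the training targets; the paper's contribution is learning how the bandwidths should vary with local data density rather than fixing one global value.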

Original language: British English
Pages (from-to): 1683-1696
Number of pages: 14
Journal: IEEE Transactions on Neural Networks
Volume: 18
Issue number: 6
DOIs
State: Published - Nov 2007

Keywords

  • Density based
  • Function approximation
  • Generalized regression neural network (GRNN)
  • Regularization
