Generalized regression neural networks with multiple-bandwidth sharing and hybrid optimization

John Y. Goulermas, Xiao Jun Zeng, Panos Liatsis, Jason F. Ralph

Research output: Contribution to journal › Article › peer-review

25 Scopus citations


This paper proposes a novel algorithm for function approximation that extends the standard generalized regression neural network. Instead of a single bandwidth for all the kernels, we employ a multiple-bandwidth configuration. However, unlike previous works that use clustering of the training data to reduce the number of bandwidths, we propose a distinct scheme that achieves a dramatic reduction in the number of bandwidths while preserving the required model complexity. In this scheme, the algorithm partitions the training patterns into groups, where all patterns within each group share the same bandwidth. Grouping relies on the analysis of local nearest-neighbor distance information around the patterns and on principal component analysis with fuzzy clustering. Furthermore, we use a hybrid optimization procedure combining a very efficient variant of the particle swarm optimizer and a quasi-Newton method for global optimization and locally optimal fine-tuning of the network bandwidths. Training is based on the minimization of a flexible adaptation of the leave-one-out validation error, which enhances the network's generalization. We test the proposed algorithm on real and synthetic datasets, and the results show that it exhibits competitive regression performance compared with other techniques.
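To make the two key ingredients of the abstract concrete, the sketch below implements a standard GRNN (Nadaraya-Watson) predictor in which each training pattern carries a bandwidth shared within its group, plus a plain leave-one-out MSE of the kind the paper's training criterion adapts. The grouping rule, bandwidth values, and toy data here are illustrative assumptions; the paper's actual grouping (nearest-neighbor distances, PCA, fuzzy clustering) and its hybrid PSO/quasi-Newton optimizer are not reproduced.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, bandwidths):
    """GRNN (Nadaraya-Watson) prediction with a per-pattern Gaussian
    bandwidth. Sharing one bandwidth within each group of patterns
    reduces the number of free parameters, as the paper proposes."""
    preds = []
    for x in X_query:
        d2 = np.sum((X_train - x) ** 2, axis=1)          # squared distances
        w = np.exp(-d2 / (2.0 * bandwidths ** 2))        # Gaussian kernels
        preds.append(np.dot(w, y_train) / (np.sum(w) + 1e-12))
    return np.array(preds)

def loo_mse(X, y, bandwidths):
    """Leave-one-out MSE: each pattern is predicted from all others.
    The paper minimizes a flexible adaptation of this criterion when
    tuning the bandwidths (optimizer omitted here)."""
    n = len(X)
    err = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        y_i = grnn_predict(X[mask], y[mask], X[i:i + 1], bandwidths[mask])[0]
        err += (y[i] - y_i) ** 2
    return err / n

# Toy example: two hypothetical groups of training patterns, each
# sharing a single bandwidth.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 1))
y = np.sin(3.0 * X[:, 0])
groups = (X[:, 0] > 0).astype(int)       # illustrative grouping rule
sigmas = np.array([0.2, 0.4])[groups]    # one sigma per group, shared
y_hat = grnn_predict(X, y, X, sigmas)
loo = loo_mse(X, y, sigmas)
```

In the paper, the bandwidth vector `sigmas` (one value per group rather than per pattern) is the quantity optimized globally by the particle swarm variant and fine-tuned locally by the quasi-Newton method.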

Original language: British English
Pages (from-to): 1434-1445
Number of pages: 12
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Issue number: 6
State: Published - Dec 2007


Keywords

  • Bandwidth reduction
  • Clustering
  • Function approximation
  • Generalized regression neural network (GRNN)
  • Local and global optimization
  • Neural nets
  • Pattern clustering


