Hyperbox-based machine learning algorithms: a comprehensive survey

Thanh Tung Khuat, Dymitr Ruta, Bogdan Gabrys

Research output: Contribution to journal › Article › peer-review

17 Scopus citations


With the rapid development of digital information, the volume of data generated by humans and machines is growing exponentially. Along with this trend, machine learning algorithms have emerged and evolved continuously to discover new information and knowledge from diverse data sources. Learning algorithms that use hyperboxes as fundamental representational and building blocks form one branch of machine learning methods. These algorithms have enormous potential for high scalability and for online adaptation of predictors built on hyperbox data representations to dynamically changing environments and streaming data. This paper aims to give a comprehensive survey of the literature on hyperbox-based machine learning models. According to the architecture and characteristic features of the resulting models, the existing hyperbox-based learning algorithms can be grouped into three major categories: fuzzy min–max neural networks, hyperbox-based hybrid models, and other algorithms based on hyperbox representations. For each of these groups, this paper gives a brief description of the structure of the models and the associated learning algorithms, together with an analysis of their advantages and drawbacks. The main applications of these hyperbox-based models to real-world problems are also described. Finally, we discuss some open problems and identify potential future research directions in this field.
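To illustrate the hyperbox representation at the core of these models, the sketch below implements a fuzzy membership function in the style of Simpson's fuzzy min–max networks, which the survey covers as its first category. The parameter names (`v` and `w` for the box's min and max corners, `gamma` for the sensitivity of the membership decay) are illustrative assumptions, not the notation of any particular paper:

```python
import numpy as np

def membership(x, v, w, gamma=1.0):
    """Fuzzy membership of point x in the hyperbox [v, w].

    Returns 1.0 when x lies inside the box, and decays linearly
    (at rate gamma) with the per-dimension distance outside it --
    a sketch following the form of Simpson-style fuzzy min-max
    membership functions.
    """
    x, v, w = map(np.asarray, (x, v, w))
    # Penalty for exceeding the box's max corner in each dimension:
    over = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, x - w)))
    # Penalty for falling below the box's min corner in each dimension:
    under = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, v - x)))
    # Average the two one-sided memberships over all dimensions:
    return float(np.mean((over + under) / 2.0))

# A point inside the box has full membership:
print(membership([0.3, 0.5], v=[0.2, 0.4], w=[0.6, 0.7]))  # → 1.0
# Membership falls off as the point moves outside the box:
print(membership([0.9, 0.5], v=[0.2, 0.4], w=[0.6, 0.7]))
```

In a fuzzy min–max classifier, each class is covered by a set of such hyperboxes; a new point is assigned to the class of the box with the highest membership, and online learning expands or contracts boxes as examples arrive.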

Original language: British English
Pages (from-to): 1325-1363
Number of pages: 39
Journal: Soft Computing
Issue number: 2
State: Published - Jan 2021


Keywords:

  • Clustering
  • Data classification
  • Fuzzy min–max neural network
  • Hybrid classifiers
  • Hyperboxes
  • Membership function
  • Online learning

