A hierarchical Bayesian framework for multimodal active perception

João Filipe Ferreira, Miguel Castelo-Branco, Jorge Dias

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

In this article, we present a hierarchical Bayesian framework for multimodal active perception, devised to be emergent, scalable and adaptive. While not strictly neuromimetic, the framework finds its roots in the role of the dorsal perceptual pathway of the human brain. Its component models build upon a common spatial configuration that is a natural fit for integrating readings from multiple sensors using a Bayesian approach devised in previous work. The framework presented in this article is shown to adequately model human-like active perception behaviours, exhibiting the following desirable properties: high-level behaviour results from the low-level interaction of simpler building blocks; the Bayesian Programming formalism allows seamless integration of additional inputs; and the initial 'genetic imprint' of distribution parameters may be changed 'on the fly' through parameter manipulation, thus allowing the implementation of goal-dependent behaviours (i.e. top-down influences).
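The fusion idea the abstract describes can be sketched in a few lines. The following is an illustrative example only, not the authors' implementation: two sensor modalities (here hypothetically "vision" and "audition") are fused over a shared discrete spatial grid via P(x | z_v, z_a) ∝ P(x) · P(z_v | x) · P(z_a | x), assuming conditional independence of the modalities given the location x; the prior manipulation at the end stands in for the 'on the fly' top-down influence mentioned above. All variable names and likelihood values are invented for the example.

```python
# Illustrative sketch (not the paper's code): naive Bayesian fusion of two
# sensor modalities over a common discrete spatial grid.

def normalize(p):
    """Rescale a list of non-negative weights so they sum to 1."""
    s = sum(p)
    return [v / s for v in p]

def fuse(prior, *likelihoods):
    """Multiply a prior over grid cells by one likelihood per modality,
    then renormalize: P(x | z_1..z_n) ∝ P(x) * Π_i P(z_i | x)."""
    post = list(prior)
    for lik in likelihoods:
        post = [p * l for p, l in zip(post, lik)]
    return normalize(post)

# Uniform prior over a 5-cell grid (the initial 'genetic imprint').
prior = normalize([1.0] * 5)

# Hypothetical per-cell likelihoods: vision favours cell 2,
# audition is split between cells 2 and 3.
vision   = [0.05, 0.10, 0.60, 0.15, 0.10]
audition = [0.05, 0.10, 0.40, 0.40, 0.05]

posterior = fuse(prior, vision, audition)
peak = posterior.index(max(posterior))  # cell 2: the modalities agree there

# Top-down influence: manipulate the prior 'on the fly' to bias attention
# toward the right of the grid (a goal-dependent behaviour).
biased_prior = normalize([0.05, 0.05, 0.10, 0.40, 0.40])
biased_posterior = fuse(biased_prior, vision, audition)
```

Because each modality enters only as one more factor in the product, adding a sensor means adding one likelihood argument to `fuse` — a toy version of the "seamless integration of additional inputs" the Bayesian Programming formalism provides.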

Original language: British English
Pages (from-to): 172-190
Number of pages: 19
Journal: Adaptive Behavior
Volume: 20
Issue number: 3
DOIs
State: Published - Jun 2012

Keywords

  • adaptive behaviour
  • bioinspired robotics
  • emergence
  • hierarchical Bayes models
  • human-robot interaction
  • multimodal active perception
  • scalability

