Local binary pattern variants-based adaptive texture features analysis for posed and nonposed facial expression recognition

Maryam Sultana, Naeem Bhatti, Sajid Javed, Soon Ki Jung

Research output: Contribution to journal › Article › peer-review


Abstract

Facial expression recognition (FER) is an important task in many computer vision applications. The task becomes challenging when it requires the detection and encoding of both macro- and micropatterns of facial expressions. We present a two-stage texture feature extraction framework based on local binary pattern (LBP) variants and evaluate its significance in recognizing posed and nonposed facial expressions. We focus on the parametric limitations of the LBP variants and investigate their effect on optimal FER. The size of the local neighborhood is a key parameter in extracting LBP features from images. To make the LBP adaptive, we exploit the granulometric information of the facial images to select the local neighborhood size used to extract center-symmetric LBP (CS-LBP) features. Each two-stage texture representation therefore consists of an LBP variant combined with the adaptive CS-LBP features. Among the presented combinations, binarized statistical image features paired with the adaptive CS-LBP features achieve high FER rates. Evaluation shows that the adaptive texture features perform competitively with the nonadaptive features and outperform other state-of-the-art approaches.
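The following is a minimal sketch of the idea described in the abstract: choosing a local neighborhood size from image granulometry and then extracting CS-LBP codes at that scale. It is an illustrative reconstruction, not the authors' implementation; the function names, the square 8-neighborhood approximation, the opening sizes, and the threshold value are assumptions.

```python
# Illustrative sketch: granulometry-driven neighborhood size + CS-LBP extraction.
# Assumes grayscale intensities normalized to [0, 1]; borders are handled by
# wrap-around (np.roll) purely for brevity.
import numpy as np
from scipy import ndimage


def granulometric_radius(gray, max_size=7):
    """Pick a neighborhood radius from the granulometric pattern spectrum.

    Successive morphological openings with growing structuring elements yield
    a size distribution; the size that removes the most image "mass" is taken
    here as the dominant granule scale (an assumed selection rule).
    """
    areas = [gray.sum()]
    for s in range(1, max_size + 1):
        opened = ndimage.grey_opening(gray, size=(2 * s + 1, 2 * s + 1))
        areas.append(opened.sum())
    spectrum = -np.diff(np.asarray(areas, dtype=float))  # mass removed per size
    return int(np.argmax(spectrum)) + 1                  # dominant scale as radius


def cs_lbp(gray, radius, threshold=0.01):
    """Center-symmetric LBP: compare the 4 opposing neighbor pairs of an
    8-neighborhood at the given radius, giving a 4-bit (16-level) code per pixel."""
    g = gray.astype(float)
    # Half of the 8-neighborhood offsets; each is paired with its mirror image.
    offsets = [(-radius, 0), (-radius, radius), (0, radius), (radius, radius)]
    codes = np.zeros(g.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        n1 = np.roll(np.roll(g, dy, axis=0), dx, axis=1)    # neighbor
        n2 = np.roll(np.roll(g, -dy, axis=0), -dx, axis=1)  # opposing neighbor
        codes |= ((n1 - n2) > threshold).astype(np.uint8) << bit
    return codes


def adaptive_cs_lbp_histogram(gray):
    """Adaptive CS-LBP descriptor: radius from granulometry, 16-bin histogram."""
    r = granulometric_radius(gray)
    codes = cs_lbp(gray, r)
    hist, _ = np.histogram(codes, bins=16, range=(0, 16))
    return hist / hist.sum()
```

In a two-stage setup of the kind the abstract describes, a histogram such as this would be concatenated with the descriptor of another LBP variant (e.g., binarized statistical image features) before classification; the concatenation step is likewise an assumption here.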

Original language: British English
Article number: 053017
Journal: Journal of Electronic Imaging
Volume: 26
Issue number: 5
DOIs
State: Published - 1 Sep 2017

Keywords

  • adaptive texture features
  • facial expression recognition
  • granulometry
  • local binary pattern
  • local binary pattern variants
  • neighborhood size
