Discriminating Fake and Real Smiles Using Electroencephalogram Signals with Convolutional Neural Networks

Mostafa M. Moussa, Usman Tariq, Fares Al-Shargie, Hasan Al-Nashash

Research output: Contribution to journal › Article › peer-review


Abstract

The genuineness of smiles is of particular interest in the study of human emotions and social interactions. In this work, we develop an experimental protocol to elicit genuine and fake smile expressions in 28 healthy subjects. We then assess the type of smile expression from electroencephalogram (EEG) signals using convolutional neural networks (CNNs). Five different architectures (CNN1, CNN2, CNN3, CNN4, and CNN5) were examined to differentiate between fake and real smiles. We transform the temporal EEG signals into normalized gray-scale images and perform subject-dependent, three-way classification of fake smiles, genuine smiles, and neutral expressions. We achieved the highest classification accuracy of 90.4% using CNN1 on the full EEG spectrum. Likewise, we achieved classification accuracies of 87.4%, 88.3%, 89.7%, and 90.0% using the Beta, Alpha, Theta, and Delta EEG bands, respectively. This paper suggests that CNN models, widely used in image classification problems, can provide an alternative approach for smile detection from physiological signals such as the EEG.
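
To illustrate the kind of pipeline the abstract describes, the sketch below maps a windowed EEG segment to a normalized gray-scale image and feeds it to a small CNN with three output classes (fake smile, genuine smile, neutral). This is not the authors' exact architecture; the channel count, window length, layer sizes, and the names eeg_to_grayscale and SmileCNN are illustrative assumptions.

```python
# Minimal sketch, assuming a (channels, samples) EEG window and a 3-class CNN.
# Not the published CNN1-CNN5 architectures; hyperparameters are placeholders.
import numpy as np
import torch
import torch.nn as nn

def eeg_to_grayscale(segment: np.ndarray) -> np.ndarray:
    """Min-max normalize a (channels, samples) EEG segment to [0, 1]."""
    lo, hi = segment.min(), segment.max()
    return (segment - lo) / (hi - lo + 1e-8)

class SmileCNN(nn.Module):
    """Small CNN for three-way classification (fake / genuine / neutral)."""
    def __init__(self, n_channels: int = 14, n_samples: int = 128, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Infer the flattened feature size from a dummy forward pass.
        with torch.no_grad():
            flat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(flat, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Example: one window of 14-channel EEG, 128 samples (assumed values).
segment = np.random.randn(14, 128).astype(np.float32)
image = eeg_to_grayscale(segment)                       # gray-scale "image" in [0, 1]
x = torch.from_numpy(image).unsqueeze(0).unsqueeze(0)   # (batch, 1, channels, samples)
logits = SmileCNN()(x)                                  # scores for the 3 classes
print(logits.shape)                                     # torch.Size([1, 3])
```

For the per-band results quoted above, the same pipeline would be applied after band-pass filtering the raw EEG into the Delta, Theta, Alpha, or Beta range before image conversion.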

Original language: British English
Pages (from-to): 81020-81030
Number of pages: 11
Journal: IEEE Access
Volume: 10
DOIs
State: Published - 2022

Keywords

  • convolutional neural networks (CNNs)
  • electroencephalogram (EEG)
  • emotion
  • machine learning
  • smile
