TY - JOUR
T1 - Discriminating Fake and Real Smiles Using Electroencephalogram Signals with Convolutional Neural Networks
AU - Moussa, Mostafa M.
AU - Tariq, Usman
AU - Al-Shargie, Fares
AU - Al-Nashash, Hasan
N1 - Funding Information:
This work was supported in part by the Research Grant EFRG18-BBR-CEN-02, Grant EFRG-EN0244, and Grant FRG20; and in part by the Open Access Program Grant through the American University of Sharjah, United Arab Emirates, under Grant OAP22-CEN-102.
Publisher Copyright:
© 2013 IEEE.
PY - 2022
Y1 - 2022
N2 - Genuineness of smiles is of particular interest in the study of human emotions and social interactions. In this work, we develop an experimental protocol to elicit genuine and fake smile expressions from 28 healthy subjects. We then assess the type of smile expression using electroencephalogram (EEG) signals with convolutional neural networks (CNNs). Five different architectures (CNN1, CNN2, CNN3, CNN4, and CNN5) were examined to differentiate between fake and real smiles. We transform the temporal EEG signals into normalized gray-scale images and perform subject-dependent three-way classification of fake smiles, genuine smiles, and neutral expressions. We achieved the highest classification accuracy of 90.4% using CNN1 on the full EEG spectrum. Likewise, we achieved classification accuracies of 87.4%, 88.3%, 89.7%, and 90.0% using the Beta, Alpha, Theta, and Delta EEG bands, respectively. This paper suggests that CNN models, widely used in image classification problems, can provide an alternative approach to smile detection from physiological signals such as the EEG.
AB - Genuineness of smiles is of particular interest in the study of human emotions and social interactions. In this work, we develop an experimental protocol to elicit genuine and fake smile expressions from 28 healthy subjects. We then assess the type of smile expression using electroencephalogram (EEG) signals with convolutional neural networks (CNNs). Five different architectures (CNN1, CNN2, CNN3, CNN4, and CNN5) were examined to differentiate between fake and real smiles. We transform the temporal EEG signals into normalized gray-scale images and perform subject-dependent three-way classification of fake smiles, genuine smiles, and neutral expressions. We achieved the highest classification accuracy of 90.4% using CNN1 on the full EEG spectrum. Likewise, we achieved classification accuracies of 87.4%, 88.3%, 89.7%, and 90.0% using the Beta, Alpha, Theta, and Delta EEG bands, respectively. This paper suggests that CNN models, widely used in image classification problems, can provide an alternative approach to smile detection from physiological signals such as the EEG.
KW - convolutional neural networks (CNNs)
KW - electroencephalogram (EEG)
KW - emotion
KW - machine learning
KW - Smile
UR - http://www.scopus.com/inward/record.url?scp=85135761601&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2022.3195028
DO - 10.1109/ACCESS.2022.3195028
M3 - Article
AN - SCOPUS:85135761601
SN - 2169-3536
VL - 10
SP - 81020
EP - 81030
JO - IEEE Access
JF - IEEE Access
ER -