TY - GEN
T1 - Ambiance Signal Processing
T2 - 5th International Conference on Web Research, ICWR 2019
AU - Bakhtiyari, Kaveh
AU - Taghavi, Mona
AU - Taghavi, Milad
AU - Bentahar, Jamal
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/4
Y1 - 2019/4
N2 - Computational feature recognition is an essential component of intelligent systems that sense objects and environments. This paper proposes a novel conceptual model, named Ambiance Signal Processing (AmSiP), to identify objects' features when they are not directly accessible to sensors. Instead of concentrating on each individual, accessible object, AmSiP collaboratively analyzes the surroundings and ambiance of objects/subjects to recognize an object's features. To validate the proposed model, this study ran an experiment with 50 participants whose emotional state variations were estimated by measuring features of the surroundings and the emotions of other people in the same environment. A t-test on the collected data showed that users' emotions changed over the course of the experiment; nevertheless, AmSiP could reliably estimate subjects' emotions from the environmental characteristics and similar patterns. To evaluate the reliability and efficiency of the model, a collaborative affective computing system was implemented using the keyboard keystroke dynamics and mouse interactions of users whose emotions were influenced by different types of music. Compared with conventional techniques based on explicit access, the predictions remained reliable. Although the proposed model sacrifices a small amount of accuracy, it gains the advantage of uncovering knowledge about subjects that lies beyond the sensors' direct access.
AB - Computational feature recognition is an essential component of intelligent systems that sense objects and environments. This paper proposes a novel conceptual model, named Ambiance Signal Processing (AmSiP), to identify objects' features when they are not directly accessible to sensors. Instead of concentrating on each individual, accessible object, AmSiP collaboratively analyzes the surroundings and ambiance of objects/subjects to recognize an object's features. To validate the proposed model, this study ran an experiment with 50 participants whose emotional state variations were estimated by measuring features of the surroundings and the emotions of other people in the same environment. A t-test on the collected data showed that users' emotions changed over the course of the experiment; nevertheless, AmSiP could reliably estimate subjects' emotions from the environmental characteristics and similar patterns. To evaluate the reliability and efficiency of the model, a collaborative affective computing system was implemented using the keyboard keystroke dynamics and mouse interactions of users whose emotions were influenced by different types of music. Compared with conventional techniques based on explicit access, the predictions remained reliable. Although the proposed model sacrifices a small amount of accuracy, it gains the advantage of uncovering knowledge about subjects that lies beyond the sensors' direct access.
KW - Affective Computing
KW - Ambient Intelligence
KW - Feature Recognition
KW - Human Emotion Recognition
KW - Human-Computer Interaction
UR - http://www.scopus.com/inward/record.url?scp=85069901536&partnerID=8YFLogxK
U2 - 10.1109/ICWR.2019.8765251
DO - 10.1109/ICWR.2019.8765251
M3 - Conference contribution
AN - SCOPUS:85069901536
T3 - 2019 5th International Conference on Web Research, ICWR 2019
SP - 35
EP - 40
BT - 2019 5th International Conference on Web Research, ICWR 2019
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 24 April 2019 through 25 April 2019
ER -