TY - JOUR
T1 - A Natural User Interface for Gestural Expression and Emotional Elicitation to Access the Musical Intangible Cultural Heritage
AU - Volioti, Christina
AU - Manitsaris, Sotiris
AU - Hemery, Edgar
AU - Hadjidimitriou, Stelios
AU - Charisis, Vasileios
AU - Hadjileontiadis, Leontios
AU - Katsouli, Eleni
AU - Moutarde, Fabien
AU - Manitsaris, Athanasios
N1 - Funding Information:
The research leading to these results has received funding from the European Union, Seventh Framework Programme (FP7-ICT-2011-9) under grant agreement no. 600676. We would also like to thank Frédéric Bevilacqua for giving us his permission to use the “Gesture Follower” in testing our hypothesis.
Publisher Copyright:
© 2018 ACM.
PY - 2018/4/12
Y1 - 2018/4/12
AB - This article describes a prototype natural user interface, named the Intangible Musical Instrument, which aims to make the knowledge of performers that constitutes musical Intangible Cultural Heritage accessible to the public at large through off-the-shelf motion capture. The prototype captures, models, and recognizes musical gestures of the upper body, including the fingers, and sonifies them; the emotional state of the performer modulates the sound parameters at the synthesis level. The Intangible Musical Instrument supports both learning and performing/composing by providing the user with intuitive gesture control as well as a unique user experience. In addition, a first evaluation of the Intangible Musical Instrument, assessing all functionalities of the system, is presented; overall, the evaluation results were very promising.
KW - emotional status
KW - evaluation
KW - gesture recognition
KW - sonification
UR - http://www.scopus.com/inward/record.url?scp=85087409359&partnerID=8YFLogxK
U2 - 10.1145/3127324
DO - 10.1145/3127324
M3 - Article
AN - SCOPUS:85087409359
SN - 1556-4673
VL - 11
JO - Journal on Computing and Cultural Heritage
JF - Journal on Computing and Cultural Heritage
IS - 2
M1 - 10
ER -