Stella Paschalidou is a lecturer at the Department of Music Technology and Acoustics of the Hellenic Mediterranean University. She holds a BSc in Physics (Aristotle University, Greece), an MSc in Music Technology (University of York, UK) and a PhD in computational ethnomusicology with an emphasis on the analysis of body movement in music performance (Durham University, UK). Her research interests include music technology, embodied music cognition, motion capture technologies and analysis methods in music, audio interaction, and computational ethnomusicology. She has carried out fieldwork in India, collecting audio and movement data in Dhrupad vocal improvisation, a genre of north Indian classical music, in an attempt to uncover relationships between vocalists’ hand gestures and their voice. She is particularly interested in the concept of effort in music performance.
Human body movements in music
Musical performance has traditionally been grounded in movement: it is human bodily movement that produces sound, and the body serves as a vehicle for the expression of the performer’s artistic intent.
This applies not only to acoustic musical instruments, but also to digital ones, i.e. instruments that use a computer to synthesize sound in real time through algorithmic processes. In acoustic musical instruments, the relationship between cause (movement) and effect (produced sound) is determined by the laws of physics, that is, by the way the instrument is set into oscillation when excited by the musician’s movement.
But what is this relationship in the case of a digital interactive system that produces sound (and not only sound) through a computer? And how can we develop interactive systems that are more expressive and feel more natural in gestural human-computer interaction?
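In a digital instrument, this cause-effect relationship is not fixed by physics but is a design decision: the builder chooses a mapping from sensed movement to synthesis parameters. The following minimal sketch (all names and parameter choices are hypothetical, for illustration only) maps a normalized hand height to pitch and movement speed to loudness, then renders a short sine tone:

```python
import math

def map_gesture_to_sound(hand_height, hand_speed):
    """Hypothetical mapping: hand height (0..1) -> pitch, speed (0..1) -> amplitude.

    In an acoustic instrument this relationship is dictated by physics;
    in a digital one it is an arbitrary choice made by the instrument designer.
    """
    # Map hand height to a frequency between 220 Hz and 880 Hz (two octaves),
    # using an exponential curve so equal movements correspond to equal
    # musical intervals.
    freq = 220.0 * (2.0 ** (2.0 * hand_height))
    # Map movement speed linearly to amplitude, clamped to [0, 1].
    amp = max(0.0, min(1.0, hand_speed))
    return freq, amp

def synthesize(freq, amp, duration=0.1, sample_rate=44100):
    """Render a short sine tone for the current gesture frame."""
    n = int(duration * sample_rate)
    return [amp * math.sin(2 * math.pi * freq * i / sample_rate)
            for i in range(n)]

# A hand held at mid height, moving at half speed:
freq, amp = map_gesture_to_sound(0.5, 0.5)
samples = synthesize(freq, amp)
```

The point of the sketch is that nothing forces this particular mapping: height could just as well control timbre, or speed could control pitch. This arbitrariness is exactly what makes the design of expressive, natural-feeling gestural interaction an open research question.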