Sonic Gesture
This is an interactive programme that translates hand gestures into sounds, developed as an audio-visual component for live performances.
Produced by: Adam He
The primary motivation for this project came from my interest in several interrelated topics: gestural expression, data-driven sonification, instrument design, and audio-visual performance. The common concern underlying these topics is cross-media translation and interaction, through which our perception and expression can be extended. A software programme was therefore developed, acting as the technical infrastructure that mediates such translation and interaction between gestures, images, and sounds.
A similar project I found during research is the Light Wall System developed by Grame. Although no technical details have been released to the public so far, the demonstration videos suggest that the system roughly consists of three parts: lighting, motion capture, and sonification. In action, specially designed lighting first highlights the performers’ movements inside the “light wall”; a camera then captures the highlighted movements and sends the data to a software programme, which performs real-time sonification based on the data it receives. While the Light Wall System appears sophisticated, with diverse relationships between the performers’ movements and the resulting sounds, the videos suggest that this diversity is partly based on pre-arranged choreography. For example, similar movements can trigger entirely different sounds simply because they occur in different parts of the space or different phases of the show, and the sonic diversity relies heavily on audio samples. Such relationships between gestures and sounds are technically arbitrary, meaning the performer controls the sounds in a somewhat linguistic way.
In contrast to the Light Wall System, in this project I aimed to develop a more intuitive, straightforward, low-level, and physical relationship between movements and sounds, something more universal and instrumental, like what we experience when sweeping our hands through the air to make it sound. Additionally, instead of using samples, the sounds should be generated entirely from scratch, reflecting the purely generative mechanism of the acoustics we experience in the real world. I have also been interested in the communicative function of hand gestures. Sign language aside, we often perceive hand gestures as the second most important form of bodily expression, one that communicates across cultures. In conjunction with my interest in audio-visual performance, I wanted to explore how to magnify such expression into impressive images and sounds.
The programme takes the data transmitted from the camera, calculates frame differences, and uses them to characterise the sounds in real time. While the idea of building an interaction between gestures and sounds is simple, it took considerable effort to fine-tune the parameters in order to achieve a responsive and intuitive performing experience. Furthermore, although the programme was developed to output audio-visual material, I found it interesting to use it solely for sound by simply disabling the images. In that case, without visual distraction, the performer can feel more keenly the subtlety and responsiveness of controlling the sounds, which I think is an important quality of a good instrument.
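To make the pipeline concrete, below is a minimal sketch of how a frame-difference-to-sound mapping could be built with ofxOpenCv and ofxMaxim in openFrameworks. It is not the project's actual code: the variable names, the threshold value, the smoothing factor, and the motion-to-pitch/filter mapping are illustrative assumptions, and the camera is opened through the core ofVideoGrabber rather than ofxPS3EyeGrabber for brevity.

```cpp
#pragma once
#include "ofMain.h"
#include "ofxOpenCv.h"
#include "ofxMaxim.h"

// A minimal sketch, assuming a standard openFrameworks ofApp.
// Parameter values and mappings below are illustrative, not the project's own.
class ofApp : public ofBaseApp {
public:
    void setup() {
        grabber.setup(320, 240);            // the project itself uses a PS3 Eye via ofxPS3EyeGrabber
        colorImg.allocate(320, 240);
        grayNow.allocate(320, 240);
        grayPrev.allocate(320, 240);
        grayDiff.allocate(320, 240);

        maxiSettings::setup(44100, 2, 512); // keep ofxMaxim and the sound stream in sync
        ofSoundStreamSettings settings;
        settings.setOutListener(this);
        settings.sampleRate = 44100;
        settings.numOutputChannels = 2;
        settings.bufferSize = 512;
        soundStream.setup(settings);
    }

    void update() {
        grabber.update();
        if (grabber.isFrameNew()) {
            colorImg.setFromPixels(grabber.getPixels());
            grayNow = colorImg;                      // convert to greyscale
            grayDiff.absDiff(grayPrev, grayNow);     // per-pixel frame difference
            grayDiff.threshold(30);                  // keep only significant change (assumed value)
            grayPrev = grayNow;                      // remember this frame for the next comparison

            // Fraction of pixels that changed, smoothed so the sound does not jitter.
            float changed = grayDiff.countNonZeroInRegion(0, 0, 320, 240) / float(320 * 240);
            motionAmount = ofLerp(motionAmount, changed, 0.2f);
        }
    }

    void audioOut(ofSoundBuffer& buffer) {
        for (size_t i = 0; i < buffer.getNumFrames(); i++) {
            // Map the amount of motion to pitch and brightness (illustrative mapping).
            float freq   = ofMap(motionAmount, 0.0f, 1.0f, 80.0f, 800.0f, true);
            float cutoff = ofMap(motionAmount, 0.0f, 1.0f, 200.0f, 4000.0f, true);
            float sample = filter.lores(osc.saw(freq), cutoff, 2.0f);
            sample *= motionAmount;                  // louder when there is more movement
            buffer[i * buffer.getNumChannels()]     = sample;
            buffer[i * buffer.getNumChannels() + 1] = sample;
        }
    }

    ofVideoGrabber grabber;
    ofxCvColorImage colorImg;
    ofxCvGrayscaleImage grayNow, grayPrev, grayDiff;
    ofSoundStream soundStream;

    maxiOsc osc;
    maxiFilter filter;
    float motionAmount = 0.0f;
};
```

The point of the sketch is the design choice described above: the sound parameters are driven continuously by the amount of change between frames, so the sound is generated rather than triggered from pre-recorded samples.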
References
Grame. Available at: http://www.grame.fr/ (Accessed: 10 May 2019).
‘Light Music’, Thierry De Mey (2004). Vimeo. Available at: https://vimeo.com/24453131 (Accessed: 10 May 2019).
Addons Used
ofxPS3EyeGrabber, by Baker, C. https://github.com/bakercp/ofxPS3EyeGrabber
ofxMaxim, by Grierson, M. https://github.com/micknoise/Maximilian
ofxOpenCv, by openFrameworks community. https://github.com/openframeworks/openFrameworks/tree/master/addons/ofxOpenCv
ofxGui, by openFrameworks community. https://github.com/openframeworks/openFrameworks/tree/master/addons/ofxGui