Dripping is not only a sound toy but also a performance piece.
Produced by: Xinyu Sun
About this work: sound output is driven by the button switches of each sound box, the details of the output are controlled by the parameters of each stage, and video playback is triggered at the same time. The name Dripping comes from the combination of the video's theme and the dynamic process of ink dripping in the video. The sound design follows the video's theme, and the parameter settings of the different rhythms and sound boxes keep the sound variable. The patch can also be performed live, as shown in the video.
The basic idea
My first inspiration was Axel Bluhme's drum machine XOXX Composer, which transforms the internal functions of sampling, looping, and sequencing into a tangible, kinetic, sculptural form: wheels rotate and magnets trigger sounds, producing mechanical, patterned rhythms. Bluhme's aim was to link rhythm creation and construction in an electronic environment in a simple way that stimulates the audience's curiosity and creativity, ensuring that anyone can quickly create a unique beat.
Kjell Wistoff's work Midi Wonder: Rage Against the Machine also gave me some inspiration. He presents the project as a soundtrack played by the machines themselves: a Clavinet with 12 piano strings controlled by solenoid motors, two drumsticks filled with rice, and automatic players for a military drum and a kick drum, all brought into harmony with the song Superstition. In my work, Dripping, I draw on this pattern of layering multiple sound sources.
I also drew on an audio-visual work about ink dynamics that I filmed and edited last year. The video is composed of moving ink, the interaction between ink and brush, and the vibration of water. I combined this video with the present work, pairing certain sound effects with specific clips so that triggering a sound also triggers the corresponding video. Another source was my interactive installation, Canon, which let viewers create their own canon music: a canon-based arrangement set the timeline and provided 256 buttons for interaction. Dripping likewise builds its central control mode around a timeline on which clicking buttons triggers sound effects. For the sounds themselves, I used a synthesizer so that the style of the audio matches the style of the video, making the connection between sound and image more harmonious. In operation, the sound toy offers a wide range of variables, including control of the rhythm, the choice of sound effects, the on/off state of each trigger button in the timeline, the volume, and so on.
Details on the technical implementation
The main technical implementation of this work includes the following settings:

1. A space-bar key starts the sound toy, and the escape key stops it.
2. The timeline of each "sound box" is built from a "counter" and a "select" object, and the clickable grid is formed by setting and connecting "toggle", "button", and "gate" objects inside the box.
3. A number box or "gswitch" under each sound box selects which sound effect from the "playlist" is played.
4. A "counter" under the sound box is connected to video clips, so that specific sound effects trigger specific clips.
5. Each sound box controls its volume separately.

As a performance, the specific operation is: after start-up, control the rhythm by adjusting the tempo parameter, then use the buttons to control how often each sound effect is triggered in the timeline, and adjust the volume and the other sound-box parameters while playing.
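The patch itself is built from Max/MSP objects, but the timeline logic can be sketched in ordinary code. The following Python model (all names are hypothetical, not taken from the patch) mimics how a tempo-driven counter steps through a timeline, a per-step toggle gates the trigger, a selector (like "gswitch") picks a sound from the box's playlist, and each box carries its own volume:

```python
class SoundBox:
    """Minimal model of one 'sound box': a looping timeline of on/off
    steps (the toggles), a selected sound from a playlist (the gswitch),
    and an individual volume. Names are illustrative, not from the patch."""

    def __init__(self, steps, sounds, volume=1.0):
        self.steps = steps          # toggle states, e.g. [True, False, ...]
        self.sounds = sounds        # the box's playlist of sound names
        self.sound_index = 0        # which playlist entry is selected
        self.volume = volume
        self.position = 0           # the counter: current step in the timeline

    def tick(self):
        """Advance one step (one counter 'bang'). Returns the triggered
        (sound, volume) pair, or None if this step's toggle is off."""
        fired = None
        if self.steps[self.position]:
            fired = (self.sounds[self.sound_index], self.volume)
        # the counter wraps around, like Max's [counter 0 n-1]
        self.position = (self.position + 1) % len(self.steps)
        return fired


# A 4-step box that fires "drip" on steps 0 and 2:
box = SoundBox(steps=[True, False, True, False], sounds=["drip", "splash"])
events = [box.tick() for _ in range(4)]
print(events)  # [('drip', 1.0), None, ('drip', 1.0), None]
```

In the real patch, each bang would play the selected file from a "playlist" at the box's volume and, as described above, could also advance a second counter that cues the matching video clip.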
What the audience would experience if it were publicly presented
This work can be presented in two different ways. First, it can be offered to viewers as an interactive installation. In this mode the audience controls all variable parameters of the sound combination, creating different music and adjusting its details. As a further step, the patch could serve as the framework for a physical installation connected to sensors (buttons, pressure sensors, etc.): operable hardware buttons would make the interaction more engaging and the presentation more effective than operating on a computer.
Second, it can be shown as a performance. Through the combination of video, music, and real-time operation, the audience experiences different combinations of sounds during the performance, as well as the step-by-step process of the rhythm changing.