Communion
‘Communion’ is a live improvised sonic performance piece which speculates on ideas of distributed consciousness and matter. Via the media of electrical currents, vibrations and bits, passed through physical and digital bodies, connections are made, explored and manipulated.
produced by: David Williams
Introduction
A small fern, some electrical circuitry, a metal plate, an incense vaporizer emitting wisps of smoke and a computer with a web-cam. In this performance, these are my collaborators. How do I exploit the features of each of these in order to produce sound? Smoke moves through space, and this movement can be harnessed via computational processes to trigger sounds. A plant, like myself, contains moisture, and along with the metal plate, we are conductive – so together we can create capacitance. With the help of some amplifier circuits, feedback can be created, and oscillations generated. These vibrations can be translated into digital form via small piezo microphones, where the sounds can be looped, re-sampled and manipulated in the digital domain. Thus, the stage is set for playful meandering and improvisation in this curious sonic landscape.
Concept and background research
In his introduction to ‘Alien Agency: Experimental Encounters with Art in the Making’, Chris Salter’s starting point of “material agency - what the world does rather than is” (Salter, 2015, xii) was highly influential on the creation of ‘Communion’. Conceptually, I am fascinated by ideas of agency and panpsychism – the notion that all matter has some degree of consciousness – that “...thought is neither merely epiphenomenal nor something that exists in a separate realm from the material world. Rather, mind is a fundamental property of matter itself. This means that thinking happens everywhere; it extends all the way down (and also all the way up).” (Shaviro, 2015, 20)
In the early stages of developing ‘Communion’, I kept thinking of ways in which I might somehow access this ‘other’ consciousness, or how one might grant or explore the agency of seemingly inert materials. Early experiments were inspired by ‘Panpsychism Proved’, a short story by the speculative science-fiction writer Rudy Rucker, in which futuristic ‘mind-link’ technology allows one to link telepathically with other entities (in the story, it happens to be a lump of granite) (Rucker, n.d.).
It was when I was poking around with electrodes and mini-amplifier circuits – connecting them to the plant, the soil and myself, and seeing what kind of readings I could get – that I started to think about some of the more fundamental properties of matter: the electrons that make up our atoms, and how their arrangement into more complex structures allows for properties such as electrical conductivity and capacitance. There is a shared experience of flowing electrons, and of their transformation (via electrical circuits and feedback) into the audible frequency domain – what we humans perceive as sound. The modern marvel of the ADC (analogue-to-digital converter) allows for another transformation: in the digital realm, the sound, now represented by thousands of sample points, can be further manipulated by me, the human performer.
A highly influential artistic source was the work of experimental composer David Tudor, whom I discovered while reading Nicolas Collins' excellent book ‘Handmade Electronic Music’. The book informed and prompted much of the analogue circuitry exploration I conducted in attempting to communicate with the plant and other sources; it is a wealth of information on techniques and artists working in this field, and it gives you the confidence and DIY attitude to play around with circuits, make your own piezo pickups and so on. Tudor's work stood out to me especially, with his highly creative and (at the time) original approach to electronic circuits and sound-making, evident in his most famous work, ‘Rainforest’ – a piece which used found objects interconnected by transducer speakers to exploit their resonant properties, re-casting them as unique sculptural/sonic entities.
I’d also been thinking about randomization and alternatives to conventional rhythm. Many programming languages have a ‘random’ or ‘noise’ function (I won’t go into these now), but I wanted something with a much more tangible materiality. It was when I was burning some incense sticks that I thought about how I might use the movement of smoke through space in such a way – an elegant (if somewhat crude in execution) demonstration of what Salter might dub ‘performative matter’.
Steven Shaviro, in ‘Consequences of Panpsychism’, points out that “panpsychism is easily subjected to derision and ridicule” (Shaviro, 2015, 20), and indeed, to the rigidly empirical scientist, ideas like ‘panpsychism’ and ‘alien agency’ may seem highly frivolous. But as artists who may feel a duty to highlight the ‘limits of the anthropos’, and in some way thwart the highly damaging anthropocentric contemporary narrative, we need these alternative world-views; as Salter points out, “biotechnical, computational and responsive techniques and methods perform and make a different world.”
In my performance of ‘Communion’, using an amalgamation of analogue electronics and computational techniques, I sought to create an intimate, contemplative space in order to communicate something of the “vibrant materiality of the world beyond the human”.
Technical
Broken down into its constituent software parts, the piece consists of three sound sources, all created in the Pure Data (Pd) graphical programming language. I used Pd because of its modularity and the easy connectivity between different patches (programs) in Pd, as well as with programs written in other languages via its extensive third-party libraries, which allow for connectivity via Open Sound Control (OSC), MIDI and serial communication. It is also geared towards working with audio. Within my main Pd patch, I also utilized a few basic audio effects, such as reverbs, delays and filters.
The first of these I’ll refer to as the ‘smoke-synth’. The smoke-synth, as alluded to above, uses the movement of smoke to create different chord combinations within a pre-defined half-whole scale – this was the only sound source which utilized equal temperament in some way, as I wanted the smoke-synth to be somewhat more musical. I could have used a more common scale, but I chose the half-whole scale for the simple reason that I enjoyed the variety and feel of the chords that its notes could produce. At the heart of the smoke-synth is a program written in the C++ creative coding toolkit openFrameworks (OF). This program has two elements. The first is for motion detection with a USB web-cam, utilizing a technique called frame differencing. Put simply, the program compares two frames (usually the current one and the previous one), and movement is detected where the pixel values of the moving object have changed between the two frames. Visually, it is recognizable as a ghostly white outline of the thing that is moving. The program divides the full-frame video stream into nine regions, a 3 x 3 grid. When motion is detected in one of these regions (the sensitivity of which can be adjusted by a threshold variable), a simple binary trigger (0 for no movement, 1 for movement) is activated. The camera is centered on the smoke source, and as the smoke moves around, different cells are triggered. These triggers are then sent via OSC to the synth program in Pd, where they are used to trigger notes – one distinct note value for each cell.
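The motion-detection half can be sketched roughly as follows. This is a minimal illustration rather than the exact program I performed with: it assumes the ofxOpenCv and ofxOsc addons, and the capture resolution, thresholds, port and the /smoke/cell OSC address are stand-ins for whatever the actual patch expects.

```cpp
// ofApp.h -- minimal sketch of the frame-differencing grid (assumes ofxOpenCv + ofxOsc).
// Built and run by a standard openFrameworks main.cpp.
#pragma once
#include "ofMain.h"
#include "ofxOpenCv.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    void setup() {
        grabber.setup(320, 240);                  // USB web-cam
        colorImg.allocate(320, 240);
        grayImg.allocate(320, 240);
        prevGray.allocate(320, 240);
        diff.allocate(320, 240);
        sender.setup("localhost", 9000);          // Pd listens on this port (illustrative)
    }

    void update() {
        grabber.update();
        if (!grabber.isFrameNew()) return;

        colorImg.setFromPixels(grabber.getPixels());
        grayImg = colorImg;                       // convert to greyscale
        diff.absDiff(prevGray, grayImg);          // frame differencing: current vs previous
        diff.threshold(30);                       // sensitivity threshold
        prevGray = grayImg;

        // Divide the frame into a 3 x 3 grid and send a binary trigger per cell.
        int cw = 320 / 3, ch = 240 / 3;
        for (int row = 0; row < 3; row++) {
            for (int col = 0; col < 3; col++) {
                int changed = diff.countNonZeroInRegion(col * cw, row * ch, cw, ch);
                int trigger = (changed > 50) ? 1 : 0;   // minimum changed pixels per cell
                ofxOscMessage m;
                m.setAddress("/smoke/cell");            // hypothetical OSC address
                m.addIntArg(row * 3 + col);             // cell index 0..8 -> one note each in Pd
                m.addIntArg(trigger);
                sender.sendMessage(m);
            }
        }
    }

private:
    ofVideoGrabber grabber;
    ofxCvColorImage colorImg;
    ofxCvGrayscaleImage grayImg, prevGray, diff;
    ofxOscSender sender;
};
```

On the Pd side, each cell index simply maps to one note of the half-whole scale, gated by the trigger value.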
The other half of the OF program utilizes the raw pixel information of the scene viewed by the camera, i.e. the moving smoke and any other objects in view. Using a built-in function which allows access to a sub-section of a given image (or frame), I built a kind of ‘scan line’ program, which takes pixel values from left to right across the image. This can be seen as a histogram-type visualization on the screen. Again using OSC, I sent this number stream to a corresponding program in Pd, where the pixel data (ranging from 0 to 255) is converted to floating point numbers between 0 and 1. This information was then used to create the distinct timbre of this synth using Pd’s wave-table oscillator, which reads from an array filled with the incoming pixel information; the result is modulated a small amount with some simple FM (frequency modulation). In this way the smoke had a twofold effect on the sound – both as a trigger and on the timbre. The OF program was the only part of the performance which constituted an audio-visual (AV) element, projected as it was on a screen behind me.
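The scan-line half can be sketched in the same way: take one row of grey values across the frame, down-sample it to a fixed table size and stream it to Pd, where it is rescaled to 0–1 and written into the array read by the wave-table oscillator. The row position, table size of 64 and the /smoke/scanline address below are assumptions for illustration, called from the same app’s update() once grayImg has been refreshed.

```cpp
// Sketch of the 'scan line': sample one horizontal row of grey values across the frame
// and send it to Pd over OSC (0-255 values, rescaled to 0-1 and written to the
// wave-table array on the Pd side).
void sendScanLine(ofxCvGrayscaleImage& img, ofxOscSender& sender) {
    ofPixels& pix = img.getPixels();
    int w = pix.getWidth();
    int y = pix.getHeight() / 2;                       // middle row of the image (illustrative)
    const int tableSize = 64;                          // assumed wave-table resolution in Pd

    ofxOscMessage m;
    m.setAddress("/smoke/scanline");                   // hypothetical OSC address
    for (int i = 0; i < tableSize; i++) {
        int x = (i * (w - 1)) / (tableSize - 1);       // left-to-right across the frame
        m.addIntArg((int)pix.getColor(x, y).getBrightness());  // 0-255 grey value
    }
    sender.sendMessage(m);
}
```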
The other two sound programs in Pd were created to harness the audio signals produced by two piezo contact microphones. Both can sample incoming audio and play it back over a given time-line. The first is a basic looper, capable of handling around five seconds of audio and able to over-dub sound, much like a guitar phrase looper – a simple yet effective tool which allows you to create complex layers of sound. The other program, much more like a conventional sampler, did not have an overdub function, but was created to allow me to manipulate the audio sample itself, with controls for changing the start and end point of the sample (to play it backwards, for example) and the time taken to play back the sample, which has a direct effect on its pitch. By shortening the sample slice itself I could generate deep, resonant bass sounds as the pitch and granularity of the sample were altered. The looper played back audio from its punch-in/out points, while the sampler was triggered by a separate 8-step sequencer (also built in Pd).
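The relationship between slice length, playback time and pitch can be illustrated with a small sketch (written in C++ purely for illustration, not a transcription of the Pd patch): reading a slice of the recorded buffer over a fixed playback time gives a read rate of sliceLength / playbackTime, so changing either the slice boundaries or the playback time shifts the pitch. The structure and names below are assumptions about the patch's behaviour.

```cpp
// Minimal sketch of the sampler's pitch behaviour: a slice [start, end) of a recorded
// buffer is read over a chosen playback time, so the read rate -- and hence the pitch --
// is sliceLength / playbackTime. Forward playback only, with linear interpolation.
#include <vector>
#include <cstddef>

struct SliceSampler {
    std::vector<float> buffer;     // recorded audio from the piezo input
    double sampleRate = 44100.0;

    std::vector<float> play(std::size_t start, std::size_t end, double playbackSeconds) const {
        std::size_t outLen = static_cast<std::size_t>(playbackSeconds * sampleRate);
        double sliceLen = static_cast<double>(end - start);
        double rate = sliceLen / static_cast<double>(outLen);  // source samples per output sample
        std::vector<float> out(outLen);
        for (std::size_t i = 0; i < outLen; ++i) {
            double pos = start + i * rate;                     // non-integer read position
            std::size_t idx = static_cast<std::size_t>(pos);
            double frac = pos - idx;
            float a = buffer[idx];
            float b = buffer[(idx + 1 < end) ? idx + 1 : idx];
            out[i] = static_cast<float>(a + frac * (b - a));   // linear interpolation
        }
        return out;
    }
};
```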
In order to control the looper and sampler, I built a small foot pedal with some push buttons and an Arduino Micro, which allowed me hands-free control of the audio sampling. For general control of specific parameters (volume, low-pass filters etc.) in the main Pd performance patch, I used a Korg nanoKONTROL, whose knobs and sliders I had mapped out for use in Pd. For dynamic stereo panning, I created a patch which used Pd’s sine function to pan from left to right.
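The foot-pedal firmware can be sketched roughly as below. This is a hedged illustration: it assumes two buttons read as digital inputs and sent as single identifying bytes over serial (which Pd can read with a serial object such as [comport]); the actual build may instead have sent MIDI, as in the USBMIDI example referenced at the end of this document. Pin numbers, baud rate and message bytes are illustrative.

```cpp
// Foot-pedal sketch (Arduino Micro): two push buttons, debounced,
// each sending a single identifying byte over serial when pressed.
const int BUTTON_PINS[2] = {2, 3};        // e.g. looper punch-in/out, sampler record
const byte MESSAGE_BYTES[2] = {'L', 'S'};
bool lastState[2] = {HIGH, HIGH};
unsigned long lastChange[2] = {0, 0};
const unsigned long DEBOUNCE_MS = 20;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 2; i++) {
    pinMode(BUTTON_PINS[i], INPUT_PULLUP); // buttons wired to ground
  }
}

void loop() {
  for (int i = 0; i < 2; i++) {
    bool state = digitalRead(BUTTON_PINS[i]);
    if (state != lastState[i] && millis() - lastChange[i] > DEBOUNCE_MS) {
      lastChange[i] = millis();
      lastState[i] = state;
      if (state == LOW) {                  // pull-up wiring: LOW means pressed
        Serial.write(MESSAGE_BYTES[i]);    // Pd maps each byte to a looper/sampler action
      }
    }
  }
}
```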
The analogue circuitry consisted of two small amplifiers powered via USB. Connected to one of these amplifiers was an electrode placed in the soil of the plant. By connecting to the amplifier with just the ‘live’ terminal, I could directly play with the capacitive effect of the plant, myself, and the metal plate used in the performance. The amplifier output was connected to a small transducer speaker – a special kind of speaker which resonates the material it is in contact with (rather than moving a speaker cone like a conventional speaker). The transducer sat on a metal sheet which acted as a resonator, and the ground input of the speaker was wired directly to the metal plate. Three piezo microphones picked up any vibrations – with two going to the computer sound card, and the other going to another amplifier and transducer speaker, also attached to the plate.
Even with no intervention from myself, this hybrid circuit was capable of producing a small amount of sound. But when I grasped the leaves of the plant, the capacitance was much greater and a distinct squealing feedback was generated; this was amplified even further if I touched both the metal plate and the leaves of the plant. The piezo microphones did a good job of picking up the sounds produced by the transducer speakers, as well as my tapping or scratching of the metal plate. By taking the electrode from the soil and also connecting it to the metal plate, another kind of feedback could be produced. What was interesting was how the state of the plant affected the sensitivity of the capacitive feedback – for example, whether it had been left out in the sun and well watered, or left in the dark all day.
By touching the plant and metal plate in different ways and then layering and manipulating these samples, I was able to create a rich and varied sonic palette.
Future development
I definitely see this piece as just the beginning of my sonic experiments with plants and other objects. There are numerous ways the piece could be expanded – for example, if it were realized as an installation, with multiple plants/objects/sound sources and resonators, bigger transducer speakers, spatialized sound and so on.
If I was going to develop the visual side of the piece, I would be very careful in how, what and why a particular visual element was used. While flashy AV can be visually stunning/ entertaining, for me it can be difficult to link the audio and visuals conceptually. This is definitely something I continue to struggle with in my practice.
How might sound or other information be extracted from different entities? Through analogue and digital means, playing and seeking those things at the edge of our human perception – the elusive, the liminal, ‘performative matter’.
Certainly the piece exploited the notion of ‘network’ in the sense that a circuit is a network of connected conductive bodies, but it would be interesting to experiment with different kinds of distributed networks on varying scales, interrogating through practice what the inter-connectedness of things can reveal – without losing sight of the materiality of process. If anything, through my research for and making of this piece I have gained a definite sense of direction as far as a ‘technoscientifically driven’ practice is concerned, and I am excited about where it might lead me – nudging and prodding the boundaries of knowledge and experience.
Self evaluation
As an artist/performer, you don’t want to spoon-feed your audience, but certainly I could have taken more care in considering the connection between what I decided to project on the screen and what was then heard. It wasn’t exactly clear how what could be seen on the screen related to the sound. The piece was self-consciously not an AV performance in the traditional sense, but as soon as one adds visual elements to any piece, one has to consider how and whether they complement it. Because what was projected was really only one of the sound sources (the ‘smoke-synth’), it was perhaps confusing for the viewer as to how it related to the sound generation. I tried to make this apparent in later performances by bringing in sonic elements one at a time and giving the audience time to appreciate and understand each one – i.e. to gauge the connection between the smoke moving and the sound being generated, or between my touching the plant and the resulting sound.
The wave-table element, though, was completely lost. Although it was technically a visualization of a resulting sound wave, visually it looked far too much like a traditional graphic equalizer or something of that nature, and so it left the audience guessing as to what it was actually for. In future I could definitely find a way to be more explicit about this particular function, or even make a whole piece focusing on just this element. I made the decision to display what was happening with the frame-differencing/wave-table program because I thought it was important to reveal what this process was and, in some ways, its computational significance – in hindsight, though, this could have been more considered.
By choosing subdued lighting and using lavender essential oil in the vaporizer I used to create smoke, I was pleased with the kind of intimate, meditative atmosphere I was able to create – I felt that it helped people to really connect with the piece and I deliberately kept the visual elements quite minimal so as not to bombard the audience with unnecessary stuff. As the piece was improvisational, each time I performed it was different – I would create and find new sounds, as well as new ways of ‘arranging’ the piece – for me it became a very exciting process and one I would like to develop and refine in the future.
References
General programming language reference websites:
https://openframeworks.cc/
https://puredata.info/
https://www.arduino.cc/
Helpful Pure Data tutorials:
Making a looper with Pure Data
http://msp.ucsd.edu/techniques/latest/book-html/node69.html
Help with frame differencing:
https://github.com/rychrd/histGram/blob/master/src/ofApp.cpp
https://github.com/micknoise/
Arduino USB/MIDI help:
https://github.com/BlokasLabs/USBMIDI/blob/master/examples/midictrl/midictrl.ino
David Tudor web archive:
https://davidtudor.org/
Playing with electronics:
Collins, N., 2009. Handmade Electronic Music: The Art of Hardware Hacking (2nd Edition). Routledge.
Academic references:
Rucker, R., n.d. Panpsychism Proved [WWW Document]. http://www.rudyrucker.com/transrealbooks/completestories/#_Toc47.
Salter, C., 2015. Alien Agency: Experimental Encounters with Art in the Making. MIT Press.
Shaviro, S., 2015. The Consequences of Panpsychism, in: Grusin, R. (Ed.), The Nonhuman Turn. University of Minnesota Press.