PHYSARUM MACHINE
An exploration into the expressive morphogenesis of the slime-mould Physarum polycephalum via computational methods.
produced by: David Williams
Introduction
In PHYSARUM MACHINE, I wanted to explore the notion of morphogenesis (1) as 'expressive becoming' (2). My chosen collaborator in this project was the slime-mould Physarum polycephalum (3), aided by a Raspberry Pi with an attached IR camera. Roughly sixteen hours of growth were captured using a time-lapse program running on the Pi and condensed into just a few minutes of footage. A program then scans across this footage, layering different stages of growth side by side in a repeated loop.
Concept and background research
The inspiration for this project came from a desire to continue working with non-human entities, as well as a general fascination with this particular organism. As such, my research led me to the collection of essays 'The Nonhuman Turn', edited by Richard Grusin (4). In particular I was drawn to Steven Shaviro's 'Consequences of Panpsychism', which evaluates the ontological implications of panpsychism - the doctrine that "mind is a fundamental property of matter itself" (5), one which does away with the Cartesian dualism of mind as separate from body. Panpsychism is something of an underground notion, but has "persisted as a kind of countertendency to the anthropocentrism, and the hierarchical ontologies, of dominant philosophical dogmas" (5).
In many ways, then, the slime-mould was the ideal collaborator for such a project, practically as well as conceptually. Not only is Physarum polycephalum very easy to grow and keep; in its single-celled plasmodial (3) state, its body literally is its mind - a de-centralised body-mind network which organises the passing of chemical signals along its nodes, sensing its environment, sending out new nodes in the direction of food sources and distributing nutrients throughout its body.
I knew I wanted to create something which highlighted the slime-mould's unique growth pattern over time and in some way allowed the viewer to gain access (however limited) to this particular organism's mode of existence. I am particularly indebted to the work and research of the artist Heather Barnett (6), and the website she started, 'The Slime Mould Collective' (7), was also very useful as a resource for all things slime-mould related.
On a broader level, I was also influenced quite deeply by Donna Haraway's 'Staying with the Trouble' (8), which calls for a re-evaluation of dominant and often destructive anthropocentric tendencies in favour of alternative multi-species stories and practices that "might help us to collectively build more livable worlds" (9).
Technical
The slime-mould itself was grown inside an opaque perspex box, the bottom of which was covered with an agar-jelly base - the slime-mould's ideal growing conditions being dark and damp (following advice from the Slime Mould Collective). I laser-cut a hole in the lid of the box for the mini IR camera to sit in. Once the dried sample was placed on the agar, it quite quickly sprang to life and started to branch out towards the liquid oat food I had placed in blobs around it. The Raspberry Pi, which was running Raspbian, has built-in software for attached cameras which can be accessed via the command line; in this case I used 'raspistill', which has time-lapse functionality, following the official documentation (10) to capture still images in six-hour chunks. I then copied this mass of photos over to my main computer and assembled them into a short video using another command-line video tool for Linux, 'mencoder'.
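As a rough sketch, the commands involved look something like the following - the capture interval, file naming and frame rate here are illustrative assumptions rather than the exact values I used:

    # capture one six-hour chunk of stills (21,600,000 ms), one frame every 30 seconds
    raspistill -t 21600000 -tl 30000 -o growth_%04d.jpg

    # back on the main computer, stitch the stills into a video with mencoder
    mencoder "mf://*.jpg" -mf type=jpg:fps=24 -ovc lavc -lavcopts vcodec=mpeg4 -nosound -o growth.avi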
The program which performed the horizontal and vertical 'slit-scans' was created using openFrameworks, adapted from the slit scanner we built in week six of Theo Papatheodorou's Workshops in Creative Coding class (15). The program can be found on my GitHub page.
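The core of the slit-scan is roughly as follows - a simplified single-file openFrameworks sketch rather than the exact program, with the video filename, buffer length and horizontal-only scan all being assumptions for illustration:

    // a minimal horizontal slit-scan over the pre-recorded time-lapse video
    #include "ofMain.h"
    #include <deque>

    class ofApp : public ofBaseApp {
    public:
        void setup() {
            ofSetFrameRate(30);
            player.load("growth.mp4");                // the condensed time-lapse footage
            player.setLoopState(OF_LOOP_NORMAL);
            player.play();
        }

        void update() {
            player.update();
            if (player.isFrameNew()) {
                frames.push_back(player.getPixels()); // keep a rolling history of frames
                if (frames.size() > maxFrames) frames.pop_front();
            }
        }

        void draw() {
            if (frames.size() < 2) return;
            int w = player.getWidth();
            int h = player.getHeight();
            ofPixels out;
            out.allocate(w, h, OF_PIXELS_RGB);
            // each column of the output samples a progressively older frame,
            // so several stages of growth sit side by side on screen
            for (int x = 0; x < w; x++) {
                int idx = ofMap(x, 0, w - 1, 0, frames.size() - 1);
                for (int y = 0; y < h; y++) {
                    out.setColor(x, y, frames[idx].getColor(x, y));
                }
            }
            composite.setFromPixels(out);
            composite.draw(0, 0);
        }

        ofVideoPlayer player;
        std::deque<ofPixels> frames;
        ofImage composite;
        const size_t maxFrames = 120;                 // roughly four seconds of history at 30 fps
    };

    int main() {
        ofSetupOpenGL(1280, 720, OF_WINDOW);
        return ofRunApp(new ofApp());
    }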
Originally I'd planned to run the program on the Pi itself, but even this relatively simple program proved a bit too demanding graphics-wise and I just couldn't get it to run as smoothly as I wanted. So to keep things hassle-free for the final show, I just had a recording of the program running on a loop on a widescreen monitor, which was displayed alongside an enlarged still image of the slime-mould at the peak of its growth.
Future development
I very much see this project as tentative early steps in working with other organisms such as the slime-mould, and in exploring how one might approach the consciousness and form of other beings via computational methods. One obvious example would be to create a program which seeks to mimic the behaviour of the slime-mould in some way (not surprisingly, this has been attempted already! (11)). This could be graphical or not, but the very act of trying to understand the behaviour of such an organism and transcribe it into code would undoubtedly give a deeper insight into the body/mind of this other organism.
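To give a flavour of what such a program might look like, here is a minimal sketch of one common approach - a multi-agent 'trail' model of the kind often used to approximate Physarum networks, in which simple agents deposit and follow a shared chemical-like trail. The agent count, sensor angles and decay rate are illustrative assumptions, and this is not a model of my own specimen:

    // a single-file openFrameworks sketch of a Physarum-like multi-agent trail model
    #include "ofMain.h"

    struct Agent {
        glm::vec2 pos;
        float heading;                                // direction of travel in radians
    };

    class ofApp : public ofBaseApp {
    public:
        void setup() {
            trail.allocate(W, H, OF_IMAGE_GRAYSCALE);
            trail.setColor(ofColor::black);
            for (int i = 0; i < 4000; i++) {
                agents.push_back({ {ofRandom(W), ofRandom(H)}, ofRandom(TWO_PI) });
            }
        }

        // sample the trail map a short distance ahead of the agent, offset by an angle
        float sense(const Agent& a, float offset) {
            glm::vec2 p = a.pos + 9.0f * glm::vec2(cos(a.heading + offset), sin(a.heading + offset));
            return trail.getColor(ofClamp((int)p.x, 0, W - 1), ofClamp((int)p.y, 0, H - 1)).getBrightness();
        }

        void update() {
            for (auto& a : agents) {
                // sense ahead, left and right, then steer towards the strongest deposit
                float f = sense(a, 0.0f), l = sense(a, 0.4f), r = sense(a, -0.4f);
                if (l > f && l > r)      a.heading += 0.3f;
                else if (r > f && r > l) a.heading -= 0.3f;
                a.pos += glm::vec2(cos(a.heading), sin(a.heading));
                a.pos.x = fmodf(a.pos.x + W, W);      // wrap around the edges
                a.pos.y = fmodf(a.pos.y + H, H);
                trail.setColor(ofClamp((int)a.pos.x, 0, W - 1),
                               ofClamp((int)a.pos.y, 0, H - 1), ofColor::white);  // deposit
            }
            // crude decay of the trail map (a fuller model would also blur/diffuse it)
            ofPixels& px = trail.getPixels();
            for (size_t i = 0; i < px.size(); i++) px[i] *= 0.97;
            trail.update();
        }

        void draw() { trail.draw(0, 0); }

        const int W = 400, H = 400;
        ofImage trail;
        std::vector<Agent> agents;
    };

    int main() {
        ofSetupOpenGL(400, 400, OF_WINDOW);
        return ofRunApp(new ofApp());
    }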
One could even abstract certain observable, connected behaviours of this organism - 'space', 'form', 'food', 'location' - into other digital, programmable spaces beyond the single screen: distributed computer networks, mobile devices, the internet, Wi-Fi, radio and so on, creating theoretical/speculative analogues which could display emergent behaviour in their own right.
So on a practical level, I can see the future development of the ideas in this project taking two possible paths. One would be to delve deeper into developing more complex computer programs which explore organic morphological processes - programming cellular automata, space-filling algorithms and object-oriented methods, for example, would probably be a good place to start.
The second, as above, would be to explore and exploit pre-existing communications and micro-computing technologies and perhaps create my own node-based network of inter-connected digital organisms which exist independently from the functional needs of human-centered networks.
Self evaluation
The process of creating this piece was relatively simple - I wanted to create something with a compelling look and feel, but really this was a chance for me to experiment with some different technologies and have a go at growing and recording the slime-mould. The project was consciously un-ambitious - I treated it very much as a prototype for future development. As far as the technicalities and programming were concerned, it needed to be simple and it needed to work. This isn't to say that it was all plain sailing: the slime-mould performed admirably, but on my side of things - working with the computer - there were a few bumps along the way which ultimately shaped the final form for the exhibition.
I'd got a version of the program working on my main computer and had even taken it further with the addition of some more complex Voronoi-type shape generation over a live video stream. There was also going to be a mini thermal printer in the mix, which would print out a screenshot at a given interval and thus provide an analogue record of the slime-mould's growth. I'd tested this aspect on my main computer too - a simple timer function would make the call to print every x minutes, and setting up the printer with the openFrameworks add-on was pretty straightforward.
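For the record, the timer side of it amounted to roughly the following fragment inside the app's update() - the ten-minute interval and file naming are placeholders, and the actual ofxThermalPrinter call is left as a comment since its exact API is best taken from the add-on's own examples:

    // ofApp.cpp (fragment) - a simple interval timer for the printed snapshots
    #include "ofApp.h"

    static uint64_t lastPrint = 0;                     // time of the last print, in ms
    static const uint64_t intervalMs = 10 * 60 * 1000; // every ten minutes (placeholder value)

    void ofApp::update() {
        if (ofGetElapsedTimeMillis() - lastPrint >= intervalMs) {
            lastPrint = ofGetElapsedTimeMillis();
            // keep a digital copy of the frame alongside the paper one
            ofSaveScreen("snapshot-" + ofGetTimestampString() + ".png");
            // the same frame would then be handed to the mini thermal printer via
            // the ofxThermalPrinter add-on (13) - that call is omitted in this sketch
        }
    }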
All of this hinged, however, on being able to compile and run the program on the Raspberry Pi. I started off by following the instructions for compiling openFrameworks on Raspbian Stretch (12) and, hey presto, got one of the 2D graphics examples running on the Pi with no apparent problems. As soon as I started to introduce additional add-ons and live video, though, things took a downward turn (13), (14). The add-on needed to access video on the Pi (this was when I was going to have the whole thing running live) seemed to work OK, but as soon as I started to create the bigger video buffers needed for the slit-scan animation, the frame rate dropped significantly and the program would regularly freeze. When I introduced the thermal printer as well, the whole program would simply crash - the Pi just couldn't access memory fast enough to make the call to print while also maintaining the video buffer. In hindsight, I should have tested all of this much earlier on and adapted my course of action accordingly, i.e. worked towards a setup which ran the animation from a more powerful computer. But a lot of this came quite late in the day, and I simply didn't have time to explore ways of getting this kind of program to run well on the Pi - which is why I ended up going for a much more basic setup than originally intended.
It was a last-minute decision to use the widescreen monitor; originally I was going to have everything running on a 7" monitor connected to the Pi. This actually worked out well - the animation was much stronger as an experience and certainly benefited from being on the larger screen. I was disappointed not to be able to include some of my original ideas, though, such as the thermal printer and the live video stream. Still, it was a good learning experience - notably in knowing what works well with the Raspberry Pi (and what doesn't), prototyping and testing much earlier in the game, choosing the right tools for the job, and so on.
Conceptually speaking, I feel that the final piece made sense according to my aims - the final program/animation was very simple but worked, I felt, quite elegantly. As I mentioned above though, I treated this in some ways like a first prototype - there are numerous ways I could expand upon the project and I very much look forward to doing so in the future. My personal blog details my journey in creating this project.
References
1. Morphogenesis. In: Wikipedia [Internet]. 2018 [cited 2018 Sep 17]. Available from: https://en.wikipedia.org/w/index.php?title=Morphogenesis&oldid=857022259
2. Brian Massumi. The Supernormal Animal. In: Richard Grusin, editor. The Nonhuman Turn. University of Minnesota Press; 2015.
3. Physarum polycephalum. In: Wikipedia [Internet]. 2018 [cited 2018 Sep 17]. Available from: https://en.wikipedia.org/w/index.php?title=Physarum_polycephalum&oldid=859855319
4. Richard Grusin, editor. The Nonhuman Turn. University of Minnesota Press; 2015.
5. Steven Shaviro. Consequences of Panpsychism. In: Richard Grusin, editor. The Nonhuman Turn. University of Minnesota Press; 2015.
6. HEATHER BARNETT [Internet]. HEATHER BARNETT. [cited 2018 Sep 17]. Available from: http://heatherbarnett.co.uk/
7. The Slime Mould Collective [Internet]. [cited 2018 Sep 17]. Available from: https://slimoco.ning.com/
8. Donna Haraway. Staying with the Trouble: Making Kin in the Chthulucene. Duke University Press Books; 2016.
9. Martha Kenney. Donna Haraway (2016) Staying with the Trouble: Making Kin in the Chthulucene. Durham: Duke University Press. 312 pages. ISBN: 978-0822362241. Sci Technol Stud [Internet]. 2017;30(2). Available from: https://sciencetechnologystudies.journal.fi/issue/view/4251
10. Time-lapse - Raspberry Pi Documentation [Internet]. [cited 2018 Sep 17]. Available from: https://www.raspberrypi.org/documentation/usage/camera/raspicam/timelapse.md
11. Safonov A. Physarum computing and topology optimisation (12 nodes) [Internet]. The Slime Mould Collective; 2016 [cited 2018 Sep 17]. Available from: https://slimoco.ning.com/video/physarum-computing-and-topology-optimisation-12-nodes
12. Compiling OF in raspbian Stretch [Internet]. openFrameworks. 2017 [cited 2018 Sep 17]. Available from: https://forum.openframeworks.cc/t/compiling-of-in-raspbian-stretch/27562/32
13. Gonzalez Vivo P. ofxThermalPrinter: openFrameworks library to control mini thermal printers (https://www.adafruit.com/products/597) [Internet]. 2018 [cited 2018 Sep 17]. Available from: https://github.com/patriciogonzalezvivo/ofxThermalPrinter
14. Van Cleave J. ofxRPiCameraVideoGrabber: an addon to control the native Raspberry Pi camera in openFrameworks [Internet]. 2018 [cited 2018 Sep 17]. Available from: https://github.com/jvcleave/ofxRPiCameraVideoGrabber
15. IS71014B WCC1 (2017-18): slit scanner [Internet]. [cited 2018 Sep 17]. Available from: https://learn.gold.ac.uk/mod/page/view.php?id=475703