Qi Dao Prayer
by Helena Wee
Concept
On a circular screen is a structure made of two Calabi-Yau manifolds [1], one inside the other, one black and one white, analogous to yin and yang, or Qi and Dao. Calabi-Yau manifolds are structures postulated in String Theory as a way of folding up extra dimensions of space-time, and as such can be thought of as fundamental structures of the Universe. Mathematically they can have any number of dimensions, usually represented by n.
In Chinese philosophy Dao is the formless or unconditioned that grounds the perfection of all things, including technical objects. Qi is the support of Dao, allowing it to be manifested in sensible forms. The combination of Qi and Dao is what produces perfection (in Daoism) or sacred harmony (in Confucianism).
When the user brings their hands together the structure gets smaller but increases in dimensions. When the hands are in a prayer position a sound from nature is heard. Through prayer and contemplation one can bring Qi and Dao into alignment. This allows for harmonious Qi Dao relations and the formation of a moral cosmotechnics [2], combining nature and the heavens through the Qi of humans.
Technical Process
I used ofxRapidLib, a machine learning library, to create the classification model for recognising a prayer gesture, and ofxMaxim (Maximilian) to play the sound files. To connect to the Leap Motion sensor I used the ofxLeapMotion2 library, although I mostly ended up calling the Leap Motion SDK functions directly rather than the ofxLeapMotion2 functions, as this aided integration with the sensor. All of these addons need to be installed for the project to run. I created the program on Linux and compiled it using makefiles.
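For reference, with a make-based openFrameworks build the required addons would be listed one per line in the project's addons.make (assuming the addon folder names match the repositories referenced below):

    # addons.make — read by the openFrameworks makefile system
    ofxLeapMotion2
    ofxRapidLib
    ofxMaxim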
To recognise a prayer gesture I obtained hand data from the Leap Motion. I chose the distance between the two hands as the main feature for training the k-nearest neighbour classification model. Whilst recording training examples I tried to keep them as free of noise as possible, since noisy examples can adversely affect the results of a k-nearest neighbour model.
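As an illustration, here is a minimal sketch of how the hand-distance feature and the k-nearest neighbour training could be wired together, assuming the Leap Motion SDK 2.x C++ API and the classification and trainingExample classes that RapidLib provides through ofxRapidLib (header and class names vary between RapidLib versions, and the helper names here are mine rather than the ones used in the piece):

    #include "Leap.h"
    #include "ofxRapidLib.h"   // pulls in RapidLib's classification / trainingExample

    // Distance between the two palms in the current Leap frame (millimetres).
    // Returns -1 when fewer than two hands are visible.
    float handDistance(const Leap::Controller &controller) {
        Leap::Frame frame = controller.frame();
        if (frame.hands().count() < 2) return -1.0f;
        Leap::Vector a = frame.hands()[0].palmPosition();
        Leap::Vector b = frame.hands()[1].palmPosition();
        return a.distanceTo(b);
    }

    // Two-class training set: 0 = "no prayer", 1 = "prayer".
    // The single input feature is the hand distance.
    classification knn;                        // k-nearest neighbour by default
    std::vector<trainingExample> trainingSet;

    void addExample(double distance, double label) {
        trainingExample example;
        example.input  = { distance };
        example.output = { label };
        trainingSet.push_back(example);
    }

    // After recording examples of both states:
    // knn.train(trainingSet);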
When the two hands were more than a threshold distance apart I labelled this the "no prayer" state; when they were very close together I labelled it the "prayer" state. In this way I was able to detect when the prayer gesture had occurred. Tracking the distance between the two hands also allowed me to map it to the dimensions of the Calabi-Yau-shaped structures, so that the structure grows or shrinks as the hands move further apart or closer together.
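Continuing the sketch above, the classifier output and the raw distance can both be used once per frame in update(); the distance range, manifoldScale, wasPraying and triggerPrayerEvent() are illustrative placeholders rather than the values and names used in the piece:

    // controller is a Leap::Controller member; knn and handDistance() come from
    // the training sketch above.
    float d = handDistance(controller);
    if (d >= 0) {
        std::vector<double> result = knn.run({ (double)d });
        bool isPraying = (result[0] == 1.0);             // 1 = prayer, 0 = no prayer

        // Closer hands -> smaller structure (and, in the piece, more dimensions).
        manifoldScale = ofMap(d, 30.0f, 400.0f, 0.2f, 1.0f, true);

        if (isPraying && !wasPraying) triggerPrayerEvent();   // fire once per gesture
        wasPraying = isPraying;
    }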
When the prayer gesture occurs the Calabi-Yau structures are at their smallest (and have the most dimensions) and an event is triggered. The small floating pink spheres surrounding the main structure had been following a 3D Lissajous curve [3]; when the prayer gesture occurs they switch to a different 3D Lissajous curve, moving smoothly into their new orbit. A sound from nature is also heard when the prayer gesture occurs. There are eight different sounds, each corresponding to a different Lissajous curve and each representing a different trigram element as described in the I Ching [4]: earth, fire, lake, mountain, sky, thunder, water, wind.
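A rough sketch of how the curve switching and sound triggering could look with ofxMaxim follows; the frequency ratios, orbit radius and the random choice of trigram are placeholders, and the sound-stream setup normally done in setup() is omitted:

    #include "ofMain.h"
    #include "ofxMaxim.h"

    // 3D Lissajous curve for trigram k: x, y and z each follow a sine with their
    // own frequency and phase. The ratios below are placeholders.
    ofVec3f lissajousPoint(int k, float t) {
        static const float a[8] = {1, 2, 3, 2, 3, 4, 5, 3};   // x frequencies
        static const float b[8] = {2, 3, 4, 5, 2, 3, 2, 4};   // y frequencies
        static const float c[8] = {3, 4, 5, 3, 5, 2, 3, 2};   // z frequencies
        const float R = 200.0f;                                // orbit radius
        return ofVec3f(R * sin(a[k] * t),
                       R * sin(b[k] * t + PI / 2.0f),
                       R * sin(c[k] * t + PI / 4.0f));
    }

    // Called when the prayer gesture is recognised: choose a trigram, which
    // selects both the new curve and the matching nature sound.
    maxiSample natureSounds[8];   // earth, fire, lake, mountain, sky, thunder, water, wind
    int currentTrigram = 0;

    void triggerPrayerEvent() {
        currentTrigram = (int)ofRandom(8);
        natureSounds[currentTrigram].trigger();   // rewind the sample
    }

    // ofxMaxim audio callback (ofApp::audioOut in the app); stereo output assumed.
    void audioOut(float *output, int bufferSize, int nChannels) {
        for (int i = 0; i < bufferSize; i++) {
            double s = natureSounds[currentTrigram].playOnce();
            output[i * nChannels]     = s;
            output[i * nChannels + 1] = s;
        }
    }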
Whilst creating the code for this project I referred to the library examples within the ofxLeapMotion2 [5] and ofxRapidLib [6] addons, and also to the tutorials we have had on ofxMaxim [7]. I also found Daniel Shiffman's chapter on "Autonomous Agents" [8] in his book "The Nature of Code" very useful when creating the following behaviour for the floating pink spheres. The Calabi-Yau manifold shapes were created by converting a Python script [9], originally made for use with Rhino, into openFrameworks C++. All references are at the bottom of the page.
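For the following behaviour itself, a minimal Shiffman-style "seek" steering sketch might look like this (illustrative names; lissajousPoint() is the helper from the previous sketch):

    #include "ofMain.h"

    // Shiffman-style "seek": each sphere steers toward its moving target on the
    // current Lissajous curve, so a change of curve produces a smooth transition.
    struct OrbitSphere {
        ofVec3f position, velocity;
        float maxSpeed = 3.0f;
        float maxForce = 0.08f;
        float phase    = 0.0f;

        void seek(const ofVec3f &target) {
            ofVec3f desired = (target - position).getNormalized() * maxSpeed;
            ofVec3f steer   = desired - velocity;   // steering = desired - velocity
            steer.limit(maxForce);                  // cap the steering force
            velocity += steer;
            velocity.limit(maxSpeed);
            position += velocity;
        }
    };

    // In update():
    // for (auto &s : spheres) {
    //     s.phase += 0.01f;
    //     s.seek(lissajousPoint(currentTrigram, s.phase));
    // }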
Self Evaluation and Future Development
Overall the project ended up doing pretty much what I wanted it to. It was, however, quite a challenge getting everything working well together. Leap Motion support on Linux is not as good as on other platforms, so I had to use an older SDK version to ensure compatibility, which may have slightly affected how well hand motion was detected. In the end, although I used the ofxLeapMotion2 addon, I used very few of its own methods, choosing instead to get finer control through the SDK C++ methods [10]. There were also some issues with the accuracy of hand detection in windowed versus fullscreen mode, with fullscreen being less accurate, but I compensated for this by retraining the model.
In terms of the installation, I was originally going to add speakers to the setup but later decided that the computer's speakers would be sufficient. I wanted all the electronic components to be visible in the final setup, and for them and the table to act like a kind of altar or lectern at which people could pray. In future, though, I may try back-projecting the screen to make the "altar" cleaner and to reduce any glare affecting the display. This would depend very much on the light conditions in the room at the time of install.
In terms of future development, I might try to make the transition between Calabi-Yau structures of different sizes smoother by changing how the structures are scaled, but for this project I felt it did not really matter, as moving between sizes in discrete steps was a nod to progressing through the levels of a computer game. In summary, I felt the project worked well and was almost the same as what I originally envisioned, with a few minor alterations as discussed above. I also felt I learnt a lot about Leap Motion and other technologies through making it, and was happy to have been given a chance to create an interactive piece inspired by an earlier work of mine.
References
1. https://www.nytimes.com/2006/10/17/science/17yau.html
2. Hui, Yuk. “The Question Concerning Technology in China”, Urbanomic, (2016)
3. http://www.mathcurve.com/courbes2d.gb/lissajous/lissajous.shtml and http://www.mathcurve.com/courbes3d/lissajous3d/lissajous3d.shtml
4. https://en.wikipedia.org/wiki/Bagua
5. https://github.com/genekogan/ofxLeapMotion2
6. http://gitlab.doc.gold.ac.uk/rapid-mix/RAPID-MIX_API/tree/master/examples/ofx
7. https://github.com/micknoise/Maximilian
8. http://natureofcode.com/book/chapter-6-autonomous-agents/
9. http://www.tanjiasi.com/surface-design/
10. https://developer.leapmotion.com/documentation/cpp/index.html