I Lost My Body
Players mimic the severed hand from the original film, "walking" their fingers on the tablet to chase a fly, guided by spatial sound.
produced by: Ho Yin Wong
A severed hand chasing a fly is a famous scene from the film "I Lost My Body". A good film leaves an aftertaste; a good game adapted from a film can make that taste last longer. I hope players can extend their feelings and connect to the character's thoughts through the reproduced scene and gameplay that offers the experience of being a severed hand.
Concept and background research
The artwork I selected is the animated film "I Lost My Body". It tells the story of a severed hand's adventure to get back to its owner's body. I found the idea interesting, and the way it presents the hand's character is full of creativity. Moreover, the art style, music, and composition are equally great. Those are the reasons I chose it as my reference to reimagine.
I used the addons ofxOsc and ofxMaxim. ofxOsc receives TUIO data from a pre-built application on the iPad, which can be used directly. The TUIO data control the view of the 3D world. The most challenging part was probably the sound, as I do not have much experience with audio; in particular, this piece uses spatial audio to locate the fly. Recognizing the gesture was equally difficult.
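As a rough sketch of the spatial audio idea (not the project's actual ofxMaxim code): equal-power stereo panning plus a simple distance rolloff is one common way to place the fly's buzz relative to the hand. The function name, units, and the `maxDist` parameter below are my own assumptions for illustration.

```cpp
#include <algorithm>
#include <cmath>

// Left/right gains to apply to the fly's buzz before mixing to output.
struct StereoGain { float left; float right; };

// Hypothetical helper: pan and attenuate the buzz from the fly's position
// relative to the hand. Positions are in world units; maxDist scales falloff.
StereoGain spatializeFly(float flyX, float flyZ, float handX, float handZ,
                         float maxDist = 500.0f) {
    const float kPi = 3.14159265f;
    float dx = flyX - handX;
    float dz = flyZ - handZ;
    float dist = std::sqrt(dx * dx + dz * dz);

    // Map horizontal offset to a pan in [-1, 1], then use an
    // equal-power curve so perceived loudness stays constant across the pan.
    float pan = std::clamp(dx / maxDist, -1.0f, 1.0f);
    float angle = (pan + 1.0f) * 0.25f * kPi;        // 0 .. pi/2
    float gain = 1.0f / (1.0f + dist / maxDist);     // simple distance rolloff

    return { gain * std::cos(angle), gain * std::sin(angle) };
}
```

In an openFrameworks audio callback, each buzz sample would be multiplied by these two gains before being written to the left and right output channels.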
At the moment there are only limited interactions. In the future, I would like to make the piece more complete by adding features that interesting gameplay should have, such as game levels and a fly that moves organically, even suddenly accelerating. Although these enhancements may seem subtle, they still affect the user's experience a lot.
With last term's experience in mind, I set aside more time for the module and the final project, but still ran out of time. I was working a freelance job to make a living and doing some volunteer work for my diaspora community at the same time. Therefore I compromised on the project's functionality in order to finish it in time, and the outcome differs slightly from the initial idea in every detail. I did not have adequate time to study the technical difficulties I encountered. For example, I wanted to recognize the gesture with machine learning, so that users could "walk" their fingers like the character and the application would still read the input correctly. However, I did not manage to export the trained model, and with few resources online I left that feature out, using the TUIO data directly as inputs to control the 3D view instead.