LaserDraw 2.0
This is an interactive installation that lets you "draw" with a laser pointer and a projector. At the core of the system is a program that isolates and tracks what is happening on the projection screen, and that program can be reused for much more than drawing.
produced by: Keita Ikeda
Concept and background research
This is an extension of my installation LaserDraw, which lets the viewer "draw" with a laser pointer using a projector. I implemented a similar system with the proprietary node-based programming environment Isadora. In this project, I wanted to implement the system myself so that I could have lower-level access to its inner workings, which would enable me to utilise it for future projects.
The inspiration for this piece came from two pieces of work. One is EyeWriter by Graffiti Research Lab. This piece enabled graffiti artists with ALS to "draw" on buildings again, using eye tracking technology and a projector. The second piece is Apparition by Klaus Obermaier and Ars Electronica Futurelab. It uses body projection, a system that tracks bodies on stage and projects on those bodies in isolation.
There is an abundance of work that uses interactive projection, but these pieces stood out to me. I was drawn to the way in which they engage with the physicality of the users and performers, and with the space in which the works take place, not just the projection screen. The interaction feels more physical and involved this way than in pieces that, say, simply track the viewers' movements and project onto a screen in front of them. It is the precision of the eye tracking, and the fact that the projection lands on the dancers themselves, that makes the projection and the participants feel more integrated: the intent of the user/performer is strongly reflected in the output.
I wanted to build a dynamic system that enabled me to do similar things -- to track what's happening on the projection screen and in the space. For this project, I built a drawing app similar to EyeWriter, but with a laser pointer and some added features. I wanted the piece to feel more ephemeral, so the oldest points in the line disappear once the number of points exceeds a threshold. Each point also draws lines to every other point, creating a snake-like movement and a fan-like aesthetic.
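Roughly, that drawing logic could look like the sketch below (openFrameworks-style C++). This is an illustration, not the installation's actual code: the std::deque container, the addPoint()/drawFan() names and the kMaxPoints threshold are all assumptions.

```cpp
#include "ofMain.h"
#include <deque>

std::deque<glm::vec2> points;
const std::size_t kMaxPoints = 60;   // illustrative threshold for how many points survive

void addPoint(const glm::vec2 & p){
    points.push_back(p);
    while(points.size() > kMaxPoints){
        points.pop_front();          // oldest points vanish first, keeping the line ephemeral
    }
}

void drawFan(){
    // every point draws a line to every other point, giving the fan-like look
    for(std::size_t i = 0; i < points.size(); ++i){
        for(std::size_t j = i + 1; j < points.size(); ++j){
            ofDrawLine(points[i].x, points[i].y, points[j].x, points[j].y);
        }
    }
}
```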
Technical
At the core of this system is a class that I wrote, called Cutout. It lets the user pass the webcam input as a pixel array and set an area (a quad) within that input to be output as a rectangular image, either as a pixel array or as an ofImage object. Using the class method setQuad(), the user defines the quad by dragging its four corners, indicated by red lines and circles, to the desired locations. The method setCutout() processes this information by mapping and interpolating between the 2D corner vectors set by the user. The resulting pixel array can be drawn on screen with the draw() method, or accessed with getCutoutPixels() for further processing. The class also has utility methods such as getWidth() and getTopLeft().
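The outline below sketches how such a class might look. Only the method names mentioned above come from the project; the bilinear interpolation inside setCutout(), the member layout and everything else is an assumption about the implementation rather than the actual code.

```cpp
#include "ofMain.h"
#include <array>

class Cutout {
public:
    // corners in order: top-left, top-right, bottom-right, bottom-left (assumed ordering)
    void setQuad(const std::array<glm::vec2, 4> & corners){ quad = corners; }

    // Resample the source pixels inside the quad into a rectangular image.
    void setCutout(const ofPixels & src, int outW, int outH){
        cutout.allocate(outW, outH, OF_PIXELS_RGB);
        for(int y = 0; y < outH; ++y){
            for(int x = 0; x < outW; ++x){
                float u = x / float(outW - 1);
                float v = y / float(outH - 1);
                // bilinear interpolation between the four user-set corners
                glm::vec2 top    = glm::mix(quad[0], quad[1], u);
                glm::vec2 bottom = glm::mix(quad[3], quad[2], u);
                glm::vec2 p      = glm::mix(top, bottom, v);
                cutout.setColor(x, y, src.getColor(int(p.x), int(p.y)));
            }
        }
    }

    void draw(float x, float y){
        preview.setFromPixels(cutout);
        preview.draw(x, y);
    }

    const ofPixels & getCutoutPixels() const { return cutout; }
    int getWidth() const          { return (int)cutout.getWidth(); }
    glm::vec2 getTopLeft() const  { return quad[0]; }

private:
    std::array<glm::vec2, 4> quad;
    ofPixels cutout;
    ofImage preview;
};
```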
Provided the aspect ratios of the input and output pixel arrays are the same, this class can accurately map what's happening on the projection screen (anything not produced by the program itself) onto the output from the projector. It's the inverse of ofxPiMapper, if you like. In this project, I took the cutout's brightest pixel and added a point to the line at its location, giving the illusion of drawing with a laser pointer.
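A minimal sketch of that brightest-pixel step, assuming the pixels come from getCutoutPixels() as above; the findLaser() name and the brightness threshold are hypothetical.

```cpp
#include "ofMain.h"

// Find the brightest pixel in the cutout; treat it as the laser dot
// only if it exceeds a minimum brightness (threshold is illustrative).
glm::vec2 findLaser(const ofPixels & px, float minBrightness = 200.f){
    glm::vec2 best(-1, -1);
    float bestBrightness = minBrightness;
    for(std::size_t y = 0; y < px.getHeight(); ++y){
        for(std::size_t x = 0; x < px.getWidth(); ++x){
            float b = px.getColor(x, y).getBrightness();
            if(b > bestBrightness){
                bestBrightness = b;
                best = glm::vec2(x, y);
            }
        }
    }
    return best;   // (-1, -1) means no sufficiently bright pixel was found
}
```

The returned location would then be scaled from cutout coordinates to screen coordinates before being handed to something like addPoint() from the earlier sketch.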
This class allows for rudimentary body tracking as well. The logic is quite simple: if there's a shadow on the projection screen, there must be an object in front of the projector. For the demo below, I created a mask whose pixels are transparent wherever there is a shadow, judged by pixel brightness. The content can then be drawn behind this mask, so the program only projects onto objects in front of the screen.
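A sketch of that masking idea, assuming the mask is simply a per-pixel alpha derived from brightness; makeShadowMask() and the threshold value are hypothetical.

```cpp
#include "ofMain.h"

// Build an RGBA mask from the cutout: shadowed pixels become transparent
// (the content shows through there), lit pixels become opaque black.
ofImage makeShadowMask(const ofPixels & cutout, float shadowThreshold = 80.f){
    ofPixels maskPx;
    maskPx.allocate(cutout.getWidth(), cutout.getHeight(), OF_PIXELS_RGBA);
    for(std::size_t y = 0; y < cutout.getHeight(); ++y){
        for(std::size_t x = 0; x < cutout.getWidth(); ++x){
            bool inShadow = cutout.getColor(x, y).getBrightness() < shadowThreshold;
            maskPx.setColor(x, y, ofColor(0, 0, 0, inShadow ? 0 : 255));
        }
    }
    ofImage mask;
    mask.setFromPixels(maskPx);
    return mask;
}

// In draw(): render the content first, then the mask stretched over the
// projector output, so only the shadowed regions remain visible.
// content.draw(0, 0, ofGetWidth(), ofGetHeight());
// mask.draw(0, 0, ofGetWidth(), ofGetHeight());
```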
Future development
I specifically undertook this project with the intention of building a system for future development. I would like to explore different aesthetics and systems in which this program can be used. A collaboration with a dancer or performer would clearly benefit the body-tracking element; I simply didn't have time to develop this aspect further.
Self evaluation
I am happy with the system that I built; it does everything I initially intended. Having said that, my focus went primarily to the technical implementation, at the cost of creative development.
References
Graffiti Research Lab (2009) EyeWriter. http://eyewriter.org/
Klaus Obermaier and Ars Electronica Futurelab (2012) Apparition. http://www.exile.at/apparition/