Puppetmation IR!
Puppetmation is an accessible animation platform for anyone! Just put on the sock puppet, and computer vision--using infrared LEDs and a modified camera--tracks the puppet's movement and the opening and closing of its mouth. Everyone should be able to make fun and cute custom animations--as long as you can use a sock puppet, you can, too!
produced by: Jamie Sichel
Introduction
As children, my sister and I loved putting on puppet shows for our family (unsuspecting dog included). Unsurprisingly, I was also obsessed with cartoons, building blocks, and making movies. As a family we've always been big on celebrating things--my mother never misses an opportunity to send us a personalized e-card, and without fail those intended for me have some sort of talking animal or monster on them.
With these thoughts in mind, I wanted to design a puppet that lets anyone custom-animate an on-screen puppet or avatar: a physical, fun object on its own, made even more fun by allowing the user to digitally create and send anything from full-fledged dramatic productions to greeting cards.
Concept and background research
From childhood to modern day, I found inspiration from puppets, digital greeting cards, and simple animations.
As a child I used to put on productions using a set of puppets from Mr. Rogers' Neighborhood. With these simple hand puppets in mind I decided to make a sock puppet--something most people see and automatically know how to use.
Jim Henson's Muppets have always played a big role in my life, and for this particular project I drew inspiration from Kermit the Frog and his malleable, smushy face. Originally I had wanted capacitive sensors in the puppet's mouth to give finer control over the mouth's shape, beyond simply opening and closing, but my puppet turned out to be much less flexible than Kermit, and the extra sensors didn't serve much purpose. Future iterations will attempt to fix this issue.
Cute, silly animations like Neil Cicierega's Potter Puppet Pals, along with voice-animated e-cards on sites like Blue Mountain, helped me choose the visual style for my on-screen puppets.
Technical
I wanted this project to be as simple as possible, or at least seem that way. The puppet is just that--a sock puppet, made from felt, ping-pong balls, and a fuzzy sock. I followed a wonderful tutorial by Ana DIY Crafts to create Orangthany (the purple puppet) and Notyet (the blue prototype puppet).
Since one of the stipulations of this project was to have only one input source, I chose my camera--I fitted the puppet with two infrared LEDs and modified a PlayStation Eye camera to see only infrared light. This way, the camera can easily track both the motion of the puppet and the opening and closing of its mouth.
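For the curious, here is a minimal sketch of roughly how that IR tracking loop can work, using the ofxOpenCv and ofxPS3EyeGrabber addons listed in the references. This is not the project's actual code: the threshold value, blob sizes, and names like puppetPos and mouthGap are illustrative guesses.

// Rough sketch: find the two bright IR LED blobs; their midpoint gives
// the puppet's position and their separation gives mouth openness.
#include "ofMain.h"
#include "ofxOpenCv.h"
#include "ofxPS3EyeGrabber.h"

class ofApp : public ofBaseApp {
public:
    ofVideoGrabber grabber;
    ofxCvColorImage colorImg;
    ofxCvGrayscaleImage grayImg;
    ofxCvContourFinder contourFinder;

    void setup() override {
        grabber.setGrabber(std::make_shared<ofxPS3EyeGrabber>());
        grabber.setup(640, 480);
        colorImg.allocate(640, 480);
        grayImg.allocate(640, 480);
    }

    void update() override {
        grabber.update();
        if (!grabber.isFrameNew()) return;

        colorImg.setFromPixels(grabber.getPixels());
        grayImg = colorImg;     // convert to grayscale
        grayImg.threshold(230); // after the IR filter swap, only the LEDs stay this bright

        // Expect one blob per LED: look for up to two bright spots.
        contourFinder.findContours(grayImg, 5, 500, 2, false);
        if (contourFinder.nBlobs == 2) {
            ofPoint a = contourFinder.blobs[0].centroid;
            ofPoint b = contourFinder.blobs[1].centroid;
            ofPoint puppetPos = (a + b) / 2.0f; // midpoint drives the avatar's position
            float mouthGap = a.distance(b);     // LED separation drives mouth openness
            // ...map puppetPos and mouthGap onto the on-screen puppet here.
        }
    }
};

int main() {
    ofSetupOpenGL(640, 480, OF_WINDOW);
    ofRunApp(new ofApp());
}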
Originally I attempted to use color tracking for the puppet's position and mouth movements, but color tracking is too dependent on external light and other factors, including the quality of the camera. Using the OpenCV library in openFrameworks, I thresholded the image around a desired HSV color value, collected the resulting blobs, took the largest three (usually the two eyebrows and the bowtie), averaged their centers into a single point, and translated the puppet based on that point. I also mapped the image of the avatar's body as a texture so that I could bend the image and make its movement look a little more realistic. I attempted to open the mouth when a fourth blob (the mouth itself) appeared, but again, this was too inconsistent, so I switched to infrared.
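One way to implement that texture-mapped bending is to draw the avatar image on a small grid mesh and lean the upper vertices toward the tracked position. The sketch below shows the idea, assuming an avatar image called body.png; the grid resolution, bend strength, and class name are illustrative, not the project's actual code.

// Sketch: warp an avatar image on a grid mesh so its top leans
// toward a target x position, giving the body a little life.
#include "ofMain.h"

class BendyBody {
public:
    ofImage body;
    ofMesh mesh;
    std::vector<ofVec3f> rest; // undeformed vertex positions
    int cols = 10, rows = 10;

    void setup() {
        body.load("body.png"); // hypothetical avatar image
        mesh.setMode(OF_PRIMITIVE_TRIANGLES);
        float w = body.getWidth(), h = body.getHeight();
        // Grid of vertices whose texture coordinates span the whole image.
        for (int y = 0; y <= rows; y++) {
            for (int x = 0; x <= cols; x++) {
                ofVec3f v(x * w / cols, y * h / rows, 0);
                mesh.addVertex(v);
                mesh.addTexCoord(ofVec2f(v.x, v.y));
                rest.push_back(v);
            }
        }
        // Two triangles per grid cell.
        for (int y = 0; y < rows; y++) {
            for (int x = 0; x < cols; x++) {
                int i = y * (cols + 1) + x;
                mesh.addIndex(i); mesh.addIndex(i + 1); mesh.addIndex(i + cols + 1);
                mesh.addIndex(i + 1); mesh.addIndex(i + cols + 2); mesh.addIndex(i + cols + 1);
            }
        }
    }

    // Shift each vertex toward targetX, more strongly near the top,
    // so the body appears to lean as the puppet moves.
    void bendToward(float targetX) {
        float w = body.getWidth(), h = body.getHeight();
        for (std::size_t i = 0; i < rest.size(); i++) {
            ofVec3f v = rest[i];
            float howHigh = 1.0f - v.y / h; // 1 at the top, 0 at the base
            v.x += (targetX - w / 2) * howHigh * 0.3f;
            mesh.setVertex(i, v);
        }
    }

    void draw() {
        body.getTexture().bind();
        mesh.draw();
        body.getTexture().unbind();
    }
};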
Future development
For future development, I plan to add the emotion controls I originally intended to include (before I had to scope back the project), more avatar options, and some buttons for blinking and other extras, if the user desires. The puppet will be usable without these additions, and I'd like to offer settings the user can toggle if they, say, want the puppet to blink intermittently or don't want movement tracked.
I'd also like to explore different materials to make the mouth a little more malleable, so that the capacitive sensors already installed (but unused in this project) can control the mouth shape more accurately.
Self evaluation
As a technical demo, I believe this project was successful. The IR LEDs worked incredibly well and, after a lot of tinkering, I was able to successfully modify the PlayStation Eye.
I think my project is whimsical, fun, and accessible. I achieved what I set out to do, but I would definitely like to take this project further and create a program with saving options, more puppet avatars, and a streamlined UI. When testing this project, the look on the tester's face was one of joy, and because of that I believe I've achieved my goal.
References
Puppet Tutorial: http://anadiycrafts.com/dog-sock-puppet/
PlayStation Eye Modification Tutorials: https://www.youtube.com/watch?v=0K6xgTmFI2E | http://wiki.lofarolabs.com/index.php/Removing_the_IR_Filter_from_the_PS3_Eye_Camera
Daniel Shiffman Coding Train Lesson 11: https://www.youtube.com/playlist?list=PLRqwX-V7Uu6aG2RJHErXKSWFDXU4qo_ro
Examples from ofxOpenCv, ofxCv, and ofxPS3EyeGrabber
Puppet Photos: https://www.ebay.com/itm/6-Vintage-Hand-Puppets-Mr-Rogers-Neighborhood-King-Red-Riding-Hood-Wolf-Children-/283437506399
Kermit: https://www.youtube.com/watch?v=mIeRRuO09i4
Potter Puppet Pals: https://lemondemon.fandom.com/wiki/Potter_Puppet_Pals