(continued from http://mlab.taik.fi/mediacode/archives/898)
On 7 March 2011, we had the second “Kinect Jam Session” at Media Lab Helsinki.
I started by presenting an example sound application by Ben McChesney, using Xcode/openFrameworks:
http://www.benmcchesney.com/blog/2010/12/open-frameworks-kinect-sound/
1. Forrest Oliphant presentation
Forrest presented work done for the Computational Photography course.
One project consisted of a “carousel”: a 360° slit-scan of a 3D model. Any motion detected by the Kinect warps the model. It was built in Processing, and Forrest mentioned the programming challenges posed by this type of 3D representation.
Another project consisted of photos and videos of dancers in movement, again using slit-scan techniques. The photos are 7055×1920 pixels and condense (or translate) six minutes of video each. Cinder was used for this project, together with a Photoshop script; the Kinect was not used intensively here.
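For readers unfamiliar with the technique, here is a minimal slit-scan sketch in Processing — not Forrest’s code, just the basic idea, with a webcam standing in for the Kinect. It uses the Processing video library (2.x+ API); each frame, one column of camera pixels is copied into an accumulating image, so time spreads across the horizontal axis:

```
// Minimal slit-scan sketch: time becomes the horizontal axis.
import processing.video.*;

Capture cam;
int x = 0;  // current output column

void setup() {
  size(1280, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
  background(0);
}

void draw() {
  if (cam.available()) {
    cam.read();
    // Copy the centre column of the camera frame to column x of the canvas.
    copy(cam, cam.width / 2, 0, 1, cam.height, x, 0, 1, height);
    x = (x + 1) % width;  // wrap around when the canvas is full
  }
}
```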
Cinder links:
http://libcinder.org
https://github.com/cinder/Cinder
https://github.com/cinder/Cinder-Kinect
Processing links:
http://processing.org/
http://www.shiffman.net/2010/11/14/kinect-and-processing/
Forrest’s page:
http://sembiki.com/
2. Dipti Sonawane presentation
Dipti presented her Interactive Fireworks project, which she had already shown at the Media Lab Demo Day in December. It was built with Processing. Dipti mentioned that turning the sound on makes the project run noticeably slower.
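As a point of reference for how fireworks sketches like this are usually structured (this is not Dipti’s code, just a toy version of the common approach), here is a minimal Processing particle system: each burst is a list of particles with a velocity, gravity, and a fading lifespan.

```
// Toy fireworks: click to launch a burst of fading particles.
ArrayList<Particle> particles = new ArrayList<Particle>();

void setup() {
  size(640, 480);
}

void draw() {
  background(0);
  for (int i = particles.size() - 1; i >= 0; i--) {
    Particle p = particles.get(i);
    p.update();
    p.display();
    if (p.isDead()) particles.remove(i);
  }
}

void mousePressed() {
  // One burst: particles flying out from the click point in all directions.
  for (int i = 0; i < 100; i++) {
    particles.add(new Particle(mouseX, mouseY));
  }
}

class Particle {
  PVector pos, vel;
  float life = 255;

  Particle(float x, float y) {
    pos = new PVector(x, y);
    vel = PVector.random2D();
    vel.mult(random(1, 4));
  }

  void update() {
    vel.y += 0.05;  // gravity
    pos.add(vel);
    life -= 3;      // fade out
  }

  void display() {
    stroke(255, 200, 50, life);
    point(pos.x, pos.y);
  }

  boolean isDead() {
    return life <= 0;
  }
}
```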
3. Palash Mukhopadhyay presentation
Palash showed some explorations with openFrameworks, doing blob detection at different depths. He was surprised by the speed of the prototype application (around 80 fps), despite the heavy calculations involved. A sketch of the depth-thresholding step behind this kind of work follows after the link below.
Link to Palash’s Vimeo:
http://vimeo.com/mpalash
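Palash’s actual project was written in openFrameworks, but the first step — isolating a “slice” of the scene by depth before running a blob detector on it — can be sketched in Processing with Daniel Shiffman’s openkinect library (linked in the Processing links above). Method names vary between library versions, so treat the exact calls here as assumptions:

```
// Depth thresholding: keep only pixels within a depth band.
// A blob detector / connected-components pass would then run on this mask.
import org.openkinect.processing.*;

Kinect kinect;
int minDepth = 500;  // near threshold, raw depth units (assumption)
int maxDepth = 830;  // far threshold

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();  // older library versions use start()/enableDepth(true)
}

void draw() {
  background(0);
  int[] depth = kinect.getRawDepth();
  loadPixels();
  for (int i = 0; i < depth.length && i < pixels.length; i++) {
    // Everything inside the depth band turns white, the rest stays black.
    pixels[i] = (depth[i] > minDepth && depth[i] < maxDepth)
              ? color(255) : color(0);
  }
  updatePixels();
}
```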
4. Glass and IR test
While we were discussing how TuioKinect-based projects require participants to stay within a certain distance of the Kinect, Michihito Mizutani raised a question: can glass be used in an IR-related project? How does glass affect depth sensing? Palash and Dipti ran a test right away in our “paja” (workshop) room and concluded that, despite some artifacts, IR depth detection works fine through glass.
5. Matti Niinimäki – second presentation
In his second presentation in this “Kinect Jam Session” series, Matti (http://mansteri.com/) showed the “behind the scenes” of the projects he had presented in the first session. All of them were built with Quartz Composer.
The first project Matti showed was an upcoming medical-related one, in which a range of gestures can be used to zoom and flip between images.
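To give a feel for that kind of gesture mapping (this is only a toy illustration, not Matti’s implementation), here is a Processing sketch that maps the distance between two “hand” positions to a zoom factor, like pinch-to-zoom. The mouse stands in for one hand and the window centre for the other; in a Kinect setup both points would come from skeleton joints, e.g. via OSCeleton as in the pipeline described below:

```
// Toy gesture mapping: hand distance -> zoom factor.
void setup() {
  size(640, 480);
  rectMode(CENTER);
}

void draw() {
  background(30);
  PVector handA = new PVector(width / 2, height / 2);  // stand-in for one hand
  PVector handB = new PVector(mouseX, mouseY);         // stand-in for the other
  // Map the distance between the two hands to a scale factor.
  float zoom = map(PVector.dist(handA, handB), 0, width / 2, 0.5, 3.0);
  translate(width / 2, height / 2);
  scale(zoom);
  rect(0, 0, 100, 75);  // stand-in for the image being zoomed
}
```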
Then Matti showed the system behind his Animoitu Liike project (animated puppets for children, shown previously), which follows this path:
OSCeleton -> Max/MSP (reformatting OSC messages) -> Quartz Composer (receives OSC) -> Animata (animation)
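Matti did the “reformatting OSC messages” step in Max/MSP, but the same idea can be sketched in Processing with the oscP5 library. OSCeleton broadcasts skeleton joints as /joint messages (joint name, user id, then x, y, z); the exact argument layout and ports depend on OSCeleton’s flags and your setup, so treat them as assumptions here. The sketch relays each joint under a simpler per-joint address that a Quartz Composer patch (or anything else) can listen to:

```
// Relay OSCeleton /joint messages as simpler per-joint OSC addresses.
import oscP5.*;
import netP5.*;

OscP5 oscIn;
NetAddress qcOut;

void setup() {
  oscIn = new OscP5(this, 7110);              // OSCeleton's default output port
  qcOut = new NetAddress("127.0.0.1", 9000);  // where QC listens (assumption)
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/joint")) {
    String joint = msg.get(0).stringValue();  // e.g. "l_hand"
    float x = msg.get(2).floatValue();
    float y = msg.get(3).floatValue();
    float z = msg.get(4).floatValue();
    // Re-send as e.g. /l_hand with just the coordinates.
    OscMessage out = new OscMessage("/" + joint);
    out.add(x);
    out.add(y);
    out.add(z);
    oscIn.send(out, qcOut);
  }
}

void draw() {
  // Nothing to render; this sketch only relays OSC traffic.
}
```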
For setting up OpenNI, Matti recommended the following options:
Follow the README in the OpenNI repository on GitHub:
https://github.com/OpenNI/OpenNI/blob/unstable/README
Or follow the instructions here for the Tryplex toolkit:
http://code.google.com/p/tryplex/wiki/Installation
Video about the Tryplex toolkit (embedded in the original post).
Finally, he showed one more project under development, the forest-related Laserkeilaus (Finnish for “laser scanning”), which allows people to be represented as trees.
6. Inspiration: Moullinex – Catalina music video
To wrap up the session, I showed the music video for “Catalina” by Moullinex (both the band and the video directors are Portuguese), produced using a Kinect for 3D capture. Luckily, there were 3D glasses around, so we could watch the video in 3D.
(continue reading about the third Kinect Jam Session)
One additional tip from Palash: