Kinectic Jam Session 2



(continued from the first Kinectic Jam Session)

On 7/3/2011, we had the second “Kinect Jam Session” at Media Lab Helsinki.

I started by presenting an example sound application by Ben McChesney, built with Xcode/openFrameworks.

1. Forrest Oliphant presentation

Forrest presented work done for the Computational Photography course.

One project consisted of a “carousel” slitscan of a 360° model. Any motion detected by the Kinect warps the 3D model. The project was built in Processing, and Forrest mentioned the programming challenges presented by this type of 3D representation.
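Forrest’s Processing code isn’t shown here, but the motion-detection step that drives the warping can be sketched. A minimal NumPy illustration (my own sketch, not the project’s code): difference consecutive depth frames and use the fraction of changed pixels as a warp amount.

```python
import numpy as np

def motion_amount(depth_prev, depth_curr, threshold_mm=30):
    """Fraction of pixels whose depth changed by more than threshold_mm.

    depth_prev, depth_curr: (H, W) arrays of Kinect depth values in mm.
    The returned value (0.0 to 1.0) could drive how strongly the 3D
    model is warped on each frame.
    """
    diff = np.abs(depth_curr.astype(np.int32) - depth_prev.astype(np.int32))
    return float((diff > threshold_mm).mean())

# Synthetic demo: someone moves in a quarter of the frame.
prev = np.full((240, 320), 2000, dtype=np.uint16)
curr = prev.copy()
curr[:120, :160] = 1500           # a quarter of the pixels changed
print(motion_amount(prev, curr))  # 0.25
```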

Another project consisted of photos and videos of dancers and movement, again using slitscan techniques. The photos have a resolution of 7055×1920 pixels, each compressing (or translating) six minutes of video into a single image. Cinder was used for this project, together with a Photoshop script. The Kinect was not used intensively in this project.
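As a back-of-the-envelope check, 7055 columns over six minutes works out to roughly 19.6 columns per second, i.e. about one column per frame at ~20 fps. The core slit-scan step, one pixel column per frame, can be sketched like this (illustration only; the actual project used Cinder):

```python
import numpy as np

def slitscan(frames, slit_x=None):
    """Build a slit-scan image: one pixel column per input frame.

    frames: sequence of (H, W, 3) uint8 video frames.
    slit_x: column to sample from each frame; defaults to the centre.
    Returns an (H, n_frames, 3) image whose horizontal axis is time.
    """
    columns = []
    for frame in frames:
        x = frame.shape[1] // 2 if slit_x is None else slit_x
        columns.append(frame[:, x, :])
    return np.stack(columns, axis=1)

# Synthetic demo: 5 frames, each a flat shade of grey.
frames = [np.full((4, 6, 3), i * 10, dtype=np.uint8) for i in range(5)]
img = slitscan(frames)
print(img.shape)  # (4, 5, 3): height preserved, one column per frame
```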

Cinder links:

Processing links:

Forrest’s page:

2. Dipti Sonawane presentation

Dipti presented her Interactive Fireworks project, already shown at Media Lab Demo Day in December. It was done with Processing. Dipti mentioned that turning the sound on makes the project run slower.

3. Palash Mukhopadhyay presentation

Palash showed some explorations using openFrameworks and blob detection at different depths. He was surprised by the speed of the prototype application (80fps), despite the heavy calculations involved.
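Palash worked in openFrameworks, but the idea of blob detection at different depths can be sketched in a few lines of Python with SciPy (my own illustration, not his code): threshold the depth image into a band, label connected components, and keep the blobs above a minimum size.

```python
import numpy as np
from scipy import ndimage

def blobs_in_band(depth_mm, near, far, min_pixels=50):
    """Count connected blobs whose depth falls in [near, far) mm.

    depth_mm: (H, W) depth image; 0 means no reading.
    Slicing the depth image into bands and labelling connected
    components is one common way to separate objects by distance.
    """
    mask = (depth_mm >= near) & (depth_mm < far)
    labels, n = ndimage.label(mask)
    if n == 0:
        return 0
    sizes = np.asarray(ndimage.sum(mask, labels, range(1, n + 1)))
    return int(np.sum(sizes >= min_pixels))

# Synthetic scene: one object near, one far.
depth = np.zeros((120, 160), dtype=np.uint16)
depth[10:40, 10:40] = 900      # near object
depth[60:100, 80:140] = 2200   # far object
print(blobs_in_band(depth, 500, 1500))   # 1
print(blobs_in_band(depth, 1500, 3000))  # 1
```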

Link to Palash’s Vimeo:

4. Glass and IR test

While discussing how TuioKinect-based projects required participants to stay within a certain distance of the Kinect, Michihito Mizutani posed a question: can glass be used in an IR-related project? How does glass affect depth sensing? Palash and Dipti ran a test right away in our “paja” (workshop) room and concluded that, despite some artifacts, IR detection works fine through glass.
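One rough way to quantify artifacts like the ones seen in this test is to compare how many depth readings drop out with and without glass in front of the sensor. A minimal sketch (my own metric, not what was measured at the session), assuming the Kinect reports missing readings as 0:

```python
import numpy as np

def dropout_ratio(depth_mm):
    """Fraction of pixels with no depth reading (reported as 0).

    Comparing this ratio for frames captured with and without glass
    in front of the sensor gives a quick measure of IR dropout.
    """
    depth_mm = np.asarray(depth_mm)
    return float((depth_mm == 0).mean())

# Synthetic comparison: the 'through glass' frame loses some readings.
clear = np.full((10, 10), 1200, dtype=np.uint16)
glass = clear.copy()
glass[:2, :] = 0             # 20% of readings lost to reflections
print(dropout_ratio(clear))  # 0.0
print(dropout_ratio(glass))  # 0.2
```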

5. Matti Niinimäki – second presentation

In his second presentation within these “Kinect sessions”, Matti showed the “behind the scenes” aspects of the projects from his previous one. All of them were built with Quartz Composer.

The first project Matti showed was an upcoming medical-related project. An interesting range of gestures can be used to zoom and flip between images.
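The post doesn’t detail the gestures, but zooming by the distance between two tracked hands is a common pattern in this kind of interface. A hypothetical sketch (the hand positions are assumed to come from skeleton tracking; this is not Matti’s implementation):

```python
import math

def zoom_factor(hands_prev, hands_curr):
    """Zoom factor from the change in distance between two hands.

    Each hand is an (x, y, z) position, e.g. from skeleton tracking.
    Spreading the hands apart returns > 1 (zoom in); bringing them
    together returns < 1 (zoom out).
    """
    d_prev = math.dist(hands_prev[0], hands_prev[1])
    d_curr = math.dist(hands_curr[0], hands_curr[1])
    return d_curr / d_prev if d_prev > 0 else 1.0

# Hands move from 1 m apart to 2 m apart: zoom in 2x.
prev = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]
curr = [(0.0, 0.0, 2.0), (2.0, 0.0, 2.0)]
print(zoom_factor(prev, curr))  # 2.0
```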

Then Matti showed the system behind his Animoitu Liike project (animated puppets for children, shown previously), which follows this path:
OSCeleton -> Max/MSP (reformatting OSC messages) -> Quartz Composer (receives OSC) -> Animata (animation)
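Matti’s Max/MSP patch isn’t shown, but the “reformatting OSC messages” step in the middle of that chain can be sketched. Assuming OSCeleton’s usual message shape (a single /joint address carrying joint name, user id, and x, y, z), here is a Python illustration that splits it into per-axis addresses; the output address scheme is my own invention, not OSCeleton’s:

```python
def reformat_joint(address, args):
    """Split an OSCeleton-style /joint message into per-axis messages.

    OSCeleton typically sends ('/joint', ['l_hand', 1, x, y, z]).
    Receivers such as Quartz Composer are often easier to wire up when
    each value arrives on its own address, e.g. '/user1/l_hand/x'.
    """
    if address != '/joint' or len(args) != 5:
        return []
    joint, user, x, y, z = args
    base = f'/user{user}/{joint}'
    return [(f'{base}/x', x), (f'{base}/y', y), (f'{base}/z', z)]

msgs = reformat_joint('/joint', ['l_hand', 1, 0.42, 0.61, 0.55])
print(msgs[0])  # ('/user1/l_hand/x', 0.42)
```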

Regarding setting up OpenNI, Matti recommended the following options:
Follow the “read-me” at GitHub/OpenNI:
Or follow the instructions here for the Tryplex toolkit:

Video about the Tryplex toolkit:

Finally, he showed one more project under development, the forest-related Laserkeilaus (“laser scanning” in Finnish), which allows people to be represented as trees.

6. Inspiration: Moullinex – Catalina music video

To wrap up the session, I showed the music video for “Catalina” by Moullinex (a Portuguese band, with Portuguese video directors), produced using Kinect for 3D capture. Thankfully, there were 3D glasses around, so we could watch the video in 3D.

(continue reading about the third Kinectic Jam Session)

This entry was posted in Courses / Projects, Kinect, Kinect Jam Sessions 2011.

One Response to Kinectic Jam Session 2

  1. Nuno Correia says:

    One additional tip from Palash:

    This is a fairly nice and detailed set of instructions for installing OSCeleton. If you face problems with libusb, uninstall libusb and then reinstall it with the +universal variant.
