(continued from http://mlab.taik.fi/mediacode/archives/898)
On 7/3/2011, we had the second “Kinect Jam Session” at Media Lab Helsinki.
On 28/2/2011, we had the first “Kinect Jam Session” at Media Lab Helsinki.
These sessions aim to be very informal and hands-on, based on showing and doing things together.
Objectives:
– Exchange experiences regarding development with Kinect or related technologies
– Share tips on how to get started with Kinect-related development
These jams are aimed both at people who have been working with Kinect and at people interested in starting out.
One of the main points of interest is to map out the implementation of Kinect in different development environments (openFrameworks, Processing, Flash, Unity, PureData, Max/MSP, Cinder, QuartzComposer, etc), and reach recommendations regarding different approaches (TuioKinect or OpenNI, for example).
In the first session, on 28/2/2011, there were two presentations, by Matti Niinimäki and Ferhat Sen.
I am currently teaching Flash at MediaBizLab, Aalto Media Factory.
The objective of this component of the course is to enable participants to create prototypes and interactive mock-ups.
Program and resources
28/1 – Session 1: Basic concepts and managing information flow
1. Text and text fields [tutorial]
2. Variables [tutorial]
3. Functions [tutorial]
4. Arrays and loops [tutorial]
5. Drawing by code [tutorial]
6. Buttons [tutorial]
7. Conditionals and properties [tutorial]
8. Exercise – basic drawing tools
11/2 – Session 2: Graphic design and user interface
1. Continuous events [tutorial]
2. UI components [tutorial by Adobe]
3. Loading images [tutorial]
4. Loading text [tutorial]
5. XML [tutorial]
6. Publishing a project [tutorial]
7. Using the timeline [tutorial by Adobe]
8. Exercise – creating a small mockup project
Additional links on prototyping with Flash:
More links, resources and tips for further explorations can be found here.
Via BBC News:
Microsoft’s Kinect controller has been hacked only a few days after it officially went on sale. Code to control the motion-capture device has been produced that allows it to be used with a PC rather than the Xbox game console. Those behind the hack are keen to use the device in schools, art projects and to aid human-robot interaction.
This opens many possibilities for using Kinect’s advanced motion capture for exploratory / artistic projects.
The hack follows a competition launched by Adafruit.
About Kinect, from Wikipedia:
Kinect for Xbox 360, or simply Kinect (originally known by the code name Project Natal), is a “controller-free gaming and entertainment experience” by Microsoft for the Xbox 360 video game platform. (…)
Kinect is based on software technology developed internally by Microsoft and range camera technology by Israeli developer PrimeSense, which interprets 3D scene information from a continuously-projected infrared pattern.
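For exploratory projects built on the open drivers, it helps to know that the depth camera does not return metric distances directly: drivers such as libfreenect expose an 11-bit raw value per pixel, which community-derived approximations convert to meters. A minimal sketch in Python, using one widely circulated tangent-model approximation from the OpenKinect community (the constants are empirical fits, not official Microsoft figures):

```python
import math

def raw_depth_to_meters(raw):
    """Convert an 11-bit Kinect raw depth reading to an approximate
    distance in meters, using an empirical tangent-model fit from the
    OpenKinect community. Raw values of 2047 mean 'no reading'."""
    if raw >= 2047:
        return float("nan")  # sensor could not resolve this pixel
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)

# With this model, a raw value of ~743 corresponds to roughly 1 meter.
print(round(raw_depth_to_meters(743), 2))
```

The approximation is only useful within the sensor's working range (very small or near-saturated raw values diverge quickly), so real projects typically clamp or discard readings outside it.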
ofxQuartzComposition is an addon for openFrameworks to load, control and render Quartz Compositions (.qtz files) inside openFrameworks.
In the video above:
Two Quartz Compositions (a rotating cube and a grid of morphing shapes) being loaded and mixed with openFrameworks graphics in an openFrameworks app. The slider at the bottom adjusts the width of the rectangle drawn by openFrameworks (ofRect); the six sliders on the floating panel send their values directly to the rotating cube composition while it is running in openFrameworks.
from http://www.vimeo.com/16346790, by Memo Akten
The demo also uses (and requires) ofxCocoa. Links:
http://github.com/memo/msalibs/tree/master/ofxQuartzComposition/
http://github.com/memo/msalibs/tree/master/ofxCocoa/
Memo Akten also notes:
How is this different to Vade's ofxQCPlugin (http://code.google.com/p/ofxqcplugin/)?
ofxQuartzComposition is the opposite of ofxQCPlugin. ofxQCPlugin allows you to build your openframeworks application as a QCPlugin to run inside QC.
ofxQuartzComposition allows you to run and control your Quartz Composition (.qtz) inside an openframeworks application.
Adobe has recently demoed a tool to convert a Flash file to HTML5.
Quoting Mashable:
(Flash) animations or interactions will now be viewable on all kinds of devices — including the iPhone and iPad. This is where we think that this HTML5 conversion tool has real possibilities. It’s one thing to be able to convert a movie or animation — that’s impressive — it’s the resources within those Flash files, however, that are potentially more useful.
Today I’ll be making a presentation at the Designing Interaction with Electronics course.
Theme: Prototyping in new media art projects.
Topics:
I will illustrate the topics with relevant situations from my own projects with Video Jack, particularly our most recent project AV Clash, its predecessor AVOL, and the early prototype iAVo Gamma.
I will also assist in the development of student projects made with Flash.
This week we will be studying Flash and the basics of ActionScript.
Step-by-step tutorials for all the exercises (with one exception), along with the respective source and final files, are available on this website I’ve recently completed: http://multimediadev.wordpress.com/
Themes and topics:
19/10 – Basic concepts and managing information flow
20/10 – Display objects and events
21/10 – Drawing by code, components and creating classes
Update – we did not go through exercise 1, although the tutorial text is still relevant. Instead, we did an exercise on drag and drop and collisions, fusing two of the tutorials.
22/10 – Loading external elements
This post was originally posted at the Software Studies for Media Designers blog:
http://softwarestudies.mlog.taik.fi/
From Vimeo:
In this installation YesYesNo teamed up with The Church, Inside Out Productions and Electric Canvas to turn the Auckland Ferry Building into an interactive playground. Our job was to create an installation that would go beyond merely projection on buildings and allow viewers to become performers, by taking their body movements and amplifying them 5 stories tall.
We used 3 different types of interaction – body interaction on the two stages, hand interaction above a light table, and phone interaction with the tracking of waving phones. There were 6 scenes, cycled every hour for the public.
From http://www.eyewriter.org/:
The EyeWriter project is an ongoing collaborative research effort to empower people who are suffering from ALS with creative technologies. It is a low-cost eye-tracking apparatus & custom software that allows graffiti writers and artists with paralysis resulting from Amyotrophic lateral sclerosis to draw using only their eyes.
Members of Free Art and Technology (FAT), OpenFrameworks, the Graffiti Research Lab, and The Ebeling Group communities have teamed up with a legendary LA graffiti writer, publisher and activist named TEMPTONE. Tempt1 was diagnosed with ALS in 2003, a disease which has left him almost completely physically paralyzed… except for his eyes. This international team is working together to create a low-cost, open source eye-tracking system that will allow ALS patients to draw using just their eyes. The long-term goal is to create a professional/social network of software developers, hardware hackers, urban projection artists and ALS patients from around the world who are using local materials and open source research to creatively connect and make eye art.