Urban alphabets app – update from a work in progress

I’m fascinated by the city’s visual appearance: how different are streets and areas in their graphical look?
Letters are ubiquitous in our environment. But the form, color, size and kind of letters differ greatly depending on whether one is in a shopping mall, a historical city center or on a highway. Only when we spend time with pre-school children do we become aware that not everyone can read these letters. These children actually see letters for what they are: shapes. As soon as they learn to read, seeing only shapes, colors, sizes and materials becomes impossible. The “Urban Alphabets” project aims to bring these aspects back into our perception.

As post-production has always taken too much work to make more urban alphabets, I started developing an iPhone app to solve this problem during the “multitouch interaction” course in December 2012. It lets you create urban alphabets on the go (using the camera), but also allows using pictures from the photo library. You can also write your own text, which is then rendered using the letters you have added to the alphabet.
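
The core idea — rendering typed text with photographed letters — can be sketched roughly like this (a minimal Python illustration, assuming letter crops are stored one image per character; the file layout and names are made up, nothing here comes from the actual app):

```python
from PIL import Image

def render_text(text, letter_images, letter_size=(64, 96)):
    """Compose a word image from photographed letter crops.

    letter_images: dict mapping a character to a PIL.Image letter crop.
    Characters without a photographed letter are left as blank space.
    """
    width = letter_size[0] * len(text)
    canvas = Image.new("RGB", (width, letter_size[1]), "white")
    for i, ch in enumerate(text.lower()):
        crop = letter_images.get(ch)
        if crop is not None:
            canvas.paste(crop.resize(letter_size), (i * letter_size[0], 0))
    return canvas

# Hypothetical usage: letters photographed earlier, one file per character.
letters = {ch: Image.open(f"alphabet/{ch}.png") for ch in "helo"}
render_text("hello", letters).save("hello.png")
```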

As the first presentation showed, there are many possibilities for using the app outside the urban context as well. Letters are everywhere.

As it is so much fun to play with the app, I will definitely develop it further. One idea is to geotag all images taken and to use only letters from a certain surrounding area when tweeting, sending text messages, …
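
That geotagging idea could work roughly like this — a hedged sketch, not from the app: a haversine filter that keeps only letters photographed within a given radius of the user’s current position.

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def letters_nearby(letters, here, radius_km=1.0):
    """Keep letter photos geotagged within radius_km of the current location.

    letters: iterable of dicts like {"char": "a", "lat": ..., "lon": ...}
    (an invented record format for illustration).
    """
    return [l for l in letters
            if distance_km(here[0], here[1], l["lat"], l["lon"]) <= radius_km]
```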


Demo Day Winter 2012 photos


Two Projects: Witchcrafter and the V Performer

Two projects by Krisjanis Rijnieks will be presented during the Media Lab Winter Demo Day 2012 on December 20. The projects were made during the Physical Interaction Design course by Matti Niinimäki and the Multitouch Interaction course by Nuno Correia.

Both projects were made to explore the Cinder creative coding framework and to test its possibilities for wireless and serial communication. Witchcrafter explores the capabilities of the XBee wireless device, and the V Performer explores multitouch on iOS. Both are controller prototypes that translate human actions into visuals.

Witchcrafter

Witchcrafter is a 3D user interface controller. It was designed to look like a broomstick, but could be used in many other contexts. It is a stick with sensors: two accelerometers and a gyroscope. The sensors are connected to an Arduino board that transmits the sensor measurements wirelessly to a computer over an XBee radio link as JSON-formatted values.
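
The receiving end of such a setup fits in a few lines of Python — an illustrative host-side reader, assuming the XBee adapter shows up as a serial port and the Arduino sends one JSON object per line; the port name, baud rate and field names are assumptions:

```python
import json
import serial  # pyserial

# Assumed port and baud rate; adjust for your XBee adapter.
port = serial.Serial("/dev/ttyUSB0", 57600, timeout=1)

while True:
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        continue
    try:
        reading = json.loads(line)  # e.g. {"accel1": [...], "gyro": [...]}
    except json.JSONDecodeError:
        continue  # skip partial lines at startup
    print(reading)  # hand off to the visuals instead of printing
```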

V Performer

V Performer is a generative live-visuals solution for iOS and Mac. It consists of two applications (one for iOS, one for Mac OS) that share the same drawing algorithm. With the iOS controller application you can control the host application on a Mac over a wireless network using the OSC message format.
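
Sending such OSC messages is straightforward; here is a minimal Python stand-in for the controller side, using the python-osc library. The host address, port and OSC address pattern are invented for illustration — the actual app’s message layout may differ.

```python
from pythonosc.udp_client import SimpleUDPClient

# Assumed host address and port of the Mac application.
client = SimpleUDPClient("192.168.1.10", 9000)

# A touch position, normalized to 0..1, steering the drawing algorithm.
client.send_message("/vperformer/touch", [0.42, 0.77])
```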

Everyone is very welcome!


From Forrest: Good news for my project, just made public

Art meets the open web: announcing the Mozilla Eyebeam Open(Art) Fellows

https://blog.mozilla.org/blog/2012/12/05/openart/

Dec 5, 2012

Stefan Hechenberger and Addie Wagenknecht, Toby Schachman and Forrest Oliphant

Today, Mozilla and the Eyebeam Art + Technology Center are pleased to announce the recipients of the first-ever Open(Art) Fellowships. Together, these creative technologists will be exploring the frontier of art and the open web as part of our new Open(Art) program.

Pushing the boundaries of creative code

Supported in part by an award from the National Endowment for the Arts, the Open(Art) initiative is all about supporting projects that facilitate artistic expression and learning on the open web, using code to enable cutting-edge art, media and hardware production.

Over the next six months, the fellows will create open source tools and works that enable creative production and open participation. They’ll document their progress online, seek to grow communities of artists, developers and users around their projects, and publish their resulting code under an open license.

And the fellows are…

The 2013 Open(Art) fellows are:

Forrest Oliphant: Meemoo


Meemoo brings the power of app development to everyone. It’s an HTML5 data flow programming environment with an emphasis on realtime audio-visual manipulation. Using an intuitive visual interface that lets users connect modules together using colorful “wires,” Meemoo lets anyone remix and build their own creative apps right in the browser.

“I often see kids playing with touch screen apps that only do what the developer designs it to do,” Forrest says. “I want to blur that line between developer and user, and allow more people to create different kinds of media.”


Kinected Stories: Little Red Riding Hood meets a hungry wolf

Kinected Stories is an interactive fairy tale played with the Kinect controller. There are some obstacles in the story, and you need to help Little Red Riding Hood see better and “look further” to get a clue on how to progress. It is an adaptation of the classic Grimm story. The project was done in a company called Delicode. It took the whole summer of 2012, and we were a team of four: a project manager, a programmer/animator, a sound designer and a UI/storyteller/graphic designer.

We also made a little book to test how a printed book and an interactive story can support each other and enrich the experience. There is even a secret Little Red Riding Hood disco embedded as a QR code in the book.

After the demo was finished, we participated in some competitions: Mindtrek Launchpad (second place), Think Ink (second place) and Game Connection in Paris (finalist for “best game project”). We also gave presentations at SyysGraph, at SLUSH and at the Blender Conference in Amsterdam. Even though there has been some success, the future is still open. Come and check us out at www.kinectedstories.com

And here’s a little promotional video:
Kinected Stories promo video


Needlepoint Robo

Needlepoint Robo is my embroidery project for the e-embroidery workshop organized by Pixelversity, held at the Pixelversity office at the Helsinki Cable Factory on 17.–18.3.2012. The workshop explored the crossovers of traditional embroidery and electronics, combining crafting traditions with an open design and DIWO (Do It With Others) attitude.

As the name tells, my embroidery technique of choice was needlepoint: pixel-like stitching on a double mesh canvas with thick wool tapestry yarn. I tried to find an idea where both the LED lights and the conductive-thread circuits were an essential part of the design and not just attached on top of a random image. And here is the result, a small battery-powered robot with green LED eyes and a red LED heart – kind of a love child of E.T. and the Terminator:

Robo off

Robo with eyes on

Robo with eyes and heart on

Robo from the back

More information about the workshop and the process of creating Needlepoint Robo can be found on my website: http://ageingyoungrebel.fi/2012/05/needlepoint-robo/. I am currently working on a pattern and instructions for the Robo; they will be available on my website soon.


Machine Head – Experimental Drum Machine

Here’s a video about my project Machine Head for Derek Holzer’s workshop “Building interfaces for audiovisual performance”. It is an experimental rhythm machine that uses two solenoids to create mechanical grooves.

The solenoids and LED lights are controlled by an Arduino program that creates simple, varying rhythm sequences. The randomization and tempo parameters are controlled by a single potentiometer knob. Machine Head has four mono jack outputs: a “kick drum” microphone, two headphone microphones (picking up the electromagnetic interference from the solenoids) and one contact microphone placed inside the head (picking up the hits and the resonance of the glass). You can also use Machine Head as a purely visual device.
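
The sequencing idea can be illustrated with a small Python simulation — not the actual Arduino firmware, just a sketch of a step sequencer where a single knob value raises both the tempo and the chance that a step gets randomized, as the paragraph above describes:

```python
import random
import time

def run_sequencer(knob, bars=4):
    """Simulate Machine Head's solenoid triggering.

    knob: 0.0-1.0, standing in for the single potentiometer that
    controls both tempo and randomization.
    """
    groove = [1, 0, 0, 1, 0, 1, 0, 0]          # fixed base pattern
    step_seconds = 0.5 - 0.4 * knob            # higher knob = faster tempo
    for _ in range(bars):
        for hit in groove:
            if random.random() < 0.5 * knob:   # knob also adds randomness
                hit = 1 - hit                  # flip this step
            print("BANG" if hit else ".", end=" ", flush=True)
            time.sleep(step_seconds)
        print()

run_sequencer(knob=0.3)   # hypothetical knob reading
```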


Meemoo hackable web apps

A project that started in the Interactive Cinema class as a web video remixer has, over the past year, grown into my thesis project. Meemoo is a modular, flow-based visual programming environment that runs in modern web browsers.

Meemoo is looking for all kinds of collaboration: app and module ideas, module/framework design, UX and community design. Talk to me.

When you think of an “app,” do you think of something that you can open, hack, and change how it works? Meemoo wants to give you this freedom: if you can’t open it, you don’t own it. Meemoo is a framework that connects open-source modules powered by any web technology. The way that data flows from module to module is defined and visualized by colorful wires. If you can connect a video player to a TV, you can program a Meemoo app.
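
Stripped of the visuals, the flow-based idea — modules with inputs and outputs, connected by wires that carry data — reduces to something like this tiny Python toy model (nothing from Meemoo’s actual codebase, just the concept):

```python
class Module:
    """A node with an update function and outgoing wires."""
    def __init__(self, fn):
        self.fn = fn
        self.wires = []          # (target_module, input_name) pairs

    def connect(self, target, input_name):
        """Wire this module's output to a named input of another module."""
        self.wires.append((target, input_name))

    def send(self, **inputs):
        """Run the function and push the result down every wire."""
        value = self.fn(**inputs)
        for target, name in self.wires:
            target.send(**{name: value})

# A two-module "app": a number source wired into a doubler that prints.
source = Module(lambda x: x)
doubler = Module(lambda x: print(2 * x))
source.connect(doubler, "x")
source.send(x=21)   # prints 42
```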

Project page: http://meemoo.org/

One use: Meemoo Live Animation

Meemoo screenshot

Demo: http://meemoo.org/iframework/

Support for Meemoo comes from Media Lab Helsinki Learning Environments research group and Mozilla WebFWD.


Wave-o-Matic: Hand gesture based MIDI controller

Here’s a project I worked on during Matti Niinimäki’s Physical Interaction Design course.

Wave-o-Matic is a stand-alone MIDI controller built around a single Arduino Mega board and sensors. The device has five ultrasonic sensors that detect the user’s hand movements and gestures. Each ultrasonic sensor searches for objects within its reach, calculates the object’s distance from the sensor and transmits this data to the Arduino board. The data is converted to MIDI notes and control change (CC) values and sent to the MIDI OUT port of the device.

The detection field of each ultrasonic sensor points up towards the ceiling (in the default case where the device is standing upright and not tilted). The field ranges from a few centimeters to three meters, but for more convenient use the maximum range is limited to approximately one meter.

Each ultrasonic sensor has two modes (MIDI note mode and MIDI control value mode), selectable with a toggle button below the sensor. A bright LED below the sensor indicates that it is in MIDI note mode. A single LCD text display on the device shows which note or value each sensor is currently sending.

MIDI note mode

This mode allows the user to play MIDI notes by moving an object (a hand) into the detection field of the ultrasonic sensor. The pitch of the MIDI note varies with the distance of your hand from the sensor: a shorter distance produces a lower note and a longer distance a higher one. The scale is natural by default, but you can also select a mode (for example, raga or pentatonic minor) and play only notes that belong to that mode. You can also set a root key for the mode.

MIDI control value mode

This mode makes the sensor send MIDI CC data values (0–127), depending on your hand’s distance from the sensor. When you remove your hand from the detection field, the last value is held.
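
Both mappings can be sketched in Python with the mido MIDI library — an illustration of the logic described above, not the device’s firmware; the scale, ranges and CC number are example choices:

```python
import mido

MAX_CM = 100                    # usable sensor range, as limited above
PENTATONIC = [0, 3, 5, 7, 10]   # pentatonic minor scale intervals

def distance_to_note(cm, root=48):
    """Note mode: nearer hand = lower pitch, quantized to a scale."""
    step = int(cm / MAX_CM * 15)              # 15 scale steps over the range
    octave, degree = divmod(step, len(PENTATONIC))
    return root + 12 * octave + PENTATONIC[degree]

def distance_to_cc(cm):
    """CC mode: map distance linearly to 0-127."""
    return max(0, min(127, int(cm / MAX_CM * 127)))

out = mido.open_output()  # first available MIDI output (backend required)
out.send(mido.Message("note_on", note=distance_to_note(40), velocity=100))
out.send(mido.Message("control_change", control=1, value=distance_to_cc(40)))
```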

It was played live for the first time at the TIFF party in Tromsø, Norway, in January. A video of that performance is coming soon. I am developing the project further, and next I will take it with me to SXSW in Austin, Texas.

More stuff @ lab.kitkaliitto.com


Aerobical mechanic & organic orchestra

Here is a link to the video.

This music intervention is based on interactive electronics as playable instruments that take input from the body to amplify its sonic expression. Based on the concept of gymnastics, sounds are triggered according to physical activities that modify the biological pulse. The result is an open dialogue between digital and analog, physical and mental, heaven and hell: worlds collide and the 8-bit doors of perception are opened, closed, opened, closed, closed, closed, opened, opened.

The interaction is based on the following units:

- Heart-rate detection by PPG or ECG.
- An accelerometer embedded in a weight-lifting object.
- A pressure sensor positioned under a mattress.
- A control box with different sensors, buttons and potentiometers to add some interesting sounds.
- A Kinect as a visual sensor, intended to be used as a live instrument that drives synthesized melodies by calculating the angles of users moving around the space, even dancing some.

The heart rate is used to control a solenoid that hits a real drum. This gives the beat for the performance, controlled unconsciously by your body. To change the speed of the heartbeat, you need to do some aerobic work on stage: weight lifting, running around the place or some squats. The control box will include an unpredictable number of sensors and control components that can be used to choose the scale and the effects (delay, sampler and grain synthesizer), and maybe to play a melody as well.
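
The heart-rate-to-drum mapping boils down to triggering the solenoid once per beat interval. A rough Python simulation of that idea (the sensor source and hardware call are hypothetical placeholders, not the performance code):

```python
import time

def trigger_solenoid():
    """Placeholder for the real hardware call that hits the drum."""
    print("THUMP")

def heartbeat_drum(bpm_source, beats=8):
    """Hit the drum solenoid once per detected heartbeat.

    bpm_source: a callable returning the current heart rate in BPM,
    standing in for the PPG/ECG sensor.
    """
    for _ in range(beats):
        trigger_solenoid()
        time.sleep(60.0 / bpm_source())   # wait one beat interval

heartbeat_drum(lambda: 90)   # a resting pulse; squats on stage would raise it
```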

Juan Duarte, Valtteri Wikström

