Paja


Rapid prototyping for music, art and design work

HugiPet

Our pet

Monday, November 12th, 2007

HugiPets are a way to connect people. HugiPets are connected via a network, and if you hug yours, your friend's HugiPet reacts. We use an Air v1.1 (air sensor) to measure pressure and send the signal to the other HugiPet over the network. Hugging lights the other HugiPet's LED and shows how hard you are pressing.

Workgroup: Elise Liikala, Mikko Toivonen, Kimmo Karvinen and Juho Jouhtimäki.

November 13th, 2007 at 4:20 pm

Pigs talk. The HugiPet project went on today when we operated on the pets we had bought. It was time for surgery. There was a sound when squeezing the warthog, and we wanted to know how it was done. We found a little bag filled with air and a pipe: a very simple mechanism using air pressure. The other pig will later be filled with some kind of pressure detector too, plus the sensor. At this point we are not sure whether we will also use the warthog's original sound mechanism in some way.

November 14th, 2007 at 11:18 pm

Today the HugiPet project experimented with sensors. What we found out was that the kind of sensor we would have needed was not available in our course material, so the idea of hugging a pet with an air sensor was changed to a motion-based sensor. By the end of the day we had created a brilliant new idea.

The idea: to make a toy which sends information to a computer and to another pet when moved. To find out the more specific use of this toy, you will have to wait until Friday. So exciting…

November 15th, 2007

The pig animation was drawn and created today. Wires were also soldered and the code tested. Piggie's tummy got new filling. It was time for testing.

Pig instructions

November 16th, 2007

The coding of HugiPet still needed some effort today. Juho coped with it right on time before our presentation. Our non-sonic baby alarm was up and functioning!

This is how the HugiPet works: the child has a toy next to her/him in bed which recognizes movement on the x- and y-axes. When the child wakes up and starts to move the toy, signals are sent to the computer. On the computer there is an icon showing a simple animation of a sleeping piggie. When the child wakes up, the icon changes visually and the piggie wakes up (and makes an oink-oink sound). If the toy does not move for a while, the piggie falls asleep again.

This application works better than a traditional baby-crying alarm because:
a) a parent may be working, for example, in a place with a lot of other sounds, so the crying might not be heard
b) it gives an opportunity to listen to music, for example, while working on one's laptop as the child sleeps on the balcony
c) it can be a very helpful device for deaf people who cannot hear their baby crying

The next phase would be to make another toy or object which reacts to the movement of the piggie. In that case you wouldn't have to be looking at the computer all the time.
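The wake/sleep logic described above could be sketched roughly like this in C++. This is only a sketch: the movement threshold, the quiet-time timeout and all names are my assumptions, not the project's actual code.

```cpp
#include <cstdlib>

// State of the piggie animation on the computer.
enum class PigState { Sleeping, Awake };

// Sketch of the wake/sleep rule: the pig wakes when an x/y motion
// sample exceeds a threshold, and falls asleep again after a quiet
// period. Threshold and timeout values are invented here.
class HugiPetMonitor {
public:
    HugiPetMonitor(int threshold, long quietMs)
        : threshold_(threshold), quietMs_(quietMs) {}

    // Feed one accelerometer sample (x/y deltas) with a timestamp in ms.
    PigState update(int dx, int dy, long nowMs) {
        if (std::abs(dx) > threshold_ || std::abs(dy) > threshold_) {
            state_ = PigState::Awake;    // movement: wake the piggie (oink!)
            lastMoveMs_ = nowMs;
        } else if (state_ == PigState::Awake &&
                   nowMs - lastMoveMs_ >= quietMs_) {
            state_ = PigState::Sleeping; // quiet long enough: back to sleep
        }
        return state_;
    }

private:
    int threshold_;
    long quietMs_;
    long lastMoveMs_ = 0;
    PigState state_ = PigState::Sleeping;
};
```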

Digital painting control

Project and copyright by: Kimmo Karvinen, Juho Jouhtimäki and Mikko Toivonen.

The idea is to enhance a digital painting frame (a laptop framed and turned into a wall-mounted slideshow) with a feature that lets the user swap images by moving a hand left or right in front of the frame.

We use two Ultrasonic Sensors to detect which way the user's hand is moving. The actual slideshow will be made with Flash.

Edit: We didn't get Flash to work with serial data, so the project now runs in Director with embedded Flash.
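The direction detection could work roughly like this: the sensor that sees the hand first tells where the sweep started. A hedged C++ sketch, not the project's Director/Flash code; the sampling scheme and all names are assumptions.

```cpp
#include <utility>
#include <vector>

// Which way the hand swept in front of the frame.
enum class Swipe { None, LeftToRight, RightToLeft };

// Input: one (leftSeesHand, rightSeesHand) pair per sensor poll, in
// time order. The first sensor to detect the hand gives the direction.
Swipe detectSwipe(const std::vector<std::pair<bool, bool>>& samples) {
    int first = 0; // 0 = none yet, 1 = left first, 2 = right first
    bool leftSeen = false, rightSeen = false;
    for (const auto& s : samples) {
        if (s.first && !first) first = 1;
        if (s.second && !first) first = 2;
        leftSeen |= s.first;
        rightSeen |= s.second;
    }
    if (leftSeen && rightSeen)
        return first == 1 ? Swipe::LeftToRight : Swipe::RightToLeft;
    return Swipe::None; // hand never crossed both sensors
}
```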

More info: kimmo.karvinen@taik.fi

Prototype 1

Furry Modulator

…………………………

-Playing on air

A Spatial Instrument
-a variation of the famous ‘Theremin’ by the Russian inventor Léon Theremin (1919)

Inputs
-distance
-sectors in the visual field
-ripple gestures

Processing
-movement through the sectors and steps builds up a sequence
-making a ripple gesture triggers a chain reaction of samples in the stack
-also a single sample can be triggered
-distance affects the volume and pitch of the ambient background tone
-a combination of distance and spatial grid selects a sample from the pool
-temporal effects on volume and filtering for the samples in a chain reaction

Outputs
– MIDI events into Reason (possibly)
– samples loaded inside PD
– stereo sound through loudspeakers

Interface
Bodily movement in space for a single person using 2 hands and ripple gestures controlling speed, pitch and the triggering of (stacked) sample sequences.
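The distance-to-tone mapping listed under Processing could be sketched like this in C++. A sketch under assumptions only: the ranges, the linear mapping and the names are mine, and the real output goes to PD/Reason as MIDI.

```cpp
#include <algorithm>

// Nearer hands give a louder, higher ambient tone (assumed mapping).
struct Tone { int midiNote; int velocity; };

Tone distanceToTone(double cm) {
    // Clamp the playable region to 10..100 cm (an invented range).
    double d = std::min(100.0, std::max(10.0, cm));
    double t = (100.0 - d) / 90.0;              // 0 = far .. 1 = near
    int note = 48 + static_cast<int>(t * 24);   // C3 far .. C5 near
    int vel  = 20 + static_cast<int>(t * 107);  // quiet far, loud near
    return {note, vel};
}
```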

Read the rest of this entry »

gimme sugar

Demo day 1

Idea: a gesture controlled sugar dispenser.

Depending on the angle at which an object shaped like a sugar shaker is tilted, a bowl placed elsewhere opens and dispenses sugar. The more you shake/tilt the object, the more sugar is dispensed. This could be a fun way to sweeten a beverage, or to represent how much attention, in the form of sugar, one gives to a significant other.
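The tilt-to-sugar mapping could look roughly like this. A hedged C++ sketch: the dead zone, servo range and names are invented values, not the project's.

```cpp
#include <algorithm>
#include <cmath>

// Map the shaker's tilt angle to how far the bowl's hatch servo opens.
// Small tilts dispense nothing (dead zone); full tilt opens fully.
int tiltToServoAngle(double tiltDeg) {
    const double deadZone = 15.0; // assumed: below this, no sugar
    const double maxTilt  = 90.0;
    double t = std::min(maxTilt, std::max(deadZone, std::abs(tiltDeg)));
    // Map 15..90 degrees of tilt to 0..90 degrees of hatch opening.
    return static_cast<int>((t - deadZone) / (maxTilt - deadZone) * 90.0);
}
```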

Read the rest of this entry »

Mood Shoes

A prototype by Markku Ruotsalainen, Jenna Sutela and David Szauder

Mood Shoes are sonic wearables for super walking. They enable a ubiquitous (mobile) way to experience the environment in a manner of one's choice. Wearing Mood Shoes, one can decide whether to walk on thin ice, on the beach, in a puddle, in snow, on the moon or on the foil of a big drum, regardless of one's actual location/environment. The sense of ground varies from grains of sand to splashes of water, expressed through sound samples in headphones.

The first step:

' {$STAMP BS2}
' {$PBASIC 2.5}
pLED PIN 12
pLED PIN 15
pSens PIN 0
' ====[ Variables ]====
wBuf VAR WORD
' ====[ Initialization ]====
DEBUG CLS, "start", CR
' ====[ Main ]====
Main:
DO
  PULSIN pSens, 1, wBuf
  DEBUG DEC wBuf, CR
  IF wBuf > 200 THEN
    PULSOUT pLED, 1
    PULSOUT pLED, 100
  ELSE
    PULSOUT pLED, 0
    PULSOUT pLED, 1000
  ENDIF
LOOP
END

Switching from BASIC Stamp to Arduino
Arduino and PureData
Assembling
An early adopter
Sketching

Project outline

Building

Equipment:
- Arduino board
- 2 touch sensors
- 1 light sensor
- Bluetooth
- Pure Data
- Sound samples
- Shoes, belt
- Headphones
- FM receiver and transmitter (see http://anarchy.k2.tku.ac.jp/)

Programming

Tasks:
1) Realizing a (Bluetooth) connection between the Arduino board and PureData (MIDI from BS to PD)
2) Making the touch and light sensors work with the Arduino and the shoes
3) Connecting the FM receiver, transmitter and headphones to the system

Interfacing

Tasks:
1) Collecting sound samples
2) Choosing atmosphere samples to work with the light sensor
3) Placing the touch sensors in the shoes and attaching the Arduino board to a belt

Try me!


The Sixth Sense

Project and copyright by Kimmo Karvinen and Mikko Toivonen

The Sixth Sense is a device that warns the user when someone or something approaches from behind (the range is set to 2.5 meters). It is attached to a belt so that the Ultrasonic Sensor points backwards and the vibration motor is near the skin.
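The core logic amounts to converting the sensor's echo time to centimetres and comparing it to the range. A hedged C++ sketch (function names are mine); the roughly 58 µs per cm round-trip figure is the usual one for ultrasonic range sensors such as the PING))).

```cpp
// Sound travels out and back, about 58 microseconds per centimetre.
int echoToCm(long echoMicros) {
    return static_cast<int>(echoMicros / 58);
}

// Switch the vibration motor on when something is within 2.5 m behind
// the user; a zero reading means no echo, so no warning.
bool shouldVibrate(long echoMicros) {
    int cm = echoToCm(echoMicros);
    return cm > 0 && cm <= 250;
}
```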

More info: kimmo.karvinen@taik.fi

Stalk2

Demo 1

Demo 2

Hands-on work started today

Alejandro

Alejandro works with an old keyboard to build a sensor.
Alejandro II first steps

Alejandro II experiments with Arduino and motors.

The sensor is a microphone…

Continuous Motion, Discrete Signal

The concept involves defining discrete regions of space between two 'pads' (preferably small patches that can be attached to the body). Data is controlled in intervals as the proximity between the two objects increases and decreases.

For instance, instead of the continuous stream of data from a source such as a Theremin, there would be defined regions of space which would trigger a discrete sequence of a defined (musical) scale.

The user would not be touching the pads but the signals would be triggered by the pads moving closer to one another or farther apart. One scenario is that a person would have one attached to their hand and one to the top of their foot – and there would be a range of intervals that would be triggered between the closeness of the hand and foot.

Working Plan / Technical Solutions

The proximity sensor, a PING))) Ultrasonic, communicates with the BASIC Stamp and Max/MSP via Arduino Bluetooth.

A program defines the distinct regions between the pads.
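The quantisation such a program would do could be sketched like this in C++. A sketch under assumptions: the scale, the distance range and the names are mine, and the real signal chain goes through the BASIC Stamp and Max/MSP.

```cpp
#include <vector>

// Cut the continuous pad-to-pad distance into equal intervals, each
// triggering one degree of a scale (MIDI note numbers). Readings
// outside the range clamp to the first or last degree.
int distanceToNote(double cm, double minCm, double maxCm,
                   const std::vector<int>& scale) {
    if (cm <= minCm) return scale.front();
    if (cm >= maxCm) return scale.back();
    double step = (maxCm - minCm) / scale.size();
    std::size_t idx = static_cast<std::size_t>((cm - minCm) / step);
    if (idx >= scale.size()) idx = scale.size() - 1;
    return scale[idx];
}
```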

Mat Synthesiser

A "foot controller" for audio synthesis: a sort of "track pad", sensitive to position and pressure, to be played with the feet.

Technical solutions: big pressure sensors (5 or 6) can be placed under a flat "mat" surface (20 cm × 20 cm) to give information in an axis system (x, y, z?) for Max/MSP or PD.
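One way to turn a handful of pressure sensors into an (x, y, z) triple is a pressure-weighted centroid: x and y come from where the pressure is, z from how much there is. A hedged C++ sketch; the sensor layout and scaling are assumptions, not the project's design.

```cpp
#include <cstddef>
#include <vector>

struct MatReading { double x, y, z; };

// px/py hold each sensor's position on the mat (cm), pressure the
// current reading of each sensor. The centroid of the positions,
// weighted by pressure, estimates where the foot presses.
MatReading readMat(const std::vector<double>& px,
                   const std::vector<double>& py,
                   const std::vector<double>& pressure) {
    double sum = 0, sx = 0, sy = 0;
    for (std::size_t i = 0; i < pressure.size(); ++i) {
        sum += pressure[i];
        sx  += pressure[i] * px[i];
        sy  += pressure[i] * py[i];
    }
    if (sum <= 0) return {0, 0, 0}; // nobody standing on the mat
    return {sx / sum, sy / sum, sum};
}
```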

Soup sounds

Preliminary plan:

In short, the idea is to use chemical reactions as control data. The key question here is: what kinds of parameters can one measure in a liquid? A few that have come to mind: salt level (conductivity?), temperature, etc. I'm sure there are others…

This control data would be mapped to generate and/or modify sound. The idea is to create a performative composition with little direct user intervention, although it might be necessary to work as a cook and mix in different ingredients during the performance and change the temperature of the substance.

Soundtrack of our life

The working title for my project is "Soundtrack of our life". The idea derives from one of my earlier works, a sound installation for an MP3 player, where the viewer/listener was offered an iPod set to play a huge number of various samples at random, changing the experience of the surroundings by changing the soundscape. This time I would like the sound to change in response to the listener's actions/movements.

Sensor input would change the sound. Ideally the user should be able to carry the set around.

Technical solutions are something about which I only have some vague ideas at the moment. I do think that I need a sound source, sensors and a microcontroller that communicates with them both.

Robotic Orchestra

As a project for the physical computing course I'd like to realize a "robotic orchestra". The basic idea is to control several motors that are connected to the computer via an Arduino board. The actions of these motors are controlled by a Pure Data patch. The motors serve as "real-world players": they can play and "bang" on all kinds of objects according to the control data they receive from the host software. Digital bits and bytes are transferred into real motions.
I've seen a couple of people using solenoid motors for similar purposes, so this might be one direction to start with.

Technical requirements:

1) Arduino board

2) motors (solenoid?)

3) Pure Data
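The control path from the PD patch to the motors could be sketched like this in C++. The serial message format, the pulse range and all names here are invented, not part of the plan above.

```cpp
// One "bang": which motor to fire and for how long to energize the
// solenoid. Louder hits get longer pulses (5..25 ms assumed safe).
struct MotorBang { int motor; int pulseMs; };

// Decode a two-byte control message (motor index, velocity 0..255)
// as a host patch might send it over the serial line.
MotorBang decodeMessage(unsigned char motorByte, unsigned char velocity) {
    int pulse = 5 + (velocity * 20) / 255;
    return {motorByte, pulse};
}
```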

I Can

The working title of my project is I Can. This might be regarded as the can equivalent of I, Claudius, in the sense that it has something to do with I-dentity. The idea is to make a musical instrument out of a beer can and discover what its potential for sonic performance might be (er). If a performance is forthcoming, it will surely have something to do with the metaphor of ease vs. effort, i.e. the I Can contrasting with the I Cannot. In any case, the can will have to have some life of its own, otherwise it will not be its own I.

I would like to put a wireless Arduino inside, along with a battery box, some sensors and a servo motor to make the beer can shake (rhythmically?) on command from the remote computer. What sensors CAN I put inside? Perhaps touch, velocity, orientation, a microswitch to detect whether the ring cap is in place or lifted.

What sounds will I control with it? Maybe Offenbach’s Can-Can. OK, as yet I have no idea.

The programming of the beer belly to go with the instrument is another long term project. More on that when I know how many beers I had to drink before I found the ideal can for the prototype.

Nickname: persona

Short description:

Wondering about how new ways of communication, enabled by present-day technology, affect the transformation and construction of human identity on the net, and how they can cause physical feeling and the perception of human energy to be lost or experienced in a new manner, the nickname: persona project is an interactive audiovisual installation. People are required to act to make the piece possible; the piece aims to make them wonder about the kind of relationship established between them and how they show their identity to others.
The work would consist of taking data from visitors’ bodies by sensors, which would manipulate the sound and taking snapshots of their faces to build a new face with all of them. Sound and image would be streamed to a website.

Development (working plan):

During the following three months:
1. Make some tests with sensors to experience the kind of data we can receive from the human body, and decide what kind of relation can be established between it and the sound of the installation.
2. Make some tests on the placement of the sensors and how the visitor must use them.
3. Decide the shape of the presentation of this part of the project, and also the output of the whole project.

Technical solutions:

Sensors (undefined)
Arduino
Max/MSP

Questions:

Are there some sensors that I could test before buying them? I'm interested in blood pressure, body temperature, sweat, heartbeat, the nervous system… any other ideas?

Image controller(Arduino+Director)

There are several sets of images. The value from the touch sensor decides which set of images plays in Director. With a potentiometer, the user can control the speed of the rolling images.

Code and video (.mov)

Ji Hyun Hong

Using an Arduino and a Memsic 2125 accelerometer with C++ and the Open Dynamics Engine

I made a simple prototype using a Memsic 2125 dual-axis accelerometer with C++ and an open-source physics engine called Open Dynamics Engine (ODE). The result resembles the experience you get while playing games with the motion-sensing controller of the Wii console (although I haven't tried that myself).
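The Memsic 2125 reports acceleration as a PWM signal, so reading it mostly means timing pulses. A small C++ conversion helper (the helper name is mine; the 50 % zero-g duty cycle and 12.5 % per g sensitivity are the datasheet figures):

```cpp
// t1Us is the measured high time of the PWM pulse, t2Us the full
// period, both in microseconds. 50 % duty = 0 g; each g shifts the
// duty cycle by 12.5 %.
double dutyToG(double t1Us, double t2Us) {
    return (t1Us / t2Us - 0.5) / 0.125;
}
```

With a nominal 10 ms period, a 6.25 ms high pulse therefore reads as 1 g on that axis.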

Throw some boxes with an accelerometer sensor and ODE

Read the rest of this entry »

Arduino and a light sensor

Doll house and cleanable floor
Arduino connected to a light sensor

See video:

This is an educational work where children (and adults) can learn how to clean a floor with a vacuum cleaner. Inside the vacuum cleaner is a light sensor connected to an Arduino microcontroller, which in turn is connected to Processing software. The light sensor recognizes white spots on the floor (shown on a computer display); a spot is cleaned if one is fast enough, and if not, the spot gets stuck.
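The game rule could be sketched like this. A hedged C++ sketch only: the ADC threshold and the time limit are invented, and the actual project runs in Processing.

```cpp
// The light sensor inside the vacuum cleaner sees a bright reading
// when it passes over a white spot on the display; the spot counts as
// cleaned only if it is reached before the deadline.
bool spotCleaned(int lightReading, long reachedAtMs, long deadlineMs) {
    const int whiteThreshold = 800; // assumed: 10-bit ADC, bright pixel
    return lightReading > whiteThreshold && reachedAtMs <= deadlineMs;
}
```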

Made by Meeri Mäkäräinen and Tuomas Laitinen

Source codes:


Simple quiz

The basic idea is to ask questions from the user: if the answer is right, a green LED lights up; if it is wrong, a red LED lights up. The program also counts the number of right answers. It is easy to extend the quiz with further questions, since right now there is only one.

Setup 1, Setup 2, Setup 3
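The quiz logic might be sketched like this in C++. The names and structure are mine; the original runs on a microcontroller driving the two LEDs.

```cpp
#include <string>

// Compare the user's answer to the right one, choose which LED to
// light, and keep a running count of correct answers.
struct Quiz {
    int correctCount = 0;

    // Returns true for "green LED", false for "red LED".
    bool answer(const std::string& given, const std::string& right) {
        if (given == right) {
            ++correctCount;
            return true;  // green LED on
        }
        return false;     // red LED on
    }
};
```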

Pii Paappanen

Love Match – Touchsensor and Flash

Lovematch

Description:
This program calculates the love potential between two people 🙂
Two touch sensors are connected to an Arduino. When they are touched, the running program measures the pressure-intensity values for 5 seconds. The maximum and minimum of the measured values are saved to variables. Each second during those 5 seconds, the program adds (max − min) to a total sum and resets max and min. After the 5th second, a function drawHeart is called, which draws a heart on the screen, calculating its size from the total sum we obtained. Got it? 🙂
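The measurement loop described above, as a hedged C++ sketch: each second the spread between the strongest and weakest squeeze is added to a running total, and the total sets the heart's size. The function names and the size scaling are mine; the original is split across Arduino, Processing and Flash.

```cpp
#include <algorithm>
#include <vector>

// One inner vector per second of pressure samples; the per-second
// (max - min) spreads are summed, mirroring the reset each second.
int loveTotal(const std::vector<std::vector<int>>& perSecondSamples) {
    int total = 0;
    for (const auto& second : perSecondSamples) {
        if (second.empty()) continue;
        int mx = *std::max_element(second.begin(), second.end());
        int mn = *std::min_element(second.begin(), second.end());
        total += mx - mn;
    }
    return total;
}

// Heart size in pixels, clamped to something drawable (assumed range).
int heartSize(int total) {
    return std::min(300, 20 + total / 4);
}
```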

Material/Software:
Arduino microcontroller & IDE
Processing IDE
Flash
2 x FlexiForce touch sensors

Code/Files:
Arduino/Processing/Flash files
(flash server connection and processing code by Aleksi Hyvönen and Viljo Malmberg)

Video Lovematch

Melanie Wendland, Jan Wolski
