It is fun to fly kites in Helsinki, since there is a good wind current during most of the year. For this course we decided to add value to kites with electronic features, in other words, to produce a sonification of kite movement. The kite thus works as an interface that generates data to produce sound in real time: an instrument for musical or sound art performance which is controlled manually and modified randomly by wind conditions.
The interface has a wireless system that measures the speed and rotation of the kite, and this data is transformed into sound in real time. The sound modules are designed on the principles of composition with pulsars developed by Curtis Roads and the microsound studies of Alberto de Campo. For future development we are considering a dynamic LED light system and increasing the number of simultaneous kites. Here are some pictures of the process of creating the kite and the electronic devices used for the project.
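The idea of pulsar synthesis driven by kite data can be sketched in plain Python. This is only an illustration, not the project's actual sound modules: the mapping constants (speed to fundamental, rotation to duty cycle) and the function name are hypothetical.

```python
import math

def pulsar_train(speed_hz, rotation, sr=44100, dur=0.5):
    """Generate a pulsar train in the spirit of Roads' pulsar synthesis:
    a short windowed sine 'pulsaret' repeated at a fundamental rate,
    with silence filling the rest of each period.
    speed_hz -> fundamental rate; rotation -> duty cycle (both mappings
    are hypothetical stand-ins for the kite sensor data)."""
    fundamental = 20 + speed_hz * 10           # pulsars per second
    duty = min(max(rotation, 0.05), 0.95)      # sounding fraction of the period
    period = int(sr / fundamental)             # samples per pulsar period
    pulsaret_len = int(period * duty)
    out = []
    for n in range(int(sr * dur)):
        phase = n % period
        if phase < pulsaret_len:
            t = phase / pulsaret_len           # one sine cycle, raised-cosine windowed
            window = 0.5 * (1 - math.cos(2 * math.pi * t))
            out.append(window * math.sin(2 * math.pi * t))
        else:
            out.append(0.0)                    # silent tail of the period
    return out

sig = pulsar_train(speed_hz=5.0, rotation=0.3)
```

Faster kite movement raises the pulsar rate, while rotation widens or narrows each pulsaret, so the wind directly reshapes the grain of the sound.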
You can view more test videos and documentation of Pulsar kites here:
Finally, this project was presented at Voda Helsinki, a visual arts, literature and music event in Kalasataman Aukio, on Sunday 19th August 2012.
This is a project update for last year's Designing Interactions course; it could also be seen as the final report for the course, even if the project itself is still a work in progress. The project is called “Pingispöytä” and it is made by Pasi Rauhala and Niklas Kullström from the Photography department.
So what is “Pingispöytä”? The concept started from the idea of making an electronically controlled “mechanical” ball bouncer. Several different approaches arose in the beginning, with different alternatives for propulsion and ball types. In the end we decided to use solenoids and ping pong balls. The idea is that a solenoid bounces a ping pong ball into the air at a specific, defined time.
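The timing side of this idea can be sketched as a simple step sequencer that turns a pattern into solenoid firing events. This is a hypothetical sketch of the scheduling logic only; the function name, step length, and pattern format are assumptions, and actually driving a solenoid would need a microcontroller and driver circuit.

```python
def bounce_schedule(pattern, bpm=120):
    """Convert a step pattern (one row per solenoid, 1 = fire, 0 = rest)
    into a time-sorted list of (time_in_seconds, solenoid_index) events."""
    step = 60.0 / bpm / 2                      # eighth-note steps (assumption)
    events = []
    for solenoid, row in enumerate(pattern):
        for i, hit in enumerate(row):
            if hit:
                events.append((round(i * step, 3), solenoid))
    return sorted(events)

# two solenoids bouncing balls on alternating steps
events = bounce_schedule([[1, 0, 1, 0], [0, 1, 0, 1]])
```

A controller loop would then sleep until each event time and pulse the corresponding solenoid.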
The fretboard is based on a linear potentiometer, so you get the exact position of where you press, and it works amazingly robustly. The strings are laser–phototransistor pairs. As for sound output, we are currently using a soundfont, and the sensor data is randomly mapped to soundfont notes, so the mapping is again not clear. Photos and video are available.
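The position-to-note step can be sketched as follows, assuming a 10-bit ADC reading from the potentiometer; the fret count, open-string note, and function name are illustrative, not taken from the actual patch.

```python
def fret_from_pot(reading, n_frets=12, adc_max=1023):
    """Map a linear-potentiometer ADC reading (0..adc_max) to a fret number,
    then to a MIDI note, assuming equal-width fret zones along the strip."""
    fret = min(reading * n_frets // adc_max, n_frets - 1)
    open_string = 52                # hypothetical open-string MIDI note (E3)
    return open_string + fret

note = fret_from_pot(512)           # a press near the middle of the strip
```

A deterministic mapping like this, rather than the random one, would make the fret position audible in an obvious way.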
The drum has a circular body and a drum skin with two piezos attached to it. The two piezos are linked to two drum notes. The tangible interface is there, and the technology is there, but the mapping is not so exciting yet: two notes don't sound nice. It needs some work.
Our reed flute (ney) is on its way. Photoresistors and a small mic are attached to a backbone, which will then be inserted into a hollow tube.
The mic sends an audio signal through the line-in of the computer, and it is processed in PureData. The audio is filtered to keep only the frequencies that correspond to blowing, so it hopefully will not respond when you talk, only when you blow. The amplitude of the blow is also processed: the harder you blow, the louder the sound. This is working smoothly.
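The filter-then-follow idea can be sketched in Python; the real processing happens in PureData, and the filter coefficient and threshold below are hypothetical. A crude high-pass favours the broadband noise of blowing, and a peak envelope follower extracts its level.

```python
def blow_amplitude(samples, smooth=0.995, threshold=0.05):
    """Crude breath detector: high-pass the signal to favour broadband
    blowing noise, then track its peak envelope with a slow decay.
    Returns the peak envelope if it crosses the threshold, else 0.0."""
    prev_in, prev_out, env, peak = 0.0, 0.0, 0.0, 0.0
    for x in samples:
        hp = 0.95 * (prev_out + x - prev_in)   # one-pole high-pass filter
        prev_in, prev_out = x, hp
        env = max(abs(hp), env * smooth)       # peak envelope with decay
        peak = max(peak, env)
    return peak if peak > threshold else 0.0

import random
random.seed(0)
breath = [random.uniform(-0.8, 0.8) for _ in range(2000)]  # stand-in for breath noise
level = blow_amplitude(breath)
quiet = blow_amplitude([0.0] * 1000)           # silence stays below the threshold
```

The returned level can then scale the output gain, so blowing harder makes the instrument louder.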
The photoresistors send analog data, which is read via Arduino and sent to PureData. This is working great and is quite responsive.
The problem now is the mapping strategy: how can we map these processed inputs to the “song”? We were using a soundfont, but I guess we will map the inputs directly to loops and control the loops with the photoresistors. The mapping is still unclear.
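One possible shape for that loop mapping, sketched in Python: each photoresistor reading scales the gain of one loop, so covering a sensor silences its loop. The ADC range, gate value, and function name are assumptions, not the course patch.

```python
def loop_gains(readings, adc_max=1023, gate=0.1):
    """Map each photoresistor reading (0..adc_max) to the playback gain of
    one loop: low light mutes the loop, full light plays it at full volume.
    Gains below the gate snap to zero so loops don't rumble at barely
    audible levels."""
    gains = []
    for r in readings:
        g = max(0, min(r, adc_max)) / adc_max   # normalise to 0.0..1.0
        gains.append(g if g >= gate else 0.0)
    return gains

gains = loop_gains([1023, 60, 512])   # one loop full, one muted, one at half
```

Because each sensor owns one loop, the player can "mix" the song by shading the holes with their fingers, which keeps the mapping immediately legible.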