Saturday, October 1

Interactive Cylindrical Sound Portal

In this paper, I describe an installation design that combines new media technology with a vintage video synthesizer. The installation follows Activity-Centered Design (ACD), which focuses not on the goals and preferences of individual users but on the behavior surrounding a particular system. The Interactive Cylindrical Sound Portal sets up both sonic and graphical interaction between machines and humans in a small enclosed area. The project is an experimental design that offers participants a new media experience of coupled sound and graphics. To construct this experimental system, I integrate a Rutt/Etra patch running in the Quartz Composer application to produce the visual channel. The visual processing is modeled on the Rutt/Etra analog video synthesizer, invented in 1972. In addition, a Pure Data Extended patch is responsible for producing an 8-bit sound.

For the video and audio processing, this project partly uses the Rutt/Etra Quartz Composer patch to respond to a specific area of the visual animation. Normally, this QC patch works with a web camera input channel to mimic the original Rutt/Etra video synthesizer; however, it is not used that way in the Interactive Cylindrical Sound Portal. In this project, the Rutt/Etra QC patch controls only a graphical line driven by the sound captured by a condenser microphone. A patch in Pure Data Extended is the tool that manipulates MIDI sound within the installation area.
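The Rutt/Etra effect works by deflecting the raster's horizontal scanlines vertically in proportion to pixel brightness, turning a video frame into a field of undulating lines. The sketch below is a hypothetical approximation of that idea in Python with NumPy, not the actual Quartz Composer patch; the function name and parameters (`step`, `gain`) are my own illustrative choices.

```python
import numpy as np

def rutt_etra_scanlines(frame, step=8, gain=40.0):
    """Displace horizontal scanlines vertically by pixel brightness,
    approximating the classic Rutt/Etra raster-deflection effect.

    frame: 2-D uint8 grayscale image.
    Returns a list of (width, 2) polylines of (x, y) points to draw.
    """
    h, w = frame.shape
    lines = []
    for y in range(0, h, step):
        row = frame[y].astype(float) / 255.0  # brightness, 0..1
        ys = y - row * gain                   # brighter pixels deflect upward
        lines.append(np.column_stack([np.arange(w), ys]))
    return lines

# Example: a synthetic 64x64 horizontal-gradient frame
frame = np.tile(np.linspace(0, 255, 64), (64, 1)).astype(np.uint8)
lines = rutt_etra_scanlines(frame)
print(len(lines))  # one polyline per sampled scanline
```

In the installation, an equivalent deflection is applied only to a single line modulated by the microphone signal, rather than to a full camera frame.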

A tone is produced whenever a participant enters the web camera's detection area. To generate pitch and amplitude, the web camera tracks gestures in the image by computing a signal called a "blob": a connected region of pixels sharing a property such as brightness, whose position and size the patch can follow from frame to frame. From this blob data, the Pure Data Extended patch evaluates the gesture's position, as calculated by the [pix_blob] object, and triggers a corresponding sound. The [osc~] and [dac~] objects then synthesize tones whose pitch and amplitude follow that position. A condenser microphone and a loudspeaker cooperate to create a looped sound channel and graphics channel as long as the participant keeps moving their body or hands. Once the participants have gone, the blob data return to a default value that depends on the ambient brightness around the experiment space. The patch shapes the output into an appealing 8-bit sound, so participants entering the installation area are drawn in by this charmingly retro digital timbre.
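The pipeline above can be sketched in two steps: find the blob's centroid in the camera frame, then map that position to an oscillator's frequency and amplitude. This is a minimal Python/NumPy illustration of the idea, assuming a simple brightness threshold; the function names, threshold, and frequency range are my own hypothetical choices, not values from the actual Pd patch.

```python
import numpy as np

def blob_centroid(frame, threshold=128):
    """Return the normalized (x, y) centroid of bright pixels,
    roughly analogous to what GEM's [pix_blob] reports."""
    mask = frame > threshold
    if not mask.any():
        return None                    # no participant detected
    ys, xs = np.nonzero(mask)
    return xs.mean() / frame.shape[1], ys.mean() / frame.shape[0]

def centroid_to_tone(cx, cy, f_lo=110.0, f_hi=880.0):
    """Map horizontal position to pitch (exponentially, so equal steps
    sound like equal intervals) and vertical position to amplitude,
    as [osc~] and [dac~] would render in the Pd patch."""
    freq = f_lo * (f_hi / f_lo) ** cx  # left edge -> f_lo, right edge -> f_hi
    amp = 1.0 - cy                     # higher gesture -> louder
    return freq, amp

# Example: a bright rectangular "gesture" in a 320x240 frame
frame = np.zeros((240, 320), dtype=np.uint8)
frame[100:140, 200:260] = 255
cx, cy = blob_centroid(frame)
freq, amp = centroid_to_tone(cx, cy)
```

When no blob is found (the participant has left), a real patch would fall back to the ambient-brightness default described above instead of silencing abruptly.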