I REACT TO THINGS
OZAN TEZVARAN
INFO(AT)OZANTEZVARAN(DOT)COM


Gestural A/V,
Interaction Design,
November 2024


The idea behind this interaction was to examine direct translation from the visual to the audible. My goal was not to create representations of colors and shapes but to convert color directly into sound. This was achieved by treating video signals as electrical voltages and routing them into an audio mixer.

Using the ODD module from LZX Industries (huge thanks to Tina Frank!), I was able to isolate the individual color channels, each of which has its own minijack output.

While these outputs are typically used to send visual signals, I connected them directly to the audio mixer. This setup enabled a direct translation from color to sound, revealing the immediate interaction between visual data and audio output.

An example of this process can be seen in Imagine This in a Big Room, where I worked primarily with the green and alpha channels.

Later, I expanded this exploration using a USB microscope. I created sketches in TouchDesigner that separate the color channels of the microscope's output. By extracting values from the RGB channels, I was able to modulate three oscillators, resulting in a fully digital, real-time color-to-audio transformation, as sketched below.
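
As a rough illustration of this step, a mapping like this could be written as a TouchDesigner Execute DAT callback: one pixel is sampled from the microscope feed each frame, and its R, G, and B values drive the frequencies of three Audio Oscillator CHOPs. The operator names ('microscope', 'oscR', 'oscG', 'oscB') and the frequency range are illustrative assumptions, not taken from the actual patch.

```python
# TouchDesigner Execute DAT callback (a minimal sketch, not the
# original patch): sample the center pixel of the microscope TOP
# every frame and map each color channel to an oscillator frequency.

LOW_HZ, HIGH_HZ = 80.0, 800.0  # assumed audible range for the mapping

def onFrameEnd(frame):
    top = op('microscope')  # hypothetical name for the Video Device In TOP
    r, g, b = top.sample(u=0.5, v=0.5)[:3]  # normalized 0..1 color values
    for value, osc in zip((r, g, b), ('oscR', 'oscG', 'oscB')):
        # linear map: a brighter channel raises the pitch of its oscillator
        op(osc).par.frequency = LOW_HZ + value * (HIGH_HZ - LOW_HZ)
    return
```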

As a final step, I integrated MediaPipe to explore gestural control over the RGB data captured by the microscope. By linking hand movements to the modulation process, I aimed to introduce intentional changes in color. This allowed me to move beyond passively captured images and experiment with gesture-driven input. Working with solid colors alone felt limiting, so this test helped reveal how gesture-based interaction could add a more dynamic and expressive layer to the color-to-audio transformation.
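
The gestural layer could be prototyped along these lines: a minimal MediaPipe hand-tracking loop that derives a normalized control value from the index fingertip height, which would then scale the RGB-to-oscillator mapping above. The camera index and the choice of fingertip height as the control source are assumptions for illustration, not the original code.

```python
# Minimal MediaPipe sketch (assumed setup): track one hand and turn
# the index fingertip height into a 0..1 control value for the
# color-to-audio modulation.

import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)  # camera index depends on the setup

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        tip = results.multi_hand_landmarks[0].landmark[8]  # index fingertip
        control = 1.0 - tip.y  # image y grows downward; hand up -> value up
        print(f"gesture control: {control:.2f}")  # would scale the RGB mapping

cap.release()
```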

Sound input is converted into control voltage (CV) using the LZX Vidiot, a patchable analog video instrument. This CV signal is patched to control shapes and colors. The green and blue channels of the video output are connected to a sound mixer to produce sound based on the shapes. What you hear is what you see.

RGB channels captured by the microscope are used to modulate oscillators.
Upper Screen: Desktop
Bottom Screen: Tangible World


The RGB data scanned by the microscope is controlled through gestures tracked with MediaPipe.
Upper Screen: Tangible World
Bottom Screen: Desktop