INFO(AT)OZANTEZVARAN(DOT)COM
Gestural A/V,
Interaction Design,
November 2024,
Using the ODD module from LZX Industries (huge thanks to Tina Frank!), I was able to isolate individual color channels, each of which has its own minijack output.
While these outputs are typically used to carry video signals, I connected them directly to an audio mixer. This setup enabled a direct translation from color to sound, revealing the immediate interaction between visual data and audio output.
An example of this process can be seen in Imagine This in a Big Room, where I primarily worked with the green and alpha channels.
Later, I expanded this exploration using a USB microscope. In TouchDesigner, I built sketches that separate the color channels of the microscope's output; by extracting values from the RGB channels, I was able to modulate three oscillators, achieving a fully digital color-to-audio transformation in real time.
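The digital mapping described above can be sketched as follows. This is a minimal illustration, not the actual TouchDesigner patch: the 110–880 Hz frequency range, the linear scaling, and the equal-weight mix are all assumptions made for the example.

```python
import math

def rgb_to_frequencies(r, g, b, f_min=110.0, f_max=880.0):
    """Map 8-bit RGB channel values (0-255) linearly onto three
    oscillator frequencies in Hz. The range is an arbitrary choice."""
    def scale(v):
        return f_min + (v / 255.0) * (f_max - f_min)
    return scale(r), scale(g), scale(b)

def render(freqs, duration=0.1, sample_rate=44100):
    """Mix three sine oscillators into one mono sample buffer,
    normalized so the mix stays within [-1, 1]."""
    n = int(duration * sample_rate)
    return [
        sum(math.sin(2 * math.pi * f * t / sample_rate) for f in freqs) / len(freqs)
        for t in range(n)
    ]

# A saturated orange pixel drives the three oscillators.
freqs = rgb_to_frequencies(255, 128, 0)
buffer = render(freqs)
```

In practice, each frame from the microscope would update the three frequencies, so the sound continuously follows the color under the lens.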
As a final step, I integrated MediaPipe to explore gestural control over the RGB data captured by the microscope. By linking hand movements to the modulation process, I aimed to introduce intentional changes in color. This allowed me to move beyond passively captured images and experiment with gesture-driven input. Since working with solid colors alone felt limiting, this test showed how gesture-based interaction could add a more dynamic and expressive layer to the color-to-audio transformation.
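One way to think about the gestural layer: MediaPipe's hand tracking returns landmark coordinates normalized to [0, 1], and those values can offset the captured color before it reaches the oscillators. The sketch below assumes a fingertip position as input; the specific mapping (x raises red, y raises blue) is a hypothetical choice for illustration, not the mapping used in the piece.

```python
def gesture_to_rgb(x, y, base=(128, 128, 128)):
    """Shift a base RGB color with a normalized fingertip position.

    x and y are in [0, 1], as MediaPipe hand landmarks are; the
    direction of each shift is an arbitrary example mapping.
    """
    r = min(255, int(base[0] + x * 127))  # horizontal motion pushes red up
    g = base[1]                           # green left untouched here
    b = min(255, int(base[2] + y * 127))  # vertical motion pushes blue up
    return r, g, b

# Moving the hand toward the top-right corner saturates red and blue.
shifted = gesture_to_rgb(1.0, 1.0)
```

The shifted color would then feed the same RGB-to-oscillator mapping, so a gesture changes color and sound at the same moment.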
Upper Screen: Desktop
Bottom Screen: Tangible World
Bottom Screen: Desktop