Directed by Ronit Izraeli
INTERACTION LAB VISUAL COMMUNICATION UNIVERSITY OF HAIFA SCHOOL OF DESIGN
04_SHACHAR+LIR
Our project investigates the visual embodiment of sound, asking: What does a sound look like, and how does it feel? We explored how different individuals visually perceive auditory stimuli, seeking to establish a formal system that translates sound into shape. By analyzing the relationship between frequency and form, we developed a set of visual "rules"—a systematic language designed to bridge the gap between what we hear and what we see.
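A rule system like the one described, which assigns a visual form to each range of frequencies, can be sketched in a few lines of code. This is a minimal illustration only: the band edges and shape names below are invented for the example and are not the project's actual rules.

```python
# Illustrative frequency-to-shape rules: (min_hz, max_hz, shape).
# The band boundaries and shape labels are assumptions for this sketch.
SHAPE_RULES = [
    (20, 250, "large soft blob"),      # deep bass -> heavy, rounded form
    (250, 2000, "rounded wave"),       # mid range -> flowing contour
    (2000, 20000, "sharp spike"),      # high, jarring sounds -> hard edges
]

def shape_for(freq_hz):
    """Return the visual form the rule table assigns to a frequency,
    or None if the frequency falls outside the audible bands."""
    for lo, hi, shape in SHAPE_RULES:
        if lo <= freq_hz < hi:
            return shape
    return None
```

In this framing, the "systematic language" is simply a lookup from acoustic measurement to visual vocabulary, which keeps the translation consistent across every sound it is applied to.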




Series 1 – Resonance in 10 Frames
This project explores the sense of hearing through a "zoom-in/zoom-out" progression across 10 frames. Using synesthesia as our guide, we translated auditory frequencies into a visual narrative, contrasting sharp, jarring sounds with deep, resonant bass. The sequence illustrates the fluid transition between hearing and seeing, turning abstract sound waves into a tangible visual experience.






Aural Topographies: Participant Responses
We conducted a study where participants were asked to draw their immediate visual response to various sounds. These intuitive sketches capture how different individuals translate auditory stimuli into form and texture. This research allowed us to identify shared patterns and establish the visual "rules" for our project.


















Following our research, we designed two posters that explore the visual identity of sound. The first captures the raw, human perspective by layering sketches from our study participants. The second applies these findings to a digital system, translating technical audio data into precise visual forms.
Building on our research, we developed an interactive interface that translates sound into a celestial visual language. By establishing a "system of rules" based on planetary forms and cosmic space, the interface reacts to audio input in real-time through a zoom-in/zoom-out progression. This system transforms abstract sound waves into a navigable universe, where every frequency dictates the movement and scale of a unique cosmic body.
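The core of such an interface is a mapping from frequency to the scale and position of a body in the cosmic scene. The sketch below shows one plausible version of that mapping, using a logarithmic frequency axis (matching how pitch is perceived); every constant here is an illustrative assumption, not the project's actual parameters.

```python
import math

def frequency_to_body(freq_hz, min_freq=20.0, max_freq=20000.0):
    """Map an audio frequency to the scale and orbital radius of a
    hypothetical 'cosmic body'. Low frequencies yield large, near
    bodies; high frequencies yield small, distant ones. The mapping
    constants are assumptions for this sketch."""
    # Position the frequency on a 0..1 logarithmic scale.
    t = (math.log(freq_hz) - math.log(min_freq)) / (
        math.log(max_freq) - math.log(min_freq))
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range input
    scale = 1.0 - 0.9 * t          # deep bass -> largest body
    orbit_radius = 0.1 + 0.9 * t   # high pitch -> outermost orbit
    return {"scale": round(scale, 3), "orbit_radius": round(orbit_radius, 3)}
```

Driving this function with live spectral data frame by frame would produce the zoom-in/zoom-out progression described above, with each frequency band steering its own body.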

final work
The Pixel Archive: Responsive Audio-Visual Systems
In the final stage, we developed a system that analyzes real-time audio data to generate a dynamic pixel-based archive. By mapping specific acoustic properties to pixel behaviors, we created a digital feedback loop in which sound dictates form, color, and movement. The project serves as a precise visual translation of sound, capturing the emotional and technical nuances of audio through a generative, grid-based language.
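The basic unit of a grid-based system like this is the mapping from one frame of acoustic measurements to one row of pixel values. The sketch below assumes per-band amplitudes normalized to 0..1 and maps them to 8-bit intensities; both choices are illustrative assumptions, not the project's implementation.

```python
def amplitudes_to_pixels(band_amplitudes):
    """Translate one frame of per-band amplitudes (0.0..1.0) into a row
    of 8-bit pixel intensities: the louder a band, the brighter its cell.
    The 0..255 output range is an assumption for this sketch."""
    pixels = []
    for a in band_amplitudes:
        a = min(max(a, 0.0), 1.0)           # clamp noisy input
        pixels.append(int(round(a * 255)))  # louder band -> brighter cell
    return pixels
```

Stacking one such row per audio frame yields the archive itself: a scrolling grid in which time runs down the page, frequency runs across it, and loudness becomes brightness.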





How can a generative pixel-based system create a precise visual representation of the emotional and technical nuances of sound?
