Spatial Tracks: Regaining Sound’s Locality

Digital Ecologies Studio
Tutors: Christopher Leung, Seda Zirek
MSc Adaptive Architecture and Computation 2014-15,
The Bartlett School of Architecture, UCL
Programs used: Processing, PureData, Grasshopper 3D, Scorpion, Firefly, Kinect.

When a sound wave bounces between two or more surfaces, standing waves emerge. The locations of the nodes and anti-nodes of a standing wave are stationary, so they can be predicted and experienced physically. The aim of this project was to exploit this phenomenon in order to create localized acoustic experiences.

The first step was to create simple standing waves by emitting the resonant frequency of a room from a fixed sound source. A second sound source was then placed at another fixed location, emitting a frequency slightly offset from the resonant one in order to create beating; since the perceived beat rate equals the difference between the two frequencies, this pulse could be perceived mainly at the anti-nodes of the original standing wave. The next step was the creation of an algorithm that takes the tempo and note values of a song as input and outputs the specific frequency which, when coupled with the resonant one, produces the desired beating. With this algorithm we were able to create highly localized rhythmic tracks that could only be experienced at specific locations within the room.

The final step was to mount the second sound source on a UR10 robotic arm that interacts with the user. With the second source in motion, the earlier process of experiencing the locality of sound is inverted: listeners no longer have to move around the room in order to feel the track; instead the track passes through them, leaves, and returns again later while they stand still.
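As a rough illustration of the tempo-to-frequency mapping described above, the sketch below (written in Processing, one of the tools listed) estimates a room's fundamental resonance and offsets it by the pulse rate of a given note value. The room length, tempo, and note-value convention are assumptions chosen for the example only; the actual project values and implementation may differ.

```processing
// Minimal sketch (Processing / Java syntax) of the two calculations the text describes.
// All numeric values are illustrative assumptions, not values from the project.

float roomLength   = 6.0;    // assumed distance between the reflecting walls, in metres
float speedOfSound = 343.0;  // m/s at roughly 20 degrees C
float tempoBPM     = 120.0;  // assumed song tempo

// Fundamental axial resonance of the room: f = c / (2 * L).
float resonantFreq() {
  return speedOfSound / (2.0 * roomLength);
}

// Pulse rate in Hz for a note value (1 = quarter note, 0.5 = eighth note, ...)
// at the given tempo: a quarter note at 120 BPM repeats twice per second.
float beatRateHz(float noteValue) {
  return (tempoBPM / 60.0) / noteValue;
}

// Frequency the second source emits: the perceived beat rate equals the
// difference between the two tones, so offset the resonance by the pulse rate.
float secondSourceFreq(float noteValue) {
  return resonantFreq() + beatRateHz(noteValue);
}

void setup() {
  println("Room resonance: " + resonantFreq() + " Hz");
  println("Second source for quarter-note pulses: " + secondSourceFreq(1.0) + " Hz");
  println("Second source for eighth-note pulses:  " + secondSourceFreq(0.5) + " Hz");
}
```

Keeping the note-value logic in a single function makes it straightforward to step through a song's rhythm and retune the second source note by note, which is the behaviour the localized rhythmic tracks rely on.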
