At the heart of the experience was the Kinect sensor, which tracked the participants’ movements and computed their collisions with the virtual balls in the pit. Simultaneously, a microphone in the room captured ambient sound, which FFT processing converted into waveforms displayed on the screen. This created a multisensory environment that highlighted the relationship between sound and movement. Finally, a ceiling-mounted webcam tracked people’s positions in the space, mapping their x, y coordinates to “water perturbations” on the screen and adding an extra layer of interactivity.
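The collision step might look something like the sketch below. It assumes the Kinect skeleton stream has already been decoded into 3D joint positions in metres; the `Ball` structure, the joint radius, and the example positions are illustrative guesses, not the installation's actual code.

```python
import math
from dataclasses import dataclass

@dataclass
class Ball:
    """A virtual ball in the pit (positions and radii in metres)."""
    x: float
    y: float
    z: float
    radius: float

def collides(joint, ball, joint_radius=0.08):
    """Sphere-sphere test between a tracked joint and one ball.

    `joint` is an (x, y, z) position from the Kinect skeleton stream;
    the 8 cm joint radius is an assumption, not a measured value.
    """
    dx, dy, dz = joint[0] - ball.x, joint[1] - ball.y, joint[2] - ball.z
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= joint_radius + ball.radius

# Example: which balls does the right hand currently touch?
balls = [Ball(0.1, 0.0, 1.9, 0.12), Ball(1.4, 0.3, 2.5, 0.12)]
right_hand = (0.15, 0.05, 2.0)   # hypothetical joint position
hits = [b for b in balls if collides(right_hand, b)]
```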
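The audio path can be approximated in a few lines. Given one buffer of microphone samples (however the installation actually captured them, the buffer is all this sketch assumes), an FFT yields per-band magnitudes ready to draw as a waveform or bar display:

```python
import numpy as np

def band_magnitudes(frame, n_bands=64):
    """Turn one microphone buffer into n_bands levels for the display.

    `frame` is a 1-D array of audio samples; a Hann window reduces
    spectral leakage before the FFT, and averaging FFT bins down to
    n_bands gives something smooth enough to draw.
    """
    windowed = frame * np.hanning(len(frame))
    magnitudes = np.abs(np.fft.rfft(windowed))
    bands = np.array_split(magnitudes, n_bands)
    return np.array([band.mean() for band in bands])

# Example with a synthetic 440 Hz tone standing in for the microphone feed.
sample_rate = 44100
t = np.arange(1024) / sample_rate
levels = band_magnitudes(np.sin(2 * np.pi * 440 * t))
```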
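For the water effect, one common approach (plausibly similar to what ran here, though the original technique is not specified) is the classic two-buffer ripple algorithm: each tracked x, y position from the ceiling camera injects a disturbance into a height field, which is propagated and damped every frame.

```python
import numpy as np

W, H = 320, 240                       # resolution of the height field
current = np.zeros((H, W), dtype=np.float32)
previous = np.zeros_like(current)

def perturb(x, y, strength=5.0):
    """Inject a disturbance where the camera last saw someone."""
    previous[y, x] -= strength

def step(damping=0.99):
    """One frame of ripple propagation (two-buffer scheme).

    Each cell takes the average of its four neighbours from the previous
    frame, minus its own current value, then decays; np.roll wraps at
    the edges, a shortcut that keeps this sketch small.
    """
    global current, previous
    neighbours = (np.roll(previous, 1, 0) + np.roll(previous, -1, 0)
                  + np.roll(previous, 1, 1) + np.roll(previous, -1, 1)) / 2.0
    current = (neighbours - current) * damping
    current, previous = previous, current

perturb(160, 120)                     # a person detected near the centre
for _ in range(3):
    step()                            # `current` now holds ripple heights
```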