The base animations were created in After Effects, and the sounds were made in Ableton with the help of the SynthGPT plugin. Everything was then combined and mapped in TouchDesigner using the Optical Flow and Blob Track TOPs. With just these two operators and a simple camera input, we were able to accurately track and map the graphics. Whenever movement is detected over a specific plate, the corresponding sound and visuals are triggered with a 1.5-4 second sustain. In addition, by reading the X and Y coordinates of the moving pixels over the plate, we made the on-screen animations follow the user's hand gestures.
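As a rough illustration of the trigger logic, here is a minimal TouchDesigner Python sketch in the style of a CHOP Execute DAT callback. It assumes the optical flow over a plate has already been reduced to CHOP channels (for example via an Analyze TOP and a TOP to CHOP), and every operator and channel name in it ('motion_energy', 'plate_analysis', 'trigger1', 'anim_pos', 'cx', 'cy') is a placeholder, not the project's actual network.

```python
# Sketch of a CHOP Execute DAT callback (TouchDesigner Python).
# Assumed inputs, all hypothetical names:
#   'motion_energy'  - per-plate motion magnitude, e.g. Optical Flow TOP
#                      -> Analyze TOP -> TOP to CHOP
#   'plate_analysis' - TOP to CHOP holding 'cx'/'cy', the normalized
#                      centroid of the moving pixels over the plate
#   'trigger1'       - Constant CHOP whose value0 gates sound + visuals
#   'anim_pos'       - Constant CHOP driving the animation's position

THRESHOLD = 0.15   # motion level that counts as movement over the plate
SUSTAIN   = 2.5    # hold time in seconds (the piece uses 1.5-4 s)

lastHit = 0.0      # module-level state persists between callback calls

def onValueChange(channel, sampleIndex, val, prev):
    global lastHit
    if channel.name != 'motion_energy':
        return
    now = absTime.seconds
    if val > THRESHOLD:
        lastHit = now
        # Gate on: this one value can drive both the audio envelope
        # and the visual layer's opacity.
        op('trigger1').par.value0 = 1
        # Follow the hand: copy the centroid of the moving pixels
        # into the animation's position.
        plate = op('plate_analysis')
        op('anim_pos').par.value0 = plate['cx'].eval()
        op('anim_pos').par.value1 = plate['cy'].eval()
    elif now - lastHit > SUSTAIN:
        # Optical flow output fluctuates every frame, so this branch is
        # re-evaluated continuously; release once the sustain has elapsed.
        op('trigger1').par.value0 = 0
```

In a real network, TouchDesigner's Trigger CHOP, with its built-in attack/sustain/release stages, would be a more idiomatic way to hold the 1.5-4 second sustain than timing it in a script.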