This unique collaboration between Hakan Lidbo (SE), Tracy Redhead (AU/AT), Synthestruct (US), Trevor Brown (AU) and Jonathan Rutherford (AU), working at the Ars Electronica Futurelab, takes audience interaction to the next level. During this live, improvised performance, the audience will decide the composition of the music.
The audience will jam with the performers using one of the world's largest MIDI controllers: three giant, bouncing cubes developed by Hakan Lidbo and Per Olov Jernberg, and the Gesture-based Interactive Remixable Dancefloor (GIRD) gloves, an award-winning prototype developed by Tracy Redhead and Jonathan Rutherford. The audience will also drive the performance's pulsing, interactive, audio-reactive projections using "react();", a dance visualizer designed by Synthestruct. The music has been composed and designed by Tracy Redhead and Trevor Brown, who will also take part in the improvised performance. The audience becomes the soul of the performance, highlighting the participatory nature of music and dance as an art form.

The project combines Kinect and gesture sensors, the IRCAM R-IoT sensor, Processing, Ableton Live, Max 7 and Max for Live with live musicians performing in real time.