Following the AURA MACHINE ten-minute online piece I created for the Future Music #3 event at RNCM in June ’21, I was keen to expand and develop the piece as a live set, bringing real-time interaction with both sonic and visual materials. I had been using TouchDesigner as a beginner and knew of its fantastic potential for live music, specifically audio-reactive visuals. I wanted to take the symbolic graphical language from the short AV piece and expand it, so I began collaborating with digital artist and Testcard curator Sean Clarke, who took my visual material and developed it into a live audio-reactive artwork that we can perform together.
As part of our collaboration we had access to Seesaw in Manchester city centre, an old textile warehouse where we could test ideas and make noise. After lockdown it was great to play back the machine auras generated for the piece in a former mill space … loud. The visual narrative follows the sonic journey: from the initial training on concrete materials, through a transmutation state, to the purely AI-generated ending. The piece opens with mapping onto a steel sculpture, then zooms out to a wide-angle shot of myself playing the music set-up in the warehouse. This scene is slowly interrupted until the AURA state of transmutation takes over. We enter the final state via a ‘machine wind’ representing latent space, and the AURA MACHINE symbols start to appear and converge, representing machine-learning feature extraction. Finally one of the AURA MACHINE icons takes over the screen, a point-cloud system reacting to the sounds of the machine-learning materials, before we come back to ‘reality’ and the steel sculpture.
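For readers curious about the audio-reactive idea behind the point-cloud scene, here is a minimal sketch in plain Python. This is not the actual TouchDesigner network Sean built; the function names (`rms_envelope`, `displace_points`) and all values are invented for illustration. The principle is the same, though: follow the loudness of the incoming audio with a smoothed envelope, then use that envelope to push the points of a cloud outward.

```python
import numpy as np

def rms_envelope(signal, frame_size=512, smoothing=0.8):
    """Per-frame RMS loudness with one-pole smoothing (hypothetical helper)."""
    n_frames = len(signal) // frame_size
    env = np.zeros(n_frames)
    level = 0.0
    for i in range(n_frames):
        frame = signal[i * frame_size:(i + 1) * frame_size]
        rms = np.sqrt(np.mean(frame ** 2))
        # One-pole low-pass: higher smoothing gives a slower, steadier response.
        level = smoothing * level + (1.0 - smoothing) * rms
        env[i] = level
    return env

def displace_points(points, amount):
    """Push points outward from the origin by a loudness-driven amount."""
    norms = np.linalg.norm(points, axis=1, keepdims=True)
    directions = points / np.maximum(norms, 1e-9)
    return points + directions * amount

# Example: one second of a 440 Hz test tone driving a small random cloud.
sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
env = rms_envelope(tone)
cloud = np.random.default_rng(0).normal(size=(100, 3))
frame = displace_points(cloud, amount=env[-1] * 2.0)
```

In a live setting the same mapping runs per frame, so louder passages of the machine-learning materials visibly swell the cloud while quiet moments let it settle back.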
Our first live test was at Didsbury Arts Festival in September 2021.
Thanks to Manchester Independents for supporting me in bringing the live set to life. The funding, space and support enabled me to work with Sean Clarke, test ideas and experiment.