
The development of the AURA MACHINE sound sculptures reflects my readings on alchemy and machine learning, and is a multi-stage process of transmutation from ancient & techno symbolism, through neural latent space, into the physical realm. The initial objective was to create a visual symbolic language to accompany the original 10-minute online piece for Future Music 3; these symbols and objects created in latent space would provide graphical metaphors for dimensionality and the sound object in latent space.
SYMBOLIC AI: I was researching Symbolic AI, or GOFAI (good old-fashioned AI), the leading AI field before the deep learning turn, in which expert systems used symbols to represent known facts and rules in order to determine solutions. Thinking too about abstraction and the power of the image to represent meaning where there is complexity, I began to create a visual training dataset of techno-mystical symbols. These included ancient alchemical symbols, electrical circuit symbols and technical media drawings (objects in the National Science & Media Museum's collections). The images were taken from original sketches and designs, digitised and graphically developed as symbols in Illustrator.
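The GOFAI idea of deriving conclusions from symbolic facts and rules can be sketched as a tiny forward-chaining engine. This is only an illustration of the principle, not any system from the research; the alchemical "facts" and "rules" below are invented for flavour.

```python
# A minimal forward-chaining rule engine in the GOFAI style:
# facts are symbols, and each rule maps a set of known symbols
# to a new derived symbol. The knowledge base here is invented.

def forward_chain(facts, rules):
    """Repeatedly apply rules until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Illustrative symbolic rules (hypothetical, alchemy-flavoured).
rules = [
    (("mercury", "sulphur"), "amalgam"),
    (("amalgam", "fire"), "transmutation"),
]

derived = forward_chain({"mercury", "sulphur", "fire"}, rules)
print(sorted(derived))
```

The engine first derives "amalgam" from the starting symbols, then chains that into "transmutation" — the same determinate, rule-following behaviour that expert systems relied on before the statistical turn.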
VISUAL TRAINING DATASET:



The 2D forms were imported into Blender, extruded and further designed into 3D objects wrapped in virtual metal materials, then exported as 3D digital objects which were photographed for the dataset, ready for training.

I used StyleGAN2 on Runway to train my model on the dataset. The training was highly successful, producing many interesting forms with clear depth and structure. The resulting digital materialities had such presence that they instantly sparked my imagination, and I began to conceive of them not simply as digital forms for the online piece but as the basis for a further stage of transmutation: to realise them as physical sound sculptures.
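Exploring the space between two generated forms is typically done by interpolating between their latent vectors and rendering each intermediate point with the generator. A common approach with StyleGAN2 is spherical linear interpolation (slerp); the sketch below uses random 512-dimensional vectors as stand-ins for real latents, since the trained model itself lives on Runway.

```python
# Spherical linear interpolation (slerp) between two latent vectors,
# as commonly used to walk through a StyleGAN2 latent space.
# The 512-dim latents here are random stand-ins for real ones.
import numpy as np

def slerp(z0, z1, t):
    """Interpolate along the great circle between two latent vectors."""
    z0_n = z0 / np.linalg.norm(z0)
    z1_n = z1 / np.linalg.norm(z1)
    omega = np.arccos(np.clip(np.dot(z0_n, z1_n), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return z0  # vectors are parallel; nothing to interpolate
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

rng = np.random.default_rng(0)
z_a = rng.standard_normal(512)
z_b = rng.standard_normal(512)

# A short walk between two symbol-forms: each step's vector would be
# fed to the generator to render one intermediate image.
steps = [slerp(z_a, z_b, t) for t in np.linspace(0.0, 1.0, 8)]
print(len(steps), steps[0].shape)
```

At t = 0 the walk starts exactly at the first latent and at t = 1 it arrives at the second, with the in-between frames morphing one generated symbol into another.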
The next step was to look at each machine-learning-generated icon and consider which could make an interesting sculptural form, visually but also sonically, then isolate the object and repeat the process above to creatively refine and define these forms from 2D into 3D.
I have been using Blender as part of my toolkit, which has helped me consider data in three dimensions whilst learning about the mechanics of machine learning, enabling me to play with textures and digital materialities throughout the residency.

PROTOTYPING: Taking the 3D virtual objects I created in Blender, I used desktop prototyping tools to fabricate small physical versions on the Prusa Mini 3D printer. This transmutational stage was vital: going from statistical approximations to test artefacts confirmed which of the icons could be realised as sculptures, and for which materials they might be suited.


#AURA_SCULPTURE_STEEL: The first AI-generated sculpture is this resonant object, 60 x 65 cm and fabricated from steel. The object is suspended in the air on a custom-built stand, which will enable me to play and activate its surfaces to extract the resonant sounds from the form. I will be using the piece for experiments, recordings and as part of my new live performance setup for AI, electronics and sound sculpture. The choice of steel for this piece reflects the original materials recorded in my Concrete Training Dataset. I intend to create another dataset of sculptural sounds – a transmutational feedback loop of sound objects, statistical data, latent space explorations and physical sculptural AI. A solid object generated via neural networks.



Thank you to James at Flux Metal in London for the brilliant job at fabrication!