Interactive spatialization with SATIE and EiS, works with D. Andrew Stewart

From November 9 to 14, 2018, D. Andrew Stewart, composer, performer, and improviser of new music on innovative musical instruments, visited the Metalab for a short collaboration. The goals were for him to become familiar with some of the Metalab's tools for immersive performance and to sketch possibilities for a future collaboration and residency. It was also an opportunity for the Metalab to test its experimentation and production workflow with in-house tools, including the haptic floor prototype.

This work was a first opportunity to connect an artist with the combination of EiS, SATIE, and the haptic floor prototype. Despite the very short time available for the residency, we were able to bootstrap Andrew on the in-house technology and experiment with several ways of controlling spatial sound through the wireless Karlax controller. We quickly set up the system so that Andrew could play with his own tested rig (the Karlax interfacing with Max and driving the Omnisphere sampler via Logic Pro) by routing 8 outputs from his audio interface into 8 inputs on ours, and spatializing those inputs directly. We explored controlling those sources both through direct communication with SATIE and through EiS, where we built a very simple scene of a few 3D primitives driven by data from the Karlax's gyroscope and accelerometers. The work was unfortunately cut short for reasons outside our control; the only documentation available is from the second day of the residency.
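The "direct communication with SATIE" path amounts to encoding incoming controller data as OSC messages and sending them to the spatializer over UDP. A minimal stdlib-only sketch follows; note that the address pattern `/source/<name>/aed`, the azimuth/elevation/distance argument layout, and the port number are illustrative assumptions, not SATIE's documented API.

```python
# Sketch: map controller orientation data to a spatialized source's
# position by hand-encoding OSC messages over UDP.
# The OSC address and argument layout below are assumptions for
# illustration; consult SATIE's own OSC reference for the real ones.
import socket
import struct

def _osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message whose arguments are all float32."""
    msg = _osc_pad(address.encode("ascii"))
    msg += _osc_pad(("," + "f" * len(args)).encode("ascii"))  # type tags
    for a in args:
        msg += struct.pack(">f", a)  # big-endian float32
    return msg

def position_source(sock, dest, name, yaw_deg, pitch_deg, dist=2.0):
    """Map raw controller angles to a source position and send it."""
    azi = (yaw_deg + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    ele = max(-90.0, min(90.0, pitch_deg))    # clamp to [-90, 90]
    sock.sendto(osc_message(f"/source/{name}/aed", azi, ele, dist), dest)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Hypothetical destination port; nothing needs to be listening for the
# UDP send itself to succeed.
position_source(sock, ("127.0.0.1", 18032), "karlax1",
                yaw_deg=95.0, pitch_deg=-120.0)
```

In practice a library such as python-osc (or Max's `udpsend`) replaces the hand-rolled encoder; it is written out here only to make the wire format of each control message concrete.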

We also spent some time discussing and reflecting on the use of the haptic floor in a live performance situation and its integration with the spatialization system. Andrew was able to come up with sound articulations that predictably moved the haptic floor's nodes. The takeaways from these sessions are the following:

  • Bootstrapping an artist onto the setup described above is quick, but familiarity with OSC and the ability to create appropriate mappings help.
  • The haptic floor was interesting to use, even for the artist himself: receiving haptic feedback about what he was playing was a welcome surprise.
  • The direct connection between a live instrument player and someone experiencing the haptic floor is very effective.
  • We need to address the latency between a live instrument player and the haptic floor.
