A brief video showing how sensor data is monitored by a custom-designed patch programmed in Max/MSP. The patch was built by Daniel Overholt, Adam Pultz Melbye and Cumhur Erkut, using software engines designed by Jules Francoise and Zachary Seldess.
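For readers without Max/MSP at hand, here is a minimal Python sketch of the same monitoring idea, assuming the armband data is forwarded as OSC messages. The addresses, port and use of the python-osc library are stand-ins, not the actual patch's configuration.

```python
# A minimal sketch of monitoring Myo sensor streams over OSC, assuming the
# armband data is forwarded as OSC messages. The addresses and port below
# are hypothetical, not the project's actual configuration.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_emg(address, *channels):
    # Eight EMG channels describe muscle tension; print a simple level meter.
    level = sum(abs(c) for c in channels) / len(channels)
    print(f"{address}: mean activation {level:.2f}")

def on_accel(address, x, y, z):
    # Accelerometer data describes the arm's movement in space.
    print(f"{address}: accel ({x:.2f}, {y:.2f}, {z:.2f})")

dispatcher = Dispatcher()
dispatcher.map("/myo/emg", on_emg)      # hypothetical address pattern
dispatcher.map("/myo/accel", on_accel)  # hypothetical address pattern

server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
server.serve_forever()  # blocks; stop with Ctrl+C
```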

Apart from the Myo data, an Xbox Kinect records the movement of the audience, and the music of Flamingo is recorded to a stereo track.
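For these parallel streams to be combined later, each record needs a timestamp on a shared session clock. A minimal sketch of such logging, with hypothetical file, stream names and values:

```python
# A minimal sketch of keeping the parallel streams alignable: every record
# carries a timestamp on a shared session clock, so armband, Kinect and
# audio markers can be correlated afterwards. The file name, stream names
# and values are hypothetical.
import csv
import time

session_start = time.monotonic()

def log_row(writer, stream, values):
    # One timestamped row per sample, relative to the session start.
    writer.writerow([f"{time.monotonic() - session_start:.4f}", stream, *values])

with open("session_log.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["t_seconds", "stream", "values..."])
    log_row(w, "kinect_head", [0.12, 1.54, 2.31])       # x, y, z joint position
    log_row(w, "myo_emg", [3, -5, 12, 7, 0, -2, 9, 4])  # eight EMG channels
```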

An excerpt of a text file describing a single arm's movement in space, as well as muscle tension. For a 30-minute performance, each Myo armband generates around 30 MB of text. Adding audio and Kinect data, the total data volume of a single session approaches 600 MB.
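The stated volume is plausible as a back-of-the-envelope calculation. The sketch below assumes one text line per sample; the sample rate and line length are illustrative guesses rather than measured values from the project.

```python
# A back-of-the-envelope check of the stated data volume, assuming one text
# line per sample. The line length is an illustrative guess, not a measured
# value from the project.
SAMPLE_RATE_HZ = 200   # the Myo streams EMG at 200 Hz
BYTES_PER_LINE = 85    # timestamp + orientation + acceleration + 8 EMG values
DURATION_S = 30 * 60   # a 30-minute performance

total_bytes = SAMPLE_RATE_HZ * BYTES_PER_LINE * DURATION_S
print(f"~{total_bytes / 1e6:.0f} MB per armband")  # ~31 MB, close to the stated 30 MB
```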

An algorithm designed by Chandrasekhar Ramakrishnan analyses the raw sensor and audio data. The algorithm incorporates machine listening and machine learning to sort the data, then applies rules for translating it into a vector file.
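Ramakrishnan's actual algorithm is not reproduced here; as an illustrative analogue only, the sketch below clusters windowed sensor features without supervision, yielding the kind of sorted symbolic sequence that a rule stage could translate into vector geometry. All parameters are assumptions.

```python
# An illustrative analogue of the analysis stage, not Ramakrishnan's actual
# algorithm: windowed sensor features are clustered without supervision, and
# the resulting label sequence is the kind of sorted material a rule stage
# could translate into vector geometry. All parameters are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def windowed_features(samples, window=200):
    """Summarise raw sensor rows (n_samples, n_channels) per window."""
    n = len(samples) // window
    trimmed = samples[: n * window].reshape(n, window, -1)
    # Mean and standard deviation per channel serve as simple gesture features.
    return np.concatenate([trimmed.mean(axis=1), trimmed.std(axis=1)], axis=1)

rng = np.random.default_rng(0)
raw = rng.normal(size=(200 * 60, 11))  # one minute of fake 11-channel sensor data
feats = windowed_features(raw)         # one feature row per second
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(feats)
print(labels[:20])                     # a per-second sequence of gesture classes
```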

Depending on the rules applied, very different results emerge. The image on the right presents different approaches to determining the overall shape of the sculptures. Variations from sculpture to sculpture reflect differences between performances, as well as the number of audience members and their behavior.

After the data analysis is complete, an .stl file is generated and fed to a 3D printer, which begins printing the actual sculpture.
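The ASCII .stl format itself is simple. The sketch below writes a single placeholder triangle; the project, of course, derives its geometry from the analysed data.

```python
# A minimal sketch of the hand-off to the printer: triangles written to an
# ASCII .stl file that a slicer can consume. The single triangle here is a
# placeholder, not the project's actual geometry.
def write_stl(path, triangles):
    """triangles: a list of facets, each a tuple of three (x, y, z) vertices."""
    with open(path, "w") as f:
        f.write("solid sculpture\n")
        for v1, v2, v3 in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for x, y, z in (v1, v2, v3):
                f.write(f"      vertex {x:.6f} {y:.6f} {z:.6f}\n")
            f.write("    endloop\n  endfacet\n")
        f.write("endsolid sculpture\n")

write_stl("sculpture.stl", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```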

After 6-7 hours of printing, the sculpture is ready and becomes part of the accumulating exhibition.