Investigation into interactive visualization techniques of biological material in the context of performance.
Justyna Ausareny, Zachary Hershman, Amanda Lee
A responsive winter-themed environment, intended for stage performance, featuring reactive DNA-snowflake visuals driven by motion and sound. The visuals aim to reflect and merge advances in technology, biology, and theatre. Our focus was on creating a participatory experience that includes the public in real-time exploration and experimentation. Through this prototype we hope to introduce a biological approach to stage technology that evokes intimacy, collectivity, and belonging. Our goal is to explore the relation between stage, performance, and performer, and to challenge future tendencies in the creative industry.
We created sound-reactive, motion-sensing geometric shapes resembling snowflakes and textured them with DNA extracted in real time. We also added a particle system and multiple grid shapes to magnify the effect and create a more immersive atmosphere resembling a winter scene.
The snowflake was programmed in Max as a grid and then segmented into parts to facilitate manipulation and reduce processing load. We then combined this matrix with a second one responsible for sound detection and effects.
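The patch itself is built with Jitter matrices in Max. As a language-neutral illustration of the segmentation idea (compute one arm once, then reuse it by rotation), here is a minimal Python sketch; all names and the point count are hypothetical:

```python
import math

def arm_points(n):
    """Sample n+1 points along a single snowflake arm on the x-axis."""
    return [(i / n, 0.0) for i in range(n + 1)]

def rotate(point, angle):
    """Rotate a 2D point about the origin."""
    x, y = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)

def snowflake(n=8, arms=6):
    """Build the full flake by rotating one precomputed segment.

    Only one arm is computed; the remaining five are cheap rotated
    copies, analogous to segmenting the grid to cut processing load.
    """
    segment = arm_points(n)
    return [[rotate(p, 2 * math.pi * k / arms) for p in segment]
            for k in range(arms)]
```

Because each segment is an independent list, per-segment transforms (scaling, rotation offsets) can be applied without recomputing the whole shape.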
To visualize sound, we used an audio input with a metronome and converted the signal to work with the matrix object. To create a fading effect, we applied a temporal envelope, which then connects to a matrix. For a more interesting effect, we offset the scaling and rotation of the matrices. We also built modes that can be customized together, along with the sensitivity of the effect. We then combined the object matrix with the sound-detection matrix to make a new GL geometry that also includes different drawing options and color filters. We found that customization is very important when creating stage technologies, since there are many changing factors; the system needs to be flexible enough to accommodate real-time fixes and adjustments.
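In the patch this fading is done with a temporal envelope on the incoming signal; the underlying logic amounts to an envelope follower with a fast attack and slow release. A minimal Python sketch of that logic (the coefficients are illustrative, not values from the patch):

```python
def envelope_follower(samples, attack=0.5, release=0.99):
    """Track signal amplitude with a fast rise and slow decay.

    attack/release are smoothing coefficients in [0, 1): lower means
    faster response. The slow release is what produces the fade-out.
    """
    env = 0.0
    out = []
    for s in samples:
        level = abs(s)
        coeff = attack if level > env else release
        env = coeff * env + (1.0 - coeff) * level
        out.append(env)
    return out
```

Feeding the envelope values into a matrix parameter such as brightness or alpha gives a flash that decays smoothly after each sound event.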
Snowflake movement was accomplished relatively easily thanks to the smooth implementation and operation of Synapse, the Kinect instrument, and the DaVinci patch. Values collected from the DaVinci patch are received in our patch and then distributed to the visual parameters.
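Synapse reports Kinect joint positions over OSC, which our patch then distributes to visual parameters. As a sketch of that distribution step, here is one possible mapping in Python; the input and output ranges are assumptions for illustration, not values taken from the actual patch:

```python
def map_range(v, in_lo, in_hi, out_lo, out_hi):
    """Linearly map v from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (v - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def distribute(hand_xyz):
    """Turn one received joint position into visual parameters.

    The input ranges (+/- 600 mm horizontally, +/- 400 mm vertically)
    and the output ranges are hypothetical.
    """
    x, y, _z = hand_xyz
    return {
        "rotation_deg": map_range(x, -600, 600, -180.0, 180.0),
        "scale": map_range(y, -400, 400, 0.5, 2.0),
    }
```

Clamping keeps the visuals stable when a visitor steps outside the expected tracking range.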
Our final presentation featured a station for the DNA-extraction experiment with a microscopic video feed, a Kinect camera to capture the visitor’s position, and a ceiling projector to display the visual output. Sound frequency was captured through a microphone.