The performance centers on a textile interface that uses 3D positional tracking to control sound and motion graphics. Through dynamic video mapping, the projected visuals appear at the exact position where the gestures are performed.
 
YEAR:  2014

COLLABORATORS:  N/A

CATEGORY: HCI, NIME, Physical Computing, Intermedia Performance, Textile Interface, Positional Tracking, Interactive Projection Mapping, Creative Coding, Generative Art, Sound

TOOLS:  Arduino, Max/MSP, Processing, Electronics, Ableton Live, XBee

Performance at ICLI 2014, Lisbon, Portugal, Nov 22, 2014

The interface is designed for intuitive interaction. Its conductive fabric provides a set of twenty-four capacitive touch sensors, connected to the main circuit board with conductive thread. A motion-tracking sensor supplies quaternion orientation data, which an Arduino microcontroller reads and translates into positional information. By moving the hands along the interface body while pulling the fabric in different directions, the performer manipulates sound effects and visuals in Max, Ableton Live, and Processing. The visuals are mapped to the position on the interface body where each gesture is performed.
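
As a concrete illustration of this pipeline, the following Arduino-style sketch (C++) reads one pad of the fabric with Paul Badger's CapacitiveSensor library, converts an orientation quaternion into yaw, pitch, and roll, and streams the values over serial (an XBee radio in transparent mode appears to the host as a plain serial port). The pin numbers, baud rate, touch threshold, and the readQuaternion() stub are illustrative assumptions rather than documentation of the actual hardware.

    #include <CapacitiveSensor.h>  // Paul Badger's capacitive sensing library

    // One pad of the conductive-fabric grid: send pin 4, receive pin 2
    // (illustrative pin choices; the real instrument has twenty-four pads).
    CapacitiveSensor pad = CapacitiveSensor(4, 2);
    const long TOUCH_THRESHOLD = 200;  // tune to the fabric and resistor values

    struct Quaternion { float w, x, y, z; };

    // Placeholder: in the real instrument this value would come from the
    // motion-tracking sensor's driver library.
    Quaternion readQuaternion() {
      Quaternion q = {1.0f, 0.0f, 0.0f, 0.0f};
      return q;
    }

    void setup() {
      Serial.begin(57600);  // must match the XBee's configured baud rate
    }

    void loop() {
      // Larger capacitive readings mean a closer or firmer touch.
      long reading = pad.capacitiveSensor(30);
      bool touched = reading > TOUCH_THRESHOLD;

      // Convert the quaternion to Euler angles (radians); these are
      // easier to map onto a screen position in Max or Processing.
      Quaternion q = readQuaternion();
      float siny = 2.0f * (q.w * q.y - q.z * q.x);
      siny = constrain(siny, -1.0f, 1.0f);  // keep asin() in its domain
      float yaw   = atan2(2.0f * (q.w * q.z + q.x * q.y),
                          1.0f - 2.0f * (q.y * q.y + q.z * q.z));
      float pitch = asin(siny);
      float roll  = atan2(2.0f * (q.w * q.x + q.y * q.z),
                          1.0f - 2.0f * (q.x * q.x + q.y * q.y));

      // Comma-separated frame, parsed on the host by Max / Processing.
      Serial.print(touched ? 1 : 0); Serial.print(',');
      Serial.print(yaw, 4);          Serial.print(',');
      Serial.print(pitch, 4);        Serial.print(',');
      Serial.println(roll, 4);

      delay(10);  // roughly 100 Hz update rate
    }

On the host side, Max and Processing would parse each frame, scale the angles to projector coordinates, and drive the position-mapped sound and visuals.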

Selected and performed at INTER-FACE: International Conference on Live Interfaces 2014, Lisbon, Portugal, November 22, 2014

Selected and performed at NordiCHI 2014 – 8th Nordic Conference on Human-Computer Interaction, Helsinki, Finland, October 27, 2014

Exhibited at World Maker Faire New York, New York Hall of Science, Queens, NY, September 20-21, 2014, with Harvestworks International Art Collective.

Performed at Make Music New York Festival and Dark Circuits Festival, The Firehouse Space, June 21, 2014, with Harvestworks International Art Collective.