In this work, we developed a textile-based interactive surface fabricated through machine knitting. Our prototype explores intarsia, pocket patterning, and a palette of yarns to create a piano-patterned textile for expressive and virtuosic sonic interaction. We combined functional (conductive, thermochromic, and composite) yarns with non-functional (spandex and high-flex polyester) yarns to develop “KnittedKeyboard”, shaping both its physical properties and its responsive sensing and display capabilities.
The sensing mechanism combines capacitive and piezoresistive sensing. Each key acts as an electrode and is sequentially charged and discharged, creating an electromagnetic field that a hand’s approach or touch disrupts. This enables us to detect non-contact proxemic gestures such as hovering or waving in the air, to detect contact touch, and to calculate velocity. The piezoresistive layer underneath measures the force exerted on the keyboard; we map this pressure value to aftertouch modulation after a certain delay. The color-changing display mechanism indicates the contact and non-contact modes of play.

All sensor data is converted to Musical Instrument Digital Interface (MIDI) messages by a central microprocessor and then transferred to a computer via USB. Audio sequencing and generation software such as Ableton Live and Max/MSP maps these MIDI messages to their corresponding channels, notes, and controls. As a second iteration of the "FabricKeyboard," the KnittedKeyboard demonstrates a new fabrication process for fabric-based interactive surfaces. It allows musical performers to experience multimodal interaction through its discrete and continuous controls while exploring the seamless materiality of the electronic textile.
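To make the sensor-to-MIDI pipeline concrete, the sketch below shows one plausible way a per-key capacitive reading and a piezoresistive pressure reading could be mapped to standard MIDI channel-voice messages (note-on, polyphonic aftertouch, and a continuous controller for non-contact hovering). The thresholds, base note, key indexing, and the choice of CC1 for hover gestures are illustrative assumptions, not values from our implementation.

```python
# Hypothetical mapping from one key's sensor readings to MIDI messages.
# Threshold values, BASE_NOTE, and the CC assignment are assumptions
# for illustration; the actual firmware parameters may differ.

HOVER_THRESHOLD = 40    # capacitive counts suggesting a nearby hand (assumed)
TOUCH_THRESHOLD = 120   # capacitive counts indicating contact (assumed)
BASE_NOTE = 48          # MIDI note of the leftmost key (assumed C3)

def key_to_midi(key_index, cap_value, pressure_value, channel=0):
    """Map one key's capacitive and piezoresistive readings to MIDI bytes.

    Returns a list of 3-byte MIDI channel-voice messages:
    - note-on (0x9n) when the key is touched, velocity from capacitive level
    - polyphonic aftertouch (0xAn) from the pressure layer while held
    - control change (0xBn, CC1) for non-contact hovering
    """
    note = BASE_NOTE + key_index
    messages = []
    if cap_value >= TOUCH_THRESHOLD:
        # Contact: emit note-on; excess capacitance stands in for velocity.
        velocity = min(127, cap_value - TOUCH_THRESHOLD + 1)
        messages.append([0x90 | channel, note, velocity])
        if pressure_value > 0:
            # Pressure from the piezoresistive layer becomes aftertouch.
            aftertouch = min(127, pressure_value)
            messages.append([0xA0 | channel, note, aftertouch])
    elif cap_value >= HOVER_THRESHOLD:
        # Non-contact proximity mapped to a continuous controller (CC1).
        cc_value = min(127, (cap_value - HOVER_THRESHOLD) * 2)
        messages.append([0xB0 | channel, 1, cc_value])
    return messages
```

On the host side, software such as Max/MSP or Ableton Live would then route these channel, note, and controller values to synthesis parameters.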