3DKnITS: Three-dimensional Knitted Intelligent Textile Sensor

Copyright

Irmandy Wicaksono/MIT Media Lab

Three-dimensional Digital Knitting of Intelligent Textile Sensor for Activity Recognition and Biomechanical Monitoring

We present an approach to developing seamless, scalable, piezo-resistive, matrix-based intelligent textiles using digital flat-bed and circular knitting machines. By combining and customizing functional and common yarns, we can design the aesthetics and architecture of a sensing textile and engineer both its electrical and mechanical properties. We propose a method to shape and personalize three-dimensional piezo-resistive textiles that conform to the human body through thermoforming principles with melting yarns. This yields a robust textile structure and intimate interfacing, suppressing sensor drift and maximizing accuracy while ensuring comfort.

The digital knitting approach enables the fabrication of 2D to 3D pressure-sensitive textile interiors and wearables, including a 45 x 45 cm intelligent mat with 256 pressure-sensing pixels and a circularly knitted, form-fitted shoe with 96 sensing pixels across its 3D surface. Furthermore, we have designed a visualization tool and a framework that treats the spatial sensor data as image frames. Our personalized convolutional neural network (CNN) models classify seven basic activities and exercises and seven yoga poses in real time with 99.6% and 98.7% accuracy, respectively. Finally, we demonstrate our technology in a variety of applications ranging from rehabilitation and sport science to wearables and gaming interfaces.
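Since the mat has 256 pressure-sensing pixels, one scan maps naturally onto a 16 x 16 image. The sketch below (our illustration, not the project's actual pipeline; the function name and normalization scheme are assumptions) shows how a flat vector of raw sensor readings could be turned into a normalized pressure frame ready for image-style processing:

```python
import numpy as np

def scan_to_frame(readings, rows=16, cols=16):
    """Reshape a flat scan of rows*cols sensor readings (e.g. ADC counts)
    into a 2D pressure 'image', min-max normalized to [0, 1]."""
    frame = np.asarray(readings, dtype=float).reshape(rows, cols)
    lo, hi = frame.min(), frame.max()
    if hi > lo:                       # avoid divide-by-zero on a flat frame
        frame = (frame - lo) / (hi - lo)
    else:
        frame = np.zeros_like(frame)
    return frame

# Simulated scan: random counts standing in for one mat readout
raw = np.random.default_rng(0).integers(0, 1024, size=256)
frame = scan_to_frame(raw)
print(frame.shape)  # (16, 16)
```

A sequence of such frames can then be stacked and fed to any image classifier, which is the sense in which the sensor data is treated "as image frames."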


We are motivated by the fact that most of our physical gestures and interactions involve contact between parts of our body and a surface. As we perform daily activities such as walking, sitting, exercising, or sleeping, a characteristic spatiotemporal contact and pressure pattern can be monitored and identified by sensing through the fabrics in our apparel or upholstery. In this project, we treat our spatiotemporal 2D pressure sensor data, or heat-map, like image frames. As we balance and redirect our center of mass through our feet, we exert force on the ground. We demonstrate a deep-learning-based CNN model that performs real-time activity and posture recognition from our interactions with the textile surface with high accuracy, without complex pre-processing or feature extraction. By detecting the pressure distribution of the feet through our intelligent mat, we can extract and infer rich contextual information about posture and activity. We present two application scenarios: an intelligent mat connected to a virtual environment to gamify exercise and motivate users to move and play, and a real-time yoga posture recognition system.
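To make the image-frame treatment concrete, here is a toy forward pass of a small CNN over a single 16 x 16 pressure frame. The layer sizes, the single 3 x 3 kernel, and the random (untrained) weights are placeholders for illustration only, not the project's trained model:

```python
import numpy as np

rng = np.random.default_rng(1)

def conv2d(x, k):
    """Valid 2D convolution of a single-channel frame x with kernel k."""
    kh, kw = k.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, s=2):
    """Non-overlapping s x s max pooling."""
    h, w = x.shape[0] - x.shape[0] % s, x.shape[1] - x.shape[1] % s
    return x[:h, :w].reshape(h // s, s, w // s, s).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Placeholder weights: one 3x3 kernel and a 7-class linear head
kernel = rng.normal(size=(3, 3))
frame = rng.random((16, 16))                   # stand-in for a pressure frame
feat = max_pool(relu(conv2d(frame, kernel)))   # 16x16 -> 14x14 -> 7x7
W = rng.normal(size=(7, feat.size))
probs = softmax(W @ feat.ravel())              # probabilities over 7 classes
print(probs.shape, round(float(probs.sum()), 6))  # (7,) 1.0
```

A real model would stack several such convolution/pooling stages and be trained on labeled frame sequences; the point of the sketch is only that a pressure heat-map flows through a CNN exactly like a grayscale image.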

To develop a tubular knit textile, we employed a digital circular knitting machine and a combination of polyester, spandex, conductive, and TPU yarns. The machine greatly increases productivity because the relatively slow reciprocating motion of flat-bed knitting is replaced by a continuous, faster circular motion. Circular knitting is mostly used to make tubular garments such as socks, shoes, sleeves, underwear, and t-shirts. To realize form-fitting apparel or prosthetic linings customized to the wearer, 3D scanning of the human body could be performed to create 3D-printed models of the body parts for thermoforming and shaping the tubular textile.

Soccer is one of the world's most practiced sports, and significant research effort has gone into the science behind it. We demonstrated the functionality of our 3D knitted sensing shoe, or sock, in this sport because it involves varied biomechanical movements: gait, balance, and coordination of muscles when running, sliding, and kicking, as well as positioning of the ball on the shoe to achieve the right angle, power, and trajectory. The 3D knitted sensing shoe or sock could find many applications in prosthetics, kinesiology, rehabilitation, and sport science.


Compared with existing thin-film force-sensing and pressure-imaging technologies, our textile-based method is more seamless, breathable, comfortable, and intimate to the wearer, which improves interfacial contact and the accuracy of sensing and recognition. Unlike camera-based systems, which typically raise privacy concerns about continuous, potentially invasive monitoring, the pressure-imaging approach is less intrusive and is not sensitive to line of sight or lighting levels. The 3DKnITS process and technology can spark intelligent-textile and ubiquitous-computing applications spanning biometrics and identification to robotics and HCI, creating new kinds of wearable technology and interactive environments.