Project

SensorNets

Copyright

Irmandy Wicaksono


SensorNets: Towards Reconfigurable Multifunctional Fine-grained Soft and Stretchable Electronic Skins

SensorNets is a bioinspired electronic skin integrated with multimodal sensor networks for interactive media applications, ranging from wearables and self-aware objects to intelligent environments. It is built by connecting miniaturized flexible printed circuit boards into two-dimensional sensor arrays with stretchable interconnects. The system is embedded between soft, deformable layers such as textiles or rubber. The result is a soft sensate surface that can be distributed over, conformally wrap, and adapt to curved structures. Each node contains a microprocessor together with a collection of nine sensors and a light-emitting diode, providing multimodal data that can be used to detect various deformation, proxemic, tactile, and environmental changes. We show that the electronic skin can sense and respond to a variety of stimuli simultaneously, and opens up possibilities for sensor-rich virtual and augmented reality-based visualization and interaction.
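The node-and-grid architecture described above can be sketched as a simple data model. This is a hypothetical illustration only, not the project's actual firmware; all names (`SensorNode`, `read_all`, the nine channel labels) are assumptions chosen to mirror the sensing modalities mentioned in the abstract:

```python
from dataclasses import dataclass, field

# Nine illustrative channels spanning the deformation, proxemic,
# tactile, and environmental modalities described above.
SENSOR_MODALITIES = [
    "bend", "stretch", "pressure", "proximity", "touch",
    "temperature", "humidity", "light", "acceleration",
]

@dataclass
class SensorNode:
    """One node of the e-skin: a microprocessor with nine sensors and an LED."""
    row: int
    col: int
    led_on: bool = False
    readings: dict = field(default_factory=dict)

    def read_all(self, sample):
        """Store one multimodal sample (a dict of channel -> value);
        channels absent from the sample default to 0.0."""
        self.readings = {m: sample.get(m, 0.0) for m in SENSOR_MODALITIES}
        return self.readings

# A small two-dimensional array of nodes, standing in for the
# flexible-PCB grid joined by stretchable interconnects.
grid = [[SensorNode(r, c) for c in range(4)] for r in range(4)]
grid[0][0].read_all({"pressure": 0.8, "touch": 1.0})
```

In a real deployment each node would sample its own sensors and report over the interconnect; here the grid simply illustrates how per-node multimodal readings could be organized for visualization or interaction layers.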
