Sang-won Leigh

Fluid Interfaces
  • Research Assistant

Exploring the blend of the physical and the digital, humans and machines, Sang-won researches alternate notions of reality and the human body. His ultimate goal is to eliminate the chasm between the immutable nature of the Newtonian universe and the malleability of digital computing, through which he pushes the envelope of user interfaces and augmented reality. He currently focuses on integrating the human body with computational ones, redefining what we are beyond our given genetic information.

His creations include THAW, a magic-lens UI that combines smartphones and computer screens; Remnance of Form, an art installation that turns a shadow into living, transmutable forms; A Flying Pantograph, a flying robot that serves as an extension of an artist's hand; and a wearable robotic device that gives us extra limbs. His work has been presented at academic conferences (CHI, UIST, TEI, UbiComp) and covered by major media outlets (BBC, WIRED, Fast Company, Engadget, Huffington Post).

Before joining the MIT Media Lab, he was a software engineer at Samsung Electronics, where he led the software development of eyeCan, an open-source DIY eye-mouse designed for people with motor disabilities. This project became the foundation of Samsung's C-LAB. The eyeCan project was covered by major newspapers in Korea, and he was invited to give talks at TEDx events, the Seoul Digital Forum, and the Tech Plus Forum. He received his Bachelor of Science and Master of Science from KAIST, focusing on 3D computer vision and machine learning.

He is now a PhD student in the Fluid Interfaces group at the MIT Media Lab, working with Pattie Maes.