Pattie Maes, MIT Media Lab
Joseph Paradiso, MIT Media Lab
Gil Weinberg, Georgia Tech
Throughout history, we have augmented our physical abilities with machines. The earliest examples go back to the 13th century, when flying machines and the ideas behind today's exoskeletons were first conceived. Today, as technology permeates every aspect of our lives, it is not hard to envision a much closer integration of machines into the tasks we carry out. This thesis explores a vision of humans and machines working together symbiotically on a task through co-action and co-agency. This vision investigates the many opportunities between the extremes of fully autonomous robots and master-slave systems, exploring more complex systems in which human and machine collaboratively perform an action and jointly control the robotic extension.
This dissertation describes different projects that illustrate the varied ways machine actions can be coordinated with our hands, and discusses case studies of such systems in artistic and musical domains. Specifically, I report on three extensive experiments, each consisting of multiple iterations of actual, tested designs: a series of extra-numerary robotic fingers for increasing manual dexterity, a series of collaborative human-drone drawing systems enabling novel expressive capability, and a series of semi-automated guitar systems enabling extended musical expression as well as new instrument-learning opportunities. The studies performed with these prototypes give insight into the impact of such robotic integration on the human user: the user is nudged to adapt to the new condition and re-calibrate the expectations associated with a given input action; the division of roles allows the user to explore and understand aspects of the task beyond their existing skills or physical limits; and the robotic extension inspires practice outside their regular routine.
Finally, the thesis provides a framework and corresponding terminology to situate different technical and design choices for these new forms of human-robot integration. The framework categorizes existing techniques based on how human and robotic actions are coordinated, and on how the robotic movements are controlled. I also propose ways to qualitatively evaluate the interaction between human and machine, in terms of how a robotic extension may impact the cognition and behavior of its user. The implemented prototypes and the experiments performed illustrate different design choices within the proposed framework, as well as the novel applications they enable.