Project

Designing Robots in Mixed Reality

Groups

This is a joint proposal from graduate students Vik Parthiban (Media Lab) and Andrew Spielberg (MIT CSAIL, Distributed Robotics Laboratory). 

With the release of the Magic Leap Creator and Leap Motion NorthStar platforms for new mixed-reality and augmented-reality applications, we propose a new computer-aided design (CAD) tool for simulating and deploying robots. Our goal is to better understand locomotion in 3D space and to diagnose character movement under multiple constraints.

We will develop the "Magic Leap Design and Control Toolbox," a suite of new interactive algorithms and implementations to directly build structures within the environment using captured information about the world. A new system for gesturing the desired path or degrees of freedom of an object will automatically translate the gestures into robot and character control inputs. The system will be based on algorithms that synthesize natural gestural control based on the desired motion.
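As a minimal sketch of the gesture-translation step, the snippet below downsamples a raw hand-tracked gesture path (a sequence of 3D fingertip samples) into evenly spaced waypoints that could serve as robot control inputs. The function name, `Waypoint` type, and spacing parameter are illustrative assumptions, not part of the proposed system's API.

```python
from dataclasses import dataclass
import math

@dataclass
class Waypoint:
    x: float
    y: float
    z: float

def gesture_to_waypoints(samples, spacing):
    """Downsample a raw gesture path (3D fingertip samples) into
    waypoints at least `spacing` apart, usable as control inputs.

    samples: list of (x, y, z) tuples from the hand tracker.
    spacing: minimum distance between emitted waypoints.
    """
    if not samples:
        return []
    waypoints = [Waypoint(*samples[0])]
    last = samples[0]
    for p in samples[1:]:
        # Emit a new waypoint only once the hand has moved far enough,
        # filtering out tracker jitter between frames.
        if math.dist(last, p) >= spacing:
            waypoints.append(Waypoint(*p))
            last = p
    return waypoints

# Example: a straight-line gesture sampled at unit intervals,
# downsampled to one waypoint every 5 units.
path = [(float(i), 0.0, 0.0) for i in range(21)]
wps = gesture_to_waypoints(path, spacing=5.0)
```

A real pipeline would additionally smooth the path and map waypoints into the robot's configuration space, but the resampling step above is the core of turning a free-form gesture into discrete control targets.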

Furthermore, during development, diagnostic information will be displayed above virtual characters' and robots' actions, allowing designers to quickly understand the control relationships between environmental stimuli and character behavior. Gaze-tracking technologies will improve the accuracy of selection and control. Optimization and reinforcement learning algorithms will recommend improvements to a design's form and function, creating a natural iteration cycle for the user.
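To illustrate the recommendation step, here is a deliberately simple hill-climbing sketch: it perturbs a design's parameters, keeps any variant with lower cost, and returns the best-found variant as a "suggested improvement." The cost function, parameter names, and search strategy are stand-ins; the actual toolbox would use more capable optimization or reinforcement learning methods.

```python
import random

def recommend_improvement(design, cost, step=0.1, trials=20, seed=0):
    """Hill-climbing sketch: randomly perturb design parameters and
    return the lowest-cost variant found, mimicking a design
    recommendation surfaced to the user.

    design: dict of named numeric design parameters.
    cost:   callable scoring a design (lower is better).
    """
    rng = random.Random(seed)  # fixed seed for reproducible suggestions
    best, best_cost = dict(design), cost(design)
    for _ in range(trials):
        # Propose a nearby candidate by jittering every parameter.
        cand = {k: v + rng.uniform(-step, step) for k, v in best.items()}
        c = cost(cand)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost

# Toy cost: a (hypothetical) leg length closest to 1.0 is "best".
cost = lambda d: (d["leg_length"] - 1.0) ** 2
suggested, c = recommend_improvement({"leg_length": 0.5}, cost)
```

Because candidates are only accepted when they lower the cost, the returned design is never worse than the user's current one, which matches the intended iteration cycle: the system proposes, the designer disposes.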

For more information, please contact Vik at vparth@mit.edu.