Can a robot and a magician collaborate on stage to create a believable, evocative performance? Close human-robot proximity and coordination on a performance stage, such as the rapid passing of objects between human hands and robot grippers, is a recent development. Our tools allow us to compose a human-robot performance that blends pre-rendered choreography with key moments of dynamic interactivity, enhancing the realism of the robot's character. For example, while the robot plays back a series of poses, it can also track the performer's face to maintain eye contact. We are studying how perceived agency and blended static/dynamic interactivity affect an audience's perception of the performance, and how changes in computational robot choreography influence a viewer's emotional state. To support this work, we have built trajectory timeline composition software, a sympathetic interface to an industrial robot, and custom hardware to achieve magic effects.
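The blend of pre-rendered choreography with dynamic interactivity could be realized along the following lines. This is a minimal illustrative sketch, not the actual system: the keyframe format, the single "head yaw" joint, and the blend weight are all assumptions introduced here for clarity.

```python
# Hypothetical sketch: blending a choreographed pose timeline with a
# face-tracking override for a single head-yaw joint. All names and
# numbers are illustrative assumptions, not the system's real API.

def sample_timeline(keyframes, t):
    """Linearly interpolate a list of (time, angle) keyframes at time t."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, a0), (t1, a1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return a0 + u * (a1 - a0)

def blended_head_yaw(keyframes, t, face_yaw, weight):
    """Blend the choreographed yaw with a look-at-the-performer yaw.

    weight = 0.0 plays back pure pre-rendered choreography;
    weight = 1.0 fully tracks the performer's face.
    """
    choreo = sample_timeline(keyframes, t)
    return (1.0 - weight) * choreo + weight * face_yaw

# Example: halfway through a 0-to-30-degree sweep, half-weighted
# toward a face detected at 20 degrees.
keys = [(0.0, 0.0), (1.0, 30.0), (2.0, -10.0)]  # (seconds, degrees)
print(blended_head_yaw(keys, 0.5, face_yaw=20.0, weight=0.5))  # 17.5
```

In a real controller the `weight` term would ramp up only during the key interactive moments (e.g. eye contact beats) and back down for fully scripted passages, so the choreographed motion remains authoritative elsewhere.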