Collaborators: David Robert, Natalie Freed
The Playtime Computing System is a technological platform for a blended-reality interactive and collaborative media experience that treats the on-screen and real worlds as one continuous space. On-screen audio-visual media (e.g., the virtual environments and characters that make up a story world) extend into the physical environment through digital projectors, robotics, real-time behavior capture, and tangible interfaces. Player behavior is tracked with 3D motion capture as well as other sensors such as cameras and audio inputs.
Characters in this system can appear to transition smoothly from the physical world to the virtual on-screen world through a physical enclosure that acts metaphorically as a portal between the real and the virtual. Any events or changes that happen to the physical character in the real world carry over to the virtual world, and digital assets can likewise move from the virtual world into the physical one. These blended-reality characters can either be programmed to behave autonomously, or have their behavior controlled by the players.
My primary contribution was building the "trans-reality portal", the enclosure that transports the robot between its physical and virtual representations. I also wrote the image-stitching code that makes the eight projectors output one continuous environment: each projector displays a Gaussian pattern, and a single camera image of the scene is used to back-calculate the projector positions. This was my first exposure to powerful real-time animation techniques through TouchDesigner, under the guidance of David Robert. I learned a ton about setting up large audiovisual installations, exploiting graphics supercomputers, and building robot houses.
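To give a sense of the calibration idea, here is a minimal sketch of one way that step could work: each projector shows a Gaussian blob in turn, a camera captures the scene, and the blob's intensity-weighted centroid in the camera image locates that projector. All function names and parameters below are illustrative assumptions, not the actual installation code.

```python
import numpy as np

def gaussian_blob(h, w, cy, cx, sigma):
    """Synthesize a Gaussian intensity pattern, standing in for the
    camera's view of one projector's calibration blob."""
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

def centroid(img):
    """Intensity-weighted centroid of a camera image: a simple way to
    back-calculate where one projector's blob landed in camera space."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    total = img.sum()
    return (ys * img).sum() / total, (xs * img).sum() / total

# One projector at a time: project a blob, capture, and locate it.
camera_frame = gaussian_blob(480, 640, cy=200.0, cx=300.0, sigma=25.0)
cy, cx = centroid(camera_frame)
```

In a real setup one such measurement per projector (ideally with several blob positions each) would feed a warping/blending step so the eight outputs overlap seamlessly.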
Rapid prototyping, Arduino, motor control, motion capture (Vicon), TouchDesigner, C