This piece explores the threshold between leading and following in dance, allowing the participant to hold a dialog with a technological being, using their body as the means of communication.
The user chooses a song by raising an arm and pointing at one of the seven original compositions. A motion tracking interface and custom software detect the pointing gesture and start the selected song.
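One way to turn a pointing gesture into a song choice is to measure the arm's angle and divide the reachable arc into seven equal slices. The sketch below illustrates that idea in plain C++; the joint coordinates, the ±60° arc, and the function name are assumptions for illustration, not the installation's actual calibration or code.

```cpp
#include <cmath>

// Map the horizontal angle of the pointing arm (hand relative to
// shoulder, in sensor coordinates with Y up) to one of seven song
// slots, 0..6. The usable arc of roughly -60..+60 degrees is split
// into seven equal slices. (Hypothetical thresholds.)
int songFromPointing(float shoulderX, float shoulderY,
                     float handX, float handY) {
    // Angle from straight up: 0 when the hand is directly above the
    // shoulder, positive to the viewer's right.
    float radians = std::atan2(handX - shoulderX, handY - shoulderY);
    float degrees = radians * 180.0f / 3.14159265f;

    // Clamp to the usable arc, then bucket into seven slices.
    if (degrees < -60.0f) degrees = -60.0f;
    if (degrees >  60.0f) degrees =  60.0f;
    int slot = static_cast<int>((degrees + 60.0f) / 120.0f * 7.0f);
    return slot > 6 ? 6 : slot; // exactly +60 degrees would yield 7
}
```

Pointing straight up would land in the middle slot, and sweeping the arm from left to right walks through all seven songs in order.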
The user is invited to dance along with JitterBot, the red robot. The user's movement is tracked using the tracking interface and displayed on a projection surface as the green robot. Sometimes JitterBot follows the visitor's lead and sometimes JitterBot busts a move on its own, for the visitor to follow.
The installation is centered on the Microsoft Kinect sensor, which allows real-time full-body 3D motion capture. A layer of custom software interprets the viewer's 3D skeleton and renders a stylized human form, the green robot. The movement of the red robot is driven partially by the viewer's motion, but also by a library of canned moves that gives it a sense of autonomy. The two robots are rendered as a series of elongated spheres using OpenGL. The resulting video image is projected on a screen on the wall, and the accompanying music is played through speakers.
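The mix of viewer-driven and autonomous movement can be pictured as a weighted blend between two skeleton poses. The sketch below shows that blending idea in plain C++; the flat joint-coordinate representation, the linear blend, and the names are assumptions for illustration, not the installation's actual code.

```cpp
#include <vector>

// A pose is a flat list of joint coordinates. The red robot's pose is
// a weighted blend of the viewer's tracked pose and a canned move:
// followWeight 1.0 means JitterBot mirrors the viewer exactly,
// 0.0 means it dances entirely on its own. (Hypothetical sketch.)
std::vector<float> blendPose(const std::vector<float>& viewerPose,
                             const std::vector<float>& cannedPose,
                             float followWeight) {
    std::vector<float> out(viewerPose.size());
    for (std::size_t i = 0; i < viewerPose.size(); ++i) {
        out[i] = followWeight * viewerPose[i]
               + (1.0f - followWeight) * cannedPose[i];
    }
    return out;
}
```

Ramping `followWeight` up and down over time would let the robot smoothly trade the lead back and forth with the visitor rather than switching abruptly.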
JitterBot is built with openFrameworks, an open-source C++ toolkit for creative coding.
The sources for KinectAnimation are available here: KinectAnimation.cpp, KinectAnimation.h.
I would like to thank Jennifer Lim for her help with this project.