Imitation and motion classification

A model of movement imitation and its implementation for a simulated humanoid that imitates the behavior of a human performer. The tracking data used for this implementation were obtained with a 2.5D video-based upper-body pose tracking system. The attention mechanism in this system focuses on the locations of the endpoints (i.e., the hands). In place of a learned set of primitives, a human subject performed a sequence of motions, including line, circle, and arc trajectories of the endpoints, yielding a set of movements that serve as perceptual-motor primitives. Using these primitives, we implemented a vector-quantization-based classification mechanism. With postprocessing of the classification results, the classifier provides a desired via-point trajectory for each arm endpoint. These trajectories are then actuated using impedance control on our 20-DOF humanoid simulation, Adonis.
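
The following is a minimal sketch, not the authors' implementation, of how a vector-quantization-based classifier of this kind can work: an observed endpoint trajectory segment is resampled, normalized, and assigned the label of the nearest prototype in a small codebook of demonstrated primitives (line, circle, arc). All function names, the normalization scheme, and the example data are illustrative assumptions.

```python
# Sketch of vector-quantization classification of endpoint trajectory
# segments against a codebook of perceptual-motor primitives.
# Illustrative only; names and preprocessing are assumptions.
import numpy as np

def resample(traj, n=32):
    """Resample a (T, 2) endpoint trajectory to n points by arc length."""
    traj = np.asarray(traj, dtype=float)
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(traj, axis=0), axis=1))]
    s = np.linspace(0.0, d[-1], n)
    return np.column_stack([np.interp(s, d, traj[:, k]) for k in range(traj.shape[1])])

def normalize(traj):
    """Translate to the start point and scale to unit extent."""
    t = traj - traj[0]
    scale = np.max(np.linalg.norm(t, axis=1))
    return t / scale if scale > 0 else t

def vq_classify(segment, codebook):
    """Return the label of the nearest codebook primitive (Euclidean distance)."""
    x = normalize(resample(segment)).ravel()
    best_label, best_dist = None, np.inf
    for label, proto in codebook.items():
        dist = np.linalg.norm(x - normalize(resample(proto)).ravel())
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Example codebook built from demonstrated primitives (synthetic stand-ins).
theta = np.linspace(0, 2 * np.pi, 64)
codebook = {
    "line":   np.column_stack([np.linspace(0, 1, 64), np.zeros(64)]),
    "circle": np.column_stack([np.cos(theta) - 1, np.sin(theta)]),
    "arc":    np.column_stack([np.cos(theta[:32]) - 1, np.sin(theta[:32])]),
}

observed = np.column_stack([np.linspace(0, 0.8, 50), 0.02 * np.random.randn(50)])
print(vq_classify(observed, codebook))  # expected: "line"
```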
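
For the actuation step, the sketch below illustrates the general idea of impedance control pulling an endpoint through the via points produced by the classifier: a virtual spring-damper drives a point mass toward each target in turn. The gains, the point-mass endpoint model, and the via-point data are assumptions; this is not the Adonis controller itself.

```python
# Sketch of impedance-style tracking of a via-point trajectory.
# Illustrative only; gains and the point-mass model are assumptions.
import numpy as np

def impedance_step(x, v, x_des, dt=0.002, K=200.0, B=25.0, m=1.0):
    """One Euler step of a point mass driven by a spring-damper toward x_des."""
    f = K * (x_des - x) - B * v   # stiffness plus damping force
    v = v + (f / m) * dt
    x = x + v * dt
    return x, v

def track_via_points(via_points, steps_per_point=400):
    """Pull the simulated endpoint through each via point in sequence."""
    x = np.array(via_points[0], dtype=float)
    v = np.zeros_like(x)
    path = [x.copy()]
    for target in via_points[1:]:
        target = np.asarray(target, dtype=float)
        for _ in range(steps_per_point):
            x, v = impedance_step(x, v, target)
        path.append(x.copy())
    return np.array(path)

# Via points for one arm endpoint, e.g. from a classified "line" segment.
print(track_via_points([(0.0, 0.0), (0.2, 0.1), (0.4, 0.2)]))
```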