We have acquired new equipment for optical tracking and motion capture at our Interaction Lab at Augsburg University of Applied Sciences. If you know the Kinect, you can think of it as a much more accurate and faster Kinect. Our system consists of 12 high-speed infrared cameras (1280×1024, 240 fps) and the accompanying software (Motive), all by OptiTrack. It is capable of recording human motion sequences (e.g. fight or dance moves, or everyday actions like walking or picking up an object). Actors are fitted with small retroreflective markers; the system emits infrared light and computes the 3D position of each marker by triangulation. Arbitrary objects (tea cup, sword, wand) can also be tracked by attaching at least three markers, which are then defined as so-called rigid bodies.
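
The triangulation principle can be illustrated in a few lines of code. The following is only a minimal sketch (using NumPy), not OptiTrack's actual algorithm: each camera that sees a marker contributes a viewing ray, and the marker's 3D position is estimated as the point closest to all of those rays in the least-squares sense.

```python
# Minimal sketch of multi-camera triangulation (the principle behind optical
# marker tracking), not OptiTrack's implementation. Each camera contributes a
# ray (camera center + viewing direction); the marker position is the point
# minimizing the summed squared distances to all rays.
import numpy as np

def triangulate(origins, directions):
    """Estimate the 3D point closest to a set of rays.

    origins:    (n, 3) array of camera centers
    directions: (n, 3) array of viewing directions
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        # Projector onto the plane perpendicular to the ray direction
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two cameras, both looking at a marker located at roughly (0, 0, 2)
origins = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
directions = np.array([[1.0, 0.0, 2.0], [-1.0, 0.0, 2.0]])
print(triangulate(origins, directions))  # ~ [0. 0. 2.]
```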

The recordings of human motion can be used for character animation in CG movies or computer games. The same data can also be streamed in real time to external applications for interactive scenarios.
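
To give an idea of what such streamed data looks like on the application side, here is a hypothetical sketch of a client consuming rigid-body poses frame by frame. The frame format and names below are illustrative placeholders, not the actual API or wire format of OptiTrack's streaming protocol (NatNet).

```python
# Hypothetical sketch of the real-time consumption pattern in a client
# application; the data structures are illustrative placeholders, not
# OptiTrack's actual NatNet streaming API.
import time
from dataclasses import dataclass

@dataclass
class RigidBodyPose:
    name: str        # e.g. "sword" or "tea_cup"
    position: tuple  # (x, y, z) in meters
    rotation: tuple  # quaternion (x, y, z, w)

def on_frame(frame_id, rigid_bodies):
    """Application callback, invoked once per tracking frame (~240 Hz)."""
    for rb in rigid_bodies:
        if rb.name == "sword":
            # Drive the virtual counterpart of the tracked object
            print(f"frame {frame_id}: sword at {rb.position}")

# Simulated stream standing in for the networked tracking data
for frame_id in range(5):
    pose = RigidBodyPose("sword", (0.1 * frame_id, 1.2, 0.5), (0, 0, 0, 1))
    on_frame(frame_id, [pose])
    time.sleep(1 / 240)  # frame period of a 240 fps system
```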

In human-computer interaction, the tracking data is used to interact with a computer via gestures and/or with your full body, depending on the application. Tracking is also interesting for VR applications where you want to interact naturally with your hands and body. The system is much faster and more accurate than, e.g., the Kinect: the latency (time from the original movement to the visible reaction on screen) is around 5 ms (depending on the processing time of the output), and the spatial accuracy is around 1 mm.

Our research interests target gestural interaction, e.g. comparing it with other forms of interaction such as touch, mouse, and other controllers.

The system is part of a research grant, Labor zur Erforschung und Entwicklung von Interaktionsmodellen mit Virtuellen Realitäten (laboratory for research and development of interaction models for virtual realities), awarded to Rose, Müller, Rothaug, and Kipp and funded by Hochschule Augsburg.

Links

Mocap Lab Homepage