Tag: Kinect

Interaction Engineering (2021/22)

Last winter, my course Interaction Engineering was conducted on campus for the first part (autumn 2021), which really helped the students connect. In November we had to switch to virtual meetings via Zoom, but by that point most students already had a good idea of which project to pursue with which partner.

This year we had a large variety of interaction modalities, ranging from tangible interaction via gaze and gesture control to full-body interaction. All projects resulted in working prototypes that were presented live on 2 Feb 2022.

Click on the following screenshot to get to our project page, where you can browse all projects. Each project comes with a short video, a written report and a set of slides.

A new Kinect (version 3)

Microsoft’s Kinect sensor can recognize human bodies and track the movement of body joints in 3D space. In 2019, Microsoft released an entirely new version called Azure Kinect DK, the third major version of the Kinect.

Originally, the Kinect was released in 2010 (version 1, Xbox) and 2013 (version 2, Xbox One), but production was discontinued in 2017. However, Kinect technology was integrated into the HoloLens (2016) for gesture control. While the Kinect failed to become a mainstream gaming controller, it was widely used for research and prototyping in the area of human-computer interaction.

The camera looks quite different from its earlier cousins.

In early 2022 we acquired the new Azure Kinect for the Interaction Engineering course, at a cost of around 750 € here in Germany.

Setting up the Kinect

The camera has two cables: a power supply and a USB connection to a PC. You have to download and install two software packages:

  • Azure Kinect SDK
  • Azure Kinect Body Tracking SDK

It feels a bit archaic because you need to run executables in the console. For instance, it is recommended that you perform a firmware update on the sensor. For this, go into the directory of the Azure Kinect SDK and call “AzureKinectFirmwareTool.exe -Update <path to firmware>”. The firmware file is located in another directory of the same package.

As a next step, go into the Azure Kinect Body Tracking SDK directory, where you can start the 3D viewer (located in the /tools directory). Again, this executable takes a parameter, so you cannot simply double-click it in the file explorer. Type “k4abt_simple_3d_viewer.exe CPU” or “k4abt_simple_3d_viewer.exe CUDA” to start the viewer.

This is what you see (with the CPU version it runs very slowly).
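If you want to go beyond the bundled tools, you can also talk to the camera directly via the Sensor SDK’s C API. The following is a minimal sketch of opening the device and grabbing a single depth capture; the configuration values are just one plausible choice and error handling is mostly omitted, so take it as a starting point rather than a finished program.

    #include <stdio.h>
    #include <k4a/k4a.h>

    int main(void)
    {
        // Open the first (default) Azure Kinect device.
        k4a_device_t device = NULL;
        if (k4a_device_open(K4A_DEVICE_DEFAULT, &device) != K4A_RESULT_SUCCEEDED)
        {
            printf("No Azure Kinect device found.\n");
            return 1;
        }

        // Start the depth camera in narrow field-of-view mode (one possible choice).
        k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
        config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
        config.camera_fps = K4A_FRAMES_PER_SECOND_30;
        k4a_device_start_cameras(device, &config);

        // Grab a single capture and inspect its depth image.
        k4a_capture_t capture = NULL;
        if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            k4a_image_t depth = k4a_capture_get_depth_image(capture);
            if (depth != NULL)
            {
                printf("Depth image: %d x %d\n",
                       k4a_image_get_width_pixels(depth),
                       k4a_image_get_height_pixels(depth));
                k4a_image_release(depth);
            }
            k4a_capture_release(capture);
        }

        k4a_device_stop_cameras(device);
        k4a_device_close(device);
        return 0;
    }

(Link against the k4a library that ships with the Sensor SDK.)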

Differences between Kinect versions

The new Kinect obviously improves on various aspects of the older ones. The two most relevant aspects are the field of view (how wide-angled the camera view is) and the number of skeleton joints that are reconstructed.

Feature             Kinect 1        Kinect 2        Kinect 3
Camera resolution   640×480         1920×1080       3840×2160
Depth camera        320×240         512×424         640×576 (narrow), 512×512 (wide)
Field of view       H 57° / V 43°   H 70° / V 60°   H 75° / V 65° (narrow), H 120° / V 120° (wide)
Skeleton joints     20              26              32

There is an open-access publication dedicated to the comparison between the three Kinects:

Michal Tölgyessy, Martin Dekan, Ľuboš Chovanec and Peter Hubinský (2021) Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. In: Sensors 21 (2). Download here.

Skeleton

Here is a schematic view of the joints that are recognized. In practice, it turns out one has to pay special attention to the robustness of the signal for hands, feet and also head orientation.

(Source: https://docs.microsoft.com/bs-latn-ba/azure/kinect-dk/body-joints)
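To give an impression of how this joint data arrives in code, here is a minimal sketch using the Body Tracking SDK’s C API, continuing the device setup shown above. It feeds one capture to the tracker and prints each body’s head joint position together with its confidence level, which is exactly the value to inspect when judging robustness; error handling is again omitted.

    #include <stdio.h>
    #include <k4a/k4a.h>
    #include <k4abt.h>

    int main(void)
    {
        // Open the device and start the depth camera (as in the sketch above).
        k4a_device_t device = NULL;
        k4a_device_open(K4A_DEVICE_DEFAULT, &device);
        k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
        config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
        k4a_device_start_cameras(device, &config);

        // The body tracker is initialized with the sensor calibration.
        k4a_calibration_t calibration;
        k4a_device_get_calibration(device, config.depth_mode,
                                   config.color_resolution, &calibration);
        k4abt_tracker_t tracker = NULL;
        k4abt_tracker_configuration_t tracker_config = K4ABT_TRACKER_CONFIG_DEFAULT;
        k4abt_tracker_create(&calibration, tracker_config, &tracker);

        // Feed one capture to the tracker and pop the tracking result.
        k4a_capture_t capture = NULL;
        k4a_device_get_capture(device, &capture, K4A_WAIT_INFINITE);
        k4abt_tracker_enqueue_capture(tracker, capture, K4A_WAIT_INFINITE);
        k4a_capture_release(capture);

        k4abt_frame_t frame = NULL;
        if (k4abt_tracker_pop_result(tracker, &frame, K4A_WAIT_INFINITE)
            == K4A_WAIT_RESULT_SUCCEEDED)
        {
            uint32_t num_bodies = k4abt_frame_get_num_bodies(frame);
            for (uint32_t i = 0; i < num_bodies; i++)
            {
                k4abt_skeleton_t skeleton;
                k4abt_frame_get_body_skeleton(frame, i, &skeleton);

                // Each of the 32 joints carries a position (in mm), an orientation
                // and a confidence level, the field to check for robustness.
                k4abt_joint_t head = skeleton.joints[K4ABT_JOINT_HEAD];
                printf("Body %u: head at (%.0f, %.0f, %.0f) mm, confidence %d\n",
                       i, head.position.xyz.x, head.position.xyz.y,
                       head.position.xyz.z, (int)head.confidence_level);
            }
            k4abt_frame_release(frame);
        }

        k4abt_tracker_shutdown(tracker);
        k4abt_tracker_destroy(tracker);
        k4a_device_stop_cameras(device);
        k4a_device_close(device);
        return 0;
    }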

To integrate the Kinect with a JavaScript program, e.g. using p5js, I recommend looking at the Kinectron project.

Links

Microsoft’s Azure Kinect product page

Azure Kinect documentation page

Course chapter on the Kinect (Interaction Engineering) in German

Wikipedia on the Kinect (very informative)

For developers

Kinectron (JavaScript, including p5js)

Azure Kinect DK Code Samples Repository

Azure Kinect Library for Node / Electron (JavaScript)

Azure Kinect for Python (Python 3)

Creative Coding

It’s fascinating to see how many coding platforms are dedicated to facilitating programming specifically for artists.

The following video presents three such projects. It features Processing (a Java derivative), Cinder (a C++-based framework) and openFrameworks (also C++). All of them are free and open source.

Let’s use this opportunity to post two examples of “creative coding”, both dealing with transformations of the human body. The first one is a video called “unnamed soundsculpture”, a work by onformative.

They used Kinects to record a dancer and transformed the result with particle systems. The making-of is at least as interesting as the final result:

The second example is “Future Self”, a light sculpture that reacts to sensor input about the observer’s position and pose.

Facial Expression Replication in Realtime

The FaceShift software manages to use the depth image of the Kinect for a realtime replication of the speaker’s facial expression on an avatar’s face. Awesome. Look at the video to see how fast the approach is: there is hardly any visible latency between the original motion and the avatar, and even really subtle facial motions are translated to the virtual character. The software was developed by researchers from EPFL Lausanne, a research center with an excellent reputation, especially in the area of computer graphics. It is envisioned for use in video conferencing and online gaming, to allow a virtual face-to-face situation while speaking.
