
Interaction Engineering (2021/22)

Last winter, my course Interaction Engineering was held on campus for the first part of the semester (autumn 2021), which really helped the students connect. In November we had to switch to virtual meetings via Zoom, but by that point most students already had a good idea of which project to pursue with which partner.

This year we had a large variety of interaction modalities, from tangible interaction via gaze and gesture control to full-body interaction. All projects resulted in actual prototypes that were presented live on 2 Feb 2022.

Click on the following screen shot to get to our project page where you can browse all projects. Each project comes with a short video, a written report and a set of slides.

A new Kinect (version 3)

Microsoft’s Kinect sensor can recognize human bodies and track the movement of body joints in 3D space. In 2019 an entirely new version called Azure Kinect DK was released by Microsoft. It is the third major version of the Kinect.

Originally, the Kinect was released in 2010 (version 1, Xbox) and 2013 (version 2, Xbox One), but production was discontinued in 2017. However, Kinect technology was integrated into the HoloLens (2016) for gesture control. While the Kinect failed to become a mainstream gaming controller, it was widely used for research and prototyping in the area of human-computer interaction.

The camera looks quite different from its earlier cousins.

In early 2022 we acquired the new Azure Kinect for the Interaction Engineering course at a cost of around 750 € here in Germany.

Setting up the Kinect

The camera has two cables: a power supply and a USB connection to a PC. You have to download and install two software packages:

  • Azure Kinect SDK
  • Azure Kinect Body Tracking SDK

It feels a bit archaic because you need to run executables in the console. For instance, it is recommended that you perform a firmware update on the sensor. To do this, go into the directory of the Azure Kinect SDK and call “AzureKinectFirmwareTool.exe -Update <path to firmware>”. The firmware file is in another directory of the same package.

Next, go into the Azure Kinect Body Tracking SDK directory, where you can start the 3D viewer (in the /tools directory). Again, this executable takes a parameter, so you cannot simply double-click it in the file explorer. Type “k4abt_simple_3d_viewer.exe CPU” or “k4abt_simple_3d_viewer.exe CUDA” to start the viewer.

This is what you see (with the CPU version, this runs very slowly).

Differences between Kinect versions

The new Kinect obviously improves on various aspects of the older ones. The two most relevant are the field of view (how wide-angled the camera view is) and the number of skeleton joints that are reconstructed.

Feature | Kinect 1 | Kinect 2 | Kinect 3
Camera resolution | 640×480 | 1920×1080 | 3840×2160
Depth camera | 320×240 | 512×424 | 640×576 (narrow), 512×512 (wide)
Field of view (H × V) | 57° × 43° | 70° × 60° | 75° × 65° (narrow), 120° × 120° (wide)
Skeleton joints | 20 | 26 | 32

There is an open-access publication dedicated to the comparison between the three Kinects:

Michal Tölgyessy, Martin Dekan, Ľuboš Chovanec and Peter Hubinský (2021) Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2. In: Sensors 21 (2). Download here.

Skeleton

Here is a schematic view of the joints that are recognized. In practice, it turns out one has to pay special attention to the robustness of the signal for hands, feet and also head orientation.

(Source: https://docs.microsoft.com/bs-latn-ba/azure/kinect-dk/body-joints)
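The body tracking SDK reports each joint as a 3D position per frame. A quantity you often derive from this, e.g. to judge how far an arm is bent, is the angle at a joint (say the elbow, formed by shoulder, elbow and wrist). Here is a minimal sketch in plain JavaScript, independent of any Kinect API; the `{x, y, z}` joint format is just an assumption for illustration:

```javascript
// Angle (in degrees) at joint b, formed by the segments b->a and b->c.
// Each joint is a plain {x, y, z} object, e.g. taken from a body frame.
function jointAngle(a, b, c) {
  const u = { x: a.x - b.x, y: a.y - b.y, z: a.z - b.z };
  const v = { x: c.x - b.x, y: c.y - b.y, z: c.z - b.z };
  const dot = u.x * v.x + u.y * v.y + u.z * v.z;
  const lu = Math.hypot(u.x, u.y, u.z);
  const lv = Math.hypot(v.x, v.y, v.z);
  return (Math.acos(dot / (lu * lv)) * 180) / Math.PI;
}

// Example: shoulder, elbow and wrist on a straight line -> fully extended arm
console.log(jointAngle({x: 0, y: 0, z: 0},
                       {x: 1, y: 0, z: 0},
                       {x: 2, y: 0, z: 0})); // → 180
```

The same function works for any joint triple from the skeleton, e.g. hip–knee–ankle for a squat detector.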

To integrate the Kinect with a JavaScript program, e.g. using p5js, I recommend looking at the Kinectron project.

Links

Microsoft’s Azure Kinect product page

Azure Kinect documentation page

Course chapter on the Kinect (Interaction Engineering) in German

Wikipedia on the Kinect (very informative)

For developers

Kinectron (JavaScript, including p5js)

Azure Kinect DK Code Samples Repository

Azure Kinect Library for Node / Electron (JavaScript)

Azure Kinect for Python (Python 3)

Student projects: Interaction Engineering (2020/21)

You can now browse the Interaction Engineering projects of the “Corona” winter semester of 2020/21 under interaction.hs-augsburg.de/projects. This year a number of projects dealt with gestural, full-body and face interaction in various settings, from games to music and drawing. Teamwork was difficult under pandemic conditions, but the students handled these difficulties brilliantly.

Click on the following screen shot to get to our project page where you can browse all projects. Each project comes with a short video, a written report and a set of slides.

Student projects: Interaction Engineering (2018/19)

Check out the latest Interaction Engineering team projects of last semester under interaction.hs-augsburg.de/projects.

15 students from all over the world with different backgrounds (computing, design, business, …) successfully completed the course and submitted a finished prototype. This year a number of projects dealt with reachability on mobile devices but we also saw gestural, touch and gaze interaction, one virtual reality project and interaction with a musical instrument.

Congratulations to all students for their excellent outcomes!

Click on the following screen shot to get to our project page where you can browse all projects. Each project comes with a short video, a written report and a set of slides.

Student projects: Interaction Engineering (2017/18)

Another round of fascinating interaction engineering projects is completed. In this interdisciplinary course (computer science and design, Bachelor and Master students), we think up potential future human-computer interaction techniques based on current research publications.

This year we had 14 completed projects by 27 students, a new record after last year’s 12. Projects include interaction by gesture, full body, eye gaze, face, tangible object, HoloLens and trampoline! We even had a Lego robot.

Check out all projects (video, report, slides) under

http://interaction.hs-augsburg.de/projects

Optical Tracking / Motion Capture

We have acquired new equipment for optical tracking and motion capture at our Interaction Lab, Augsburg University of Applied Sciences. If you know the Kinect, you can think of it as a much more accurate and faster Kinect. Our system consists of 12 high-speed infrared cameras (1280×1024, 240 fps) and the respective software (Motive), all by OptiTrack. It is capable of recording human motion sequences (e.g. fight or dance moves, or everyday actions like walking or picking up an object). Actors have to be fitted with small retroreflective markers. The system emits infrared light and computes the 3D position of each marker using triangulation. One can also mark up arbitrary objects (tea cup, sword, wand) using at least three markers (these markers are then defined as so-called rigid bodies).
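The rigid-body idea can be sketched in a few lines of plain JavaScript (illustrative only, not the Motive/OptiTrack API): the pairwise distances between a body’s markers do not change as the object moves, so they act as a fingerprint for re-identifying the body from frame to frame, while the marker centroid gives the body’s position.

```javascript
// A rigid body is defined by >= 3 markers with fixed pairwise distances.
// Given the tracked marker positions of one frame, the centroid gives
// the body's 3D position.
function centroid(markers) {
  const n = markers.length;
  const sum = markers.reduce(
    (s, m) => ({ x: s.x + m.x, y: s.y + m.y, z: s.z + m.z }),
    { x: 0, y: 0, z: 0 }
  );
  return { x: sum.x / n, y: sum.y / n, z: sum.z / n };
}

// Pairwise marker distances: constant while the object moves, so they
// serve as a fingerprint that identifies the rigid body in each frame.
function pairwiseDistances(markers) {
  const d = [];
  for (let i = 0; i < markers.length; i++) {
    for (let j = i + 1; j < markers.length; j++) {
      const dx = markers[i].x - markers[j].x;
      const dy = markers[i].y - markers[j].y;
      const dz = markers[i].z - markers[j].z;
      d.push(Math.hypot(dx, dy, dz));
    }
  }
  return d;
}
```

The real system additionally recovers the body’s orientation from the marker arrangement; this sketch only covers position and identification.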

The recordings of human motion can be used for character animation in CG movies or computer games. The same data can also be streamed in realtime to external applications for interaction scenarios.

In human-computer interaction, the tracking data is used to interact with a computer via gestures and/or with your full body, depending on the application. Tracking is also interesting for VR applications where you want to interact naturally with your hands and body. The system is much faster and more accurate than e.g. the Kinect. The latency (time from original movement to visible reaction on screen) is around 5ms (depending on the processing time of the output). The spatial accuracy is around 1mm.

Our research interests target gestural interaction, e.g. comparing it with other forms of interaction like touch, mouse and other controllers.

The system is part of a research grant called Labor zur Erforschung und Entwicklung von Interaktionsmodellen mit Virtuellen Realitäten (laboratory for research and development on interaction models for virtual realities) by Rose, Müller, Rothaug, Kipp which is funded by Hochschule Augsburg.

Links

Mocap Lab Homepage

Multitouch 84

Update: I decided to return the monitor because the latency was too high. Latency is the time that passes between moving your finger and seeing a reaction on screen. This was somewhere between 150 and 180ms which made the device unusable for research purposes.

Please welcome the latest addition to our Interaction Lab at Augsburg University of Applied Sciences: an 84″ multitouch display with a motorized stand, which can be transformed into a table, a tilted table and a wall.

The monitor is part of a research grant called Labor zur Erforschung und Entwicklung von Interaktionsmodellen mit Virtuellen Realitäten (laboratory for research and development on interaction models for virtual realities) by Rose, Müller, Rothaug, Kipp which is funded by Hochschule Augsburg.

We intend to investigate how the efficiency and ergonomics of multitouch interaction can be measured in order to compare various input modalities (mouse, controller, gesture, touch). See the publications below to get an idea of our goals and methods. The new display allows us to extend our previous work to large display sizes and multi-party scenarios.
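As one illustration of how pointing efficiency is commonly quantified in this research area (these are the standard Fitts’-law measures, not necessarily the exact metrics used in the publications below):

```javascript
// Fitts' law, Shannon formulation: the difficulty of a pointing/touch task
// grows with target distance D and shrinks with target width W.
function indexOfDifficulty(D, W) {
  return Math.log2(D / W + 1); // in bits
}

// Throughput in bits per second for a measured movement time MT (seconds):
// a single number for comparing input modalities (mouse, touch, gesture, ...).
function throughput(D, W, MT) {
  return indexOfDifficulty(D, W) / MT;
}

// Example: a target 3 units away and 1 unit wide, reached in 0.5 s
console.log(indexOfDifficulty(3, 1)); // → 2 (bits)
console.log(throughput(3, 1, 0.5));   // → 4 (bits/s)
```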

Some technical data:

  • 84″
  • 4K resolution (3840×2160)
  • 50 touch points

Related Publications

Lehmann, Florian (2016) Ergonomie von Multi-Touch Oberflächen, Bachelorarbeit, Studiengang Interaktive Medien, Hochschule Augsburg. | Bachelorarbeit | Präsentation | Poster | Read the blog post about this work

Nguyen, Q., and Kipp, M. (2015) Where to Start? Exploring the efficiency of translation movements on multitouch devices. In: Proceedings of the 15th IFIP TC 13 International Conference (INTERACT), Springer, pp. 173-191.

Nguyen, Q., Kipp, M. (2014) Orientation Matters: Efficiency of translation-rotation multitouch tasks. In: Proc. of CHI 2014. Link to Video.

Student projects: Interaction Engineering (2016/17)

The next group of talented students completed their interaction engineering projects. In this interdisciplinary course (computer science and design), we think up potential future human-computer interaction techniques based on current research publications. This semester there was a record-breaking set of 12 completed projects. Feel free to check them out by visiting the project website

http://interaction.hs-augsburg.de/projects


Actuated Tangibles: ChainFORM

After the inFORM project (see my post from 2013) here is another spectacular research outcome from Professor Ishii’s Tangible Media Group at MIT.

The idea of tangible interaction goes back as far as 1997, when Ishii first formulated his vision of bringing physical objects back into human-computer interfaces. He invented physical controls that allow you to manipulate digital data more intuitively.

Pushing this idea a step further, Ishii wondered how to bring digital information back into the real world using actuated tangibles that can dynamically reflect changes in the digital information. One problem is changing the position of physical controls (e.g. by air, vibration or magnetic control); even more challenging is changing the shape of physical controls on the fly. Both inFORM and ChainFORM deal with the problem of changing shape dynamically.

Relevant Publications

Ken Nakagaki, Artem Dementyev, Sean Follmer, Joseph A. Paradiso, Hiroshi Ishii. ChainFORM: A Linear Integrated Modular Hardware System for Shape Changing Interfaces. In Proceedings of the 29th Annual ACM Symposium on User Interface Software & Technology (UIST ’16).

Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge, and Hiroshi Ishii. 2013. inFORM: dynamic physical affordances and constraints through shape and object actuation. In Proceedings of the 26th annual ACM symposium on User interface software and technology (UIST ’13). ACM, New York, NY, USA, 417-426.

Hiroshi Ishii and Brygg Ullmer. 1997. Tangible bits: towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human factors in computing systems (CHI ’97). ACM, New York, NY, USA, 234-241.

Bachelor Thesis: Florian Lehmann – Ergonomics of Multi-Touch Surfaces (2016)

Lehmann, Florian (2016) Ergonomie von Multi-Touch Oberflächen, Bachelorarbeit, Studiengang Interaktive Medien, Hochschule Augsburg. | Bachelorarbeit | Präsentation | Poster


The investigation of multi-touch surfaces on smartphones is a relevant topic in the field of human-computer interaction. The main focus is to analyze and understand touch input in detail, as well as to deliver comprehensible insights for user interface designers and developers.


Copyright © 2024 Michael Kipp's Blog
