Tag: Interaction

Student project: Reäktor (2012)

Reäktor is a media-enhanced dance performance. It is a 6th semester student project in the Interactive Media programme at Augsburg University of Applied Sciences (summer 2012).

“Light Visual Art Dance Performance” What happens when you combine light, performance art, dance, and modern technologies such as mapping and tracking? One answer is Reäktor, a dance performance conceived and developed at the Faculty of Design of Augsburg University of Applied Sciences that creates a symbiosis of real-time 3D tracking, abstract visuals, modern electronic music, and classical dance. It aims to show that these different elements are by no means mutually exclusive. The performance is based on a story about encounters between people, and about how these encounters and moments of contact leave traces in their lives and surroundings. These traces are rendered as abstract plays of light and shadow, created experimentally using modern 3D tracking methods. The piece is aimed at a culturally minded audience and media-art enthusiasts with a particular interest in dance and in new forms of performance built around it. The ambitious group of eight students, supervised by Prof. Rose and Prof. Kipp of Augsburg University of Applied Sciences, has been working for half a year on the concept and realization of this dance performance, in which dancers from the Augsburg theatre ensemble will also take part.

Student team: Matthias Lohscheidt, Christian Unterberg, Sven ten Pas, Thomas Ott, Benjamin Knöfler, Michael Klein, Dennis Schnurer, Florian Krapp

Supervision: Prof. Robert Rose, Prof. Dr. Michael Kipp

Student project: What the Face (2012)

What the Face is an interactive media installation about attention in a world submerged in media. It is a 6th semester student project in the Interactive Media programme at Augsburg University of Applied Sciences (summer 2012).

In times of social networks, cross media, and advertising on every corner, the permanent flood of information has left us numb. It is becoming ever harder to capture people's attention and to hold it for as long as possible. Extremes become necessary to touch the viewer, and dwell time becomes the decisive metric on the web. What the Face represents exactly these current conditions of the media world in exaggerated form: the confrontation between an attention-seeking crowd and a single individual takes place here on a direct level. A wall of 40 anonymized faces confronts the visitor with extreme emotional reactions.

Student team: Evelin Kremer, Franziska Kästle, Johanna Krünes, Philippe Steinmayr, Cathrin Weihermann, Marlitt Messmann, Aren Danielian, Nadine Vogt

Supervision: Prof. Robert Rose, Prof. Dr. Michael Kipp


Leap Motion v2: Much improved!

The Leap Motion controller lets you track your hands and fingers, much like Microsoft's Kinect lets you track the whole body. This makes it possible to build in-air gesture interfaces, e.g. for controlling a robot hand (remote surgery), playing games (shooting guns, flying planes), or playing musical instruments (plucking strings).

However, when the Leap Motion device was released in 2013, it was quite a disappointment. The sensor plus software often lost track of individual fingers, usually when the hand was rotated (even by a small angle). This greatly limited its practical use in applications. It was never clear whether this was a software problem or a limitation of the hardware.

Now Leap Motion has released new software which seems to indicate that it was indeed a software problem. Even better: it has been fixed in the new v2 release. Have a look at this video. By the way, I've given it a try myself, and the signals seem to be much, much more stable.

To download the beta version of the new software, go here: https://developer.leapmotion.com
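If you want to experiment with the new tracking yourself, here is a minimal sketch using the Python bindings that ship with the v2 SDK. It simply prints palm positions and finger counts; class and property names follow the SDK's sample code and may differ slightly in later releases.

```python
import sys
import Leap  # Python bindings shipped with the Leap Motion v2 SDK

class PrintListener(Leap.Listener):
    def on_connect(self, controller):
        print("Connected to the Leap Motion service")

    def on_frame(self, controller):
        frame = controller.frame()
        for hand in frame.hands:
            pos = hand.palm_position  # millimeters, relative to the device
            print("%s hand at (%.0f, %.0f, %.0f) with %d fingers" % (
                "Left" if hand.is_left else "Right",
                pos.x, pos.y, pos.z, len(hand.fingers)))

listener = PrintListener()
controller = Leap.Controller()
controller.add_listener(listener)  # callbacks arrive on a separate thread
print("Tracking... press Enter to quit.")
sys.stdin.readline()
controller.remove_listener(listener)
```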


Let’s Get Physical

A current direction in human-computer interaction is concerned with bringing physical elements like real buttons, cubes, bricks, rulers etc. back into the world of UIs. This approach was coined “Tangible User Interfaces” (TUIs) by Ishii's group at MIT (Ishii and Ullmer 1997). The idea is to exploit the physical properties of an object, because they suggest certain operations (round objects invite you to rotate them, buttons invite pressing), and because there is a sensual experience (haptic feedback) involved that is sometimes painfully missing in current touch interfaces.

Here’s an example that is based on the original ideas of Ullmer and Ishii (metaDESK):

Augmented Urban Model from Katja Knecht on Vimeo.

So in TUIs you control digital information (e.g. a desktop) by manipulating physical objects. But what about the other way round? That is, if the digital information changes, how can the physical objects change along with it? This opens up a whole new set of possibilities, but also of technical challenges (Ishii et al. 2012).

The latest incarnation of this idea is inFORM, again from Ishii’s Tangible Media research group at MIT (Follmer et al. 2013).
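To get a feel for how such a shape display is driven, here is a hypothetical sketch (not inFORM's actual software): a depth image, e.g. from a Kinect, is downsampled to the pin grid and mapped to pin heights, so that whatever the camera sees is physically extruded from the table.

```python
import numpy as np

# Hypothetical sketch of driving a pin-based shape display. inFORM uses
# a 30x30 grid of motorized pins; the constants below are assumptions.

GRID_W, GRID_H = 30, 30      # number of pins
PIN_TRAVEL_MM = 100.0        # assumed maximum pin extension

def depth_to_pin_heights(depth_mm, near=500.0, far=1500.0):
    """Map a depth image (in mm) to a GRID_H x GRID_W array of pin
    heights: nearer surfaces raise pins higher, far ones stay flat."""
    h, w = depth_mm.shape
    bh, bw = h // GRID_H, w // GRID_W
    # Downsample by averaging blocks to the pin grid resolution.
    blocks = depth_mm[:bh * GRID_H, :bw * GRID_W].reshape(
        GRID_H, bh, GRID_W, bw).mean(axis=(1, 3))
    t = np.clip((far - blocks) / (far - near), 0.0, 1.0)
    return t * PIN_TRAVEL_MM  # heights to send to the pin actuators

# Example with a fake 640x480 depth frame:
frame = np.random.uniform(500.0, 1500.0, size=(480, 640))
print(depth_to_pin_heights(frame).shape)  # (30, 30)
```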

In the arts, the idea of bringing the digital back into the physical is manifest in “kinetic sculptures”, i.e. sculptures that change over time. Here is one impressive example at Singapore’s Changi Airport:

“Kinetic Rain” Changi Airport Singapore from ART+COM on Vimeo.

Literature

Hiroshi Ishii and Brygg Ullmer (1997) Tangible bits: towards seamless interfaces between people, bits and atoms. In: Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’97). ACM, New York, NY, USA, 234-241.

Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge, and Hiroshi Ishii (2013) inFORM: dynamic physical affordances and constraints through shape and object actuation. In Proceedings of the 26th annual ACM symposium on User interface software and technology (UIST ’13). ACM, New York, NY, USA, 417-426.

Hiroshi Ishii, Dávid Lakatos, Leonardo Bonanni, and Jean-Baptiste Labrune. (2012) Radical atoms: beyond tangible bits, toward transformable materials. interactions 19, 1 (January 2012), 38-51.

Interfaces with the Kinect

Walking in 3D

Here, the Kinect is used to navigate through Google Street View:

  • body orientation => rotate sideways
  • body tilt => rotate up/down
  • walking (on the spot) => move forward
  • jump => move forward by some distance

Also see post on developkinect.com
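The mappings above are easy to reconstruct from raw skeleton data. Here is a hypothetical sketch (not the original project's code) that derives navigation commands from a Kinect skeleton, assumed here to be a dict mapping joint names to (x, y, z) coordinates in meters:

```python
import math

# Hypothetical sketch of mapping Kinect skeleton data to Street View
# navigation commands. Joint names and thresholds are assumptions.

def body_yaw(skel):
    """Rotation of the shoulder line around the vertical axis (radians)."""
    lx, _, lz = skel["shoulder_left"]
    rx, _, rz = skel["shoulder_right"]
    return math.atan2(rz - lz, rx - lx)

def body_pitch(skel):
    """Forward/backward lean of the spine (radians)."""
    hx, hy, hz = skel["head"]
    sx, sy, sz = skel["spine_base"]
    return math.atan2(hz - sz, hy - sy)

def is_stepping(skel, threshold=0.10):
    """Walking on the spot: one knee clearly higher than the other."""
    return abs(skel["knee_left"][1] - skel["knee_right"][1]) > threshold

def navigation_command(skel):
    # Jump detection (spine_base rising sharply between frames) is
    # omitted here for brevity.
    if is_stepping(skel):
        return "move_forward"
    if abs(body_yaw(skel)) > math.radians(15):
        return "rotate_left" if body_yaw(skel) > 0 else "rotate_right"
    if abs(body_pitch(skel)) > math.radians(10):
        return "look_up" if body_pitch(skel) > 0 else "look_down"
    return "idle"
```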

In-Air Interaction with ARCADE

ARCADE combines several interaction techniques to create and control virtual 3D objects that are placed into the live video of the presenter:

  • selection/picking: hold hand over object
  • menu: browse with finger position, select with swipe gesture
  • drawing: two finger touching switches to draw mode, go to 3D rotate mode with a key press
  • rotate/scale: rotate with finger up/down/left/right, scale with two fingers
  • delete: wave gesture over object

Also see post on developkinect.com
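Of these techniques, the “hold hand over object” selection is the simplest to reconstruct: it is essentially dwell-based picking. A minimal sketch of that idea (illustrative, not ARCADE's actual code):

```python
import time

DWELL_SECONDS = 1.0  # assumed hover time that triggers a selection

class DwellSelector:
    """Selects an object once the hand has hovered over it long enough."""

    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.current = None   # object currently under the hand
        self.since = None     # when the hand started hovering over it

    def update(self, hovered_object, now=None):
        """Call once per frame with the object under the hand (or None).
        Returns the selected object when the dwell time is reached."""
        now = time.monotonic() if now is None else now
        if hovered_object is not self.current:
            self.current, self.since = hovered_object, now
            return None
        if self.current is not None and now - self.since >= self.dwell:
            self.since = now  # re-arm so it does not fire every frame
            return self.current
        return None
```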

Gesture Recognition: Kinetic Space 2.0

The toolkit allows you to define your own 3D gestures.

Also see post on developkinect.com
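Recognizers of this kind typically compare a live joint trajectory against recorded gesture templates, for instance with dynamic time warping (DTW). The sketch below shows the general matching step; it illustrates the principle and is not Kinetic Space's actual implementation.

```python
# Dynamic time warping (DTW) between two 3D joint trajectories: a common
# way to match a live gesture against a recorded template.

def dist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def dtw(template, live):
    """template, live: lists of (x, y, z) joint positions.
    Returns a length-normalized distance; lower means more similar."""
    n, m = len(template), len(live)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(template[i - 1], live[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # skip a template point
                                 D[i][j - 1],      # skip a live point
                                 D[i - 1][j - 1])  # match both
    return D[n][m] / (n + m)

# A gesture counts as recognized when dtw() falls below a tuned threshold.
```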

Finger Tracking with DUO: Competition for the Leap Motion

Today I saw a video on the NUI Group’s channel which featured a DIY device for close-range finger tracking, not unlike the Leap Motion device. It is called DUO, and here’s what it can do:


The Leap Motion device should ship soon (April 2013), whereas the DUO is just about to start a Kickstarter project to collect funding. The main difference between the two projects is the open-source and DIY philosophy of the DUO versus the strictly commercial license philosophy of Leap Motion. Technically, the two devices also seem to differ (see a forum thread by the DUO makers).

Homepage: http://duo3d.com
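Devices like the DUO work with a calibrated stereo camera pair: once a fingertip has been located in both images, its depth follows from the disparity between the two views. A minimal sketch of that triangulation step (illustrative values, not DUO's implementation):

```python
FOCAL_PX = 700.0    # assumed focal length in pixels (from calibration)
BASELINE_M = 0.06   # assumed distance between the two cameras in meters

def depth_from_disparity(x_left, x_right):
    """x_left/x_right: horizontal pixel position of the same fingertip
    in the rectified left and right image. Returns depth in meters."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must be in front of the cameras")
    return FOCAL_PX * BASELINE_M / disparity

# Example: fingertip at x=340 in the left and x=310 in the right image
print("%.2f m" % depth_from_disparity(340.0, 310.0))  # 1.40 m
```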

Gaze Interaction

While the current focus in HCI is on sensor-based interaction (à la Kinect), recent developments could foster interaction with the eyes. The Danish company EyeTribe (formerly Senseye) is building a very nice tracking system with $2.3 million in support from the Danish government. Partnering companies include the IT University of Copenhagen, DTU Informatics, LEGO and Serious Games Interactive.

EyeTribe plans to release an SDK for app development next year.
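Since gaze has no mouse button, the classic way to “click” with the eyes is dwell time: fixate a target long enough and it triggers. Here is a hypothetical sketch of that idea; the gaze_stream source and button layout are made up, but eye-tracking SDKs generally deliver a comparable stream of (x, y) screen coordinates.

```python
import time

BUTTONS = {"play": (100, 100, 200, 150), "stop": (300, 100, 400, 150)}
DWELL = 0.8  # seconds of steady fixation that count as a click

def hit(x, y):
    """Return the name of the button under the gaze point, if any."""
    for name, (x0, y0, x1, y1) in BUTTONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def run(gaze_stream):
    """gaze_stream yields (x, y) gaze samples, e.g. 30-60 per second."""
    target, since = None, None
    for (x, y) in gaze_stream:
        now = time.monotonic()
        current = hit(x, y)
        if current != target:
            target, since = current, now
        elif target is not None and now - since >= DWELL:
            print("clicked:", target)
            since = now  # re-arm after triggering
```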
