Michael Kipp's Blog


Android Wear is coming…

While we’re waiting for Google Glass, the smartwatch revolution might arrive first.

Let’s Get Physical

A current direction in human-computer interaction is concerned with bringing physical elements like real buttons, cubes, bricks, and rulers back into the world of UIs. Ishii and Ullmer from MIT coined the term “Tangible User Interfaces” (TUIs) for this (Ishii, Ullmer 1997). The idea is to exploit the physical properties of the objects because they suggest certain operations (round objects invite rotation, buttons invite pressing) and because they involve a sensual experience (haptic feedback) that is sometimes painfully absent from current touch interfaces.

Here’s an example based on the original ideas of Ullmer and Ishii (metaDESK):

Augmented Urban Model from Katja Knecht on Vimeo.

So in TUIs you control digital information (e.g. a desktop) by manipulating physical objects. But what about the other direction: if the digital information changes, how can the physical objects change with it? This opens up a whole new set of possibilities, but also of technical challenges (Ishii et al. 2012).

The latest incarnation of this idea is inFORM, again from Ishii’s Tangible Media research group at MIT (Follmer et al. 2013):

In the arts the idea of bringing digital back to physical is manifest in “kinetic sculptures”, i.e. sculptures that change over time. Here is one impressive example at Singapore airport:

“Kinetic Rain” Changi Airport Singapore from ART+COM on Vimeo.


Hiroshi Ishii and Brygg Ullmer (1997) Tangible bits: towards seamless interfaces between people, bits and atoms. In: Proceedings of the ACM SIGCHI Conference on Human factors in computing systems (CHI ’97). ACM, New York, NY, USA, 234-241.

Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge, and Hiroshi Ishii (2013) inFORM: dynamic physical affordances and constraints through shape and object actuation. In Proceedings of the 26th annual ACM symposium on User interface software and technology (UIST ’13). ACM, New York, NY, USA, 417-426.

Hiroshi Ishii, Dávid Lakatos, Leonardo Bonanni, and Jean-Baptiste Labrune. (2012) Radical atoms: beyond tangible bits, toward transformable materials. interactions 19, 1 (January 2012), 38-51.

Noise Cancellation Visualization and Touch Interface

A new noise-cancellation device with an interesting visualization and touch interface.

See article on sciencedump

Kinect Project Kreek

A nice project where the Kinect is used to create a “depth touch” experience. One scenario lets you push through a body to see the anatomy underneath; another lets you create valleys into which balls roll.
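The “depth touch” idea can be sketched roughly like this: compare the live depth frame against a calibrated background and treat any region that moved closer to the camera as a push into the virtual surface. This is a minimal illustration of the general principle, not the project’s actual code; the function name and threshold are made up.

```python
def depth_touch_map(background, frame, threshold=30):
    """Return the per-pixel push depth (mm) where the user intrudes into the scene.

    background, frame: 2D lists of depth values in millimetres.
    threshold: minimum intrusion before a push registers (sensor noise floor).
    """
    pushes = []
    for bg_row, row in zip(background, frame):
        pushes.append([
            bg - d if (bg - d) > threshold else 0
            for bg, d in zip(bg_row, row)
        ])
    return pushes

background = [[1000, 1000], [1000, 1000]]
frame      = [[1000,  900], [ 995, 1000]]  # hand pushed 100 mm in at (0, 1)
print(depth_touch_map(background, frame))  # [[0, 100], [0, 0]]
```

The resulting push map could then drive the deformation of a rendered surface, e.g. the depth of a valley that balls roll into.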

Klangfiguren // Kreek – Kinect controlled Interface from Lukas Hoeh on Vimeo.

Developed by students at the Köln International School of Design (KISD).

Round Smartphone: Roll Effect

A newly announced Samsung smartphone has a curved case. One interesting interaction technique is to roll it slightly by pressing on one rim; the phone then assumes you want to see the time (and other information).
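Detecting such a roll could plausibly be done with the phone’s accelerometer: while the device rests flat, gravity points along its z-axis, and pressing a rim tilts the gravity vector. The sketch below is my own guess at the mechanism, not Samsung’s implementation; the threshold is illustrative.

```python
import math

def detect_roll(ax, ay, az, threshold_deg=15):
    """Return True when the phone is tilted about its long axis beyond the
    threshold, i.e. the user pressed one rim of the curved case.

    Accelerometer values in m/s^2; a phone lying flat reads roughly (0, 0, 9.81).
    """
    roll = math.degrees(math.atan2(ax, az))
    return abs(roll) > threshold_deg

print(detect_roll(0.0, 0.0, 9.81))  # False: lying flat
print(detect_roll(3.0, 0.0, 9.3))   # True: rolled about 18 degrees
```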

Projection Mapping

Stunning projection mapping and choreography!

Interfaces with the Kinect

Walking in 3D

Here, the Kinect is used to navigate through Google Street View:

  • body orientation => rotate sideways
  • body tilt => rotate up/down
  • walking (on the spot) => move forward
  • jump => move forward by some distance
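The mapping above can be sketched as a simple decision function over pose features extracted from the Kinect skeleton. The feature names and thresholds are my own illustrative guesses, not the project’s actual values.

```python
def navigation_command(shoulder_angle_deg, torso_tilt_deg, step_detected, jump_detected):
    """Map Kinect body-pose features to Street View navigation commands,
    following the body orientation / tilt / walking / jump mapping above."""
    if jump_detected:                    # jump => move forward by some distance
        return "move_forward_far"
    if step_detected:                    # walking on the spot => move forward
        return "move_forward"
    if abs(shoulder_angle_deg) > 20:     # body orientation => rotate sideways
        return "rotate_left" if shoulder_angle_deg > 0 else "rotate_right"
    if abs(torso_tilt_deg) > 15:         # body tilt => rotate up/down
        return "rotate_up" if torso_tilt_deg > 0 else "rotate_down"
    return "idle"

print(navigation_command(30, 0, False, False))  # rotate_left
print(navigation_command(0, 0, False, True))    # move_forward_far
```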

Also see post on developkinect.com

In-Air Interaction with ARCADE

A combination of interaction techniques to create and control virtual 3D objects that are placed into the live video of the presenter.

  • selection/picking: hold hand over object
  • menu: browse with finger position, select with swipe gesture
  • drawing: two finger touching switches to draw mode, go to 3D rotate mode with a key press
  • rotate/scale: rotate with finger up/down/left/right, scale with two fingers
  • delete: wave gesture over object
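The first of these techniques, hover-based picking, is easy to sketch: an object counts as selected when the tracked hand position stays within some radius of it. This is a generic illustration of the idea, not ARCADE’s code; the coordinate convention and radius are assumptions.

```python
def pick_object(hand_xy, objects, hover_radius=0.1):
    """Selection/picking by hovering: return the name of the first object
    whose centre lies within hover_radius of the hand.

    hand_xy: (x, y) hand position in normalised screen coordinates.
    objects: dict mapping object names to (x, y) centres.
    """
    hx, hy = hand_xy
    for name, (ox, oy) in objects.items():
        if (hx - ox) ** 2 + (hy - oy) ** 2 <= hover_radius ** 2:
            return name
    return None

scene = {"cube": (0.52, 0.50), "sphere": (0.10, 0.90)}
print(pick_object((0.50, 0.50), scene))  # cube
print(pick_object((0.90, 0.10), scene))  # None
```

A full system would additionally require the hand to dwell for a moment before confirming the pick, so that merely passing over an object does not trigger a selection.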

Also see post on developkinect.com

Gesture Recognition: Kinetic Space 2.0

The toolkit lets you define your own 3D gestures.
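A common way to match a user-defined gesture (not necessarily the one Kinetic Space uses) is to record a template trajectory of a joint and compare live input against it with dynamic time warping, which tolerates differences in speed. A minimal 1D sketch:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1D trajectories,
    e.g. the height profile of a tracked hand over time."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three possible alignments
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

template = [0.0, 0.5, 1.0, 0.5, 0.0]        # recorded "raise hand" profile
live     = [0.0, 0.4, 0.6, 1.0, 0.4, 0.0]   # same gesture, different timing
print(dtw_distance(live, template) < 1.0)   # True: close enough to match
```

Real skeleton gestures would use 3D joint positions and normalise for body size, but the matching principle is the same.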

Also see post on developkinect.com

Real-Time Facial Animation

Photorealism is slowly reaching a stage where the “uncanny valley” does not apply anymore. Activision just showed an impressive video of a face animated in real time at GDC 2013 (Game Developers Conference):

One of the collaborators is Paul Debevec (USC ICT, California), one of the superstars of the computer graphics community and inventor of the “light stage”, a device that can recreate a huge range of lighting moods for a piece of video. Another of his projects was “Digital Emily” (2008), which is probably the base technology behind the Activision demo. Here’s a talk at TEDx:

Anybody interested in facial animation should have a look at the German Animationsinstitut of the Filmakademie Baden-Württemberg who developed the Facial Animation Toolset based on scientific findings on facial expressions (most notably by researcher Paul Ekman who developed the facial action coding system aka FACS).


Alexander, O., Rogers, M., Lambeth, W., Chiang, M., Debevec, P. (2009) Creating a Photoreal Digital Actor: The Digital Emily Project. IEEE European Conference on Visual Media Production (CVMP), November 2009. (Also to appear in IEEE Computer Graphics and Applications.)

Paul Debevec, Tim Hawkins, Chris Tchou, Haarm-Pieter Duiker, Westley Sarokin, and Mark Sagar. 2000. Acquiring the reflectance field of a human face. In Proceedings of the 27th annual conference on Computer graphics and interactive techniques (SIGGRAPH 2000). ACM Press/Addison-Wesley Publishing Co., New York, NY, USA, 145-156.


Finger Tracking with DUO: Competition for the Leap Motion

Today I saw a video on the channel of the NUI Group which featured a DIY device for close range finger tracking, not unlike the Leap Motion device. It is called DUO and here’s what it can do:

The Leap Motion device should ship soon (April 2013), whereas the DUO is just about to launch a Kickstarter project to collect funding. The main difference between the two projects is the open-source, DIY philosophy of the DUO versus the strictly commercial licensing of the Leap Motion. Technically, the two devices also seem to differ (see a forum thread by the DUO makers).
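A stereo rig like the DUO recovers depth by triangulation: a fingertip found in both camera images appears shifted horizontally by a disparity, and depth follows from the standard relation depth = focal length × baseline / disparity. The numbers below are made up for illustration and are not the DUO’s actual calibration values.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Standard stereo triangulation.

    disparity_px: horizontal pixel shift of the same point between the views.
    focal_px:     camera focal length expressed in pixels.
    baseline_mm:  distance between the two camera centres.
    Returns depth in millimetres.
    """
    if disparity_px <= 0:
        raise ValueError("point not matched between the two views")
    return focal_px * baseline_mm / disparity_px

# A fingertip shifted 40 px between views, with a 600 px focal length
# and a 30 mm baseline, sits 450 mm from the cameras.
print(depth_from_disparity(disparity_px=40, focal_px=600, baseline_mm=30))  # 450.0
```

Note how a small baseline (as in a compact DIY device) limits depth resolution at larger distances, which is why such rigs target close-range finger tracking.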

Homepage: http://duo3d.com

Creative Coding

It’s fascinating to see how many coding platforms are dedicated to making programming easier specifically for artists.

The following video presents three such projects. It features Processing (a Java derivative), Cinder (a C++ based framework) and OpenFrameworks (also C++). All of them are free and open source.

Let’s use this opportunity to post two examples of “creative coding”, both dealing with transformations of the human body. The first one is a video called “unnamed soundsculpture”, a work by onformative.

They used Kinects to record a dancer and particle systems to transform the result. The making-of is at least as interesting as the final result:
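The particle systems behind such pieces boil down to a simple per-frame update: each particle carries a velocity that is perturbed by turbulence, damped by drag, and integrated into its position. A toy sketch of that loop (parameters are illustrative, not onformative’s actual values):

```python
import random

def step_particles(particles, dt=0.033, drag=0.98, turbulence=0.5, rng=None):
    """One update step of a toy particle system: random turbulence kick,
    velocity drag, then Euler integration of the position."""
    rng = rng or random.Random(0)
    for p in particles:
        for axis in range(3):
            p["vel"][axis] += rng.uniform(-turbulence, turbulence) * dt
            p["vel"][axis] *= drag
            p["pos"][axis] += p["vel"][axis] * dt

# Seed the system with points sampled from the Kinect scan of the dancer;
# here a single particle drifting along +x stands in for that point cloud.
particles = [{"pos": [0.0, 0.0, 0.0], "vel": [1.0, 0.0, 0.0]}]
step_particles(particles)
print(particles[0]["pos"][0] > 0)  # True: the particle drifted along +x
```

Seeding the particles from the scanned body and then letting forces like these take over is what produces the dissolving, wind-swept look in the video.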

The second example is “Future Self”, a light sculpture that responds to sensor input about the observer’s position and pose.

