Michael Kipp's Blog


Multitouch 84

Please welcome the latest addition to our Interaction Lab at Augsburg University of Applied Sciences: an 84″ multitouch display with a motorized stand that can be transformed into a table, a tilted table, or a wall.

The monitor is part of a research grant called Labor zur Erforschung und Entwicklung von Interaktionsmodellen mit Virtuellen Realitäten (laboratory for research and development of interaction models for virtual realities) by Rose, Müller, Rothaug, and Kipp, funded by Hochschule Augsburg.

We intend to investigate how the efficiency and ergonomics of multitouch interaction can be measured in order to compare various input modalities (mouse, controller, gesture, touch). See the publications below to get an idea of our goals and methods. The new display allows us to extend our previous work to large display sizes and multi-party scenarios.
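To give a flavor of what “measuring efficiency” means in this line of work: our studies typically compare measures such as trial completion time across modalities (see the publications below). A toy sketch of such a comparison in Python, with made-up numbers rather than data from our studies:

```python
import statistics

# Hypothetical completion times (seconds) per input modality --
# illustrative numbers only, not results from our studies.
trials = {
    "mouse":   [1.21, 1.05, 1.33, 1.18],
    "touch":   [0.98, 1.10, 0.91, 1.02],
    "gesture": [1.75, 1.60, 1.82, 1.69],
}

for modality, times in trials.items():
    print("%-8s mean=%.2fs sd=%.2fs" %
          (modality, statistics.mean(times), statistics.stdev(times)))
```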

Some technical data:

  • 84″
  • 4K resolution (3840×2160)
  • 50 touch points
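Note that at this size, 4K is not particularly dense. A quick back-of-the-envelope calculation of the pixel density:

```python
import math

# Pixel density = diagonal resolution / diagonal size in inches.
diagonal_px = math.hypot(3840, 2160)  # ~4405.8 pixels
ppi = diagonal_px / 84                # ~52.4 pixels per inch
print("%.1f ppi" % ppi)
```

Roughly 52 ppi, i.e. considerably coarser than a typical desktop monitor, which matters for precise touch targets viewed up close.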

Related Publications

Lehmann, Florian (2016) Ergonomie von Multi-Touch Oberflächen (Ergonomics of Multi-Touch Surfaces), Bachelor's thesis, Interactive Media program, Hochschule Augsburg. | Thesis | Presentation | Poster | Read the blog post about this work

Nguyen, Q., and Kipp, M. (2015) Where to Start? Exploring the efficiency of translation movements on multitouch devices. In: Proceedings of 15th IFIP TC 13 International Conference (INTERACT), Springer, pp. 173-191.

Nguyen, Q., and Kipp, M. (2014) Orientation Matters: Efficiency of translation-rotation multitouch tasks. In: Proceedings of CHI 2014. Link to Video.

Actuated Tangibles: ChainFORM

After the inFORM project (see my post from 2013) here is another spectacular research outcome from Professor Ishii’s Tangible Media Group at MIT.

The idea of tangible interaction goes back as far as 1997, when Ishii first formulated his vision of bringing physical items back into human-computer interfaces. He invented physical controls that allow you to manipulate digital data more intuitively.

Pushing this idea a step further, Ishii wondered how to bring digital information back into the real world using actuated tangibles that can dynamically reflect changes in the digital information. One problem is changing the position of physical controls (e.g. by air, vibration, or magnetic control); changing the shape of physical controls on the fly is even more challenging. Both inFORM and ChainFORM tackle the problem of changing shape dynamically.

Relevant Publications

Ken Nakagaki, Artem Dementyev, Sean Follmer, Joseph A. Paradiso, and Hiroshi Ishii. 2016. ChainFORM: A Linear Integrated Modular Hardware System for Shape Changing Interfaces. In Proceedings of the 29th Annual ACM Symposium on User Interface Software & Technology (UIST '16).

Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge, and Hiroshi Ishii. 2013. inFORM: dynamic physical affordances and constraints through shape and object actuation. In Proceedings of the 26th annual ACM symposium on User interface software and technology (UIST ’13). ACM, New York, NY, USA, 417-426.

Hiroshi Ishii and Brygg Ullmer. 1997. Tangible bits: towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human factors in computing systems (CHI ’97). ACM, New York, NY, USA, 234-241.

Microsoft Vision Video 2020

Another “Future vision” video from Microsoft that contains snippets of older vision videos. In this video, they added a gesture-controlled bracelet/smart watch (0:17) and a 3D holographic display (0:27).

See my older post for more videos.

 

Ontenna: Sound to Vibration Device

Tatsuya Honda developed a device that translates sound into vibrations (and light), making it possible for Deaf people to react to environmental sounds and even to differentiate between different sounds. The device is worn like a hairclip, as can be seen in the video below. It is still a prototype.

Read article on venturebeat.com
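The device itself is not open hardware, but the basic principle is easy to sketch: continuously measure the microphone's loudness and map it to a vibration intensity. Here is a minimal sketch in Python; the mapping function and thresholds are my assumptions, not Honda's implementation:

```python
import numpy as np
import sounddevice as sd

# Sketch of the general sound-to-vibration principle: compute the RMS
# loudness of the microphone signal and map it to a motor intensity.

def loudness_to_pwm(rms, floor=0.01, ceil=0.3):
    """Map an RMS amplitude to a hypothetical 0-255 vibration-motor value."""
    level = np.clip((rms - floor) / (ceil - floor), 0.0, 1.0)
    return int(level * 255)

def callback(indata, frames, time, status):
    rms = float(np.sqrt(np.mean(indata ** 2)))
    # Printing stands in for driving an actual vibration motor.
    print("rms=%.3f -> pwm=%3d" % (rms, loudness_to_pwm(rms)), end="\r")

with sd.InputStream(channels=1, samplerate=16000, callback=callback):
    sd.sleep(10_000)  # listen for 10 seconds
```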

 

City Pulse: Circular multi-screen installation with gesture control

A new installation on the 100th floor of the 1 World Trade Center features circular screens with gesture control.

The project was realized by Local Projects.

Leap Motion v2: Much improved!

The Leap Motion controller lets you track your hands and fingers, much like Microsoft's Kinect lets you track the whole body. This lets you create in-air gesture interfaces for e.g. controlling a robot hand (remote surgery), playing games (shooting guns, flying planes), or playing musical instruments (plucking strings).

However, when the Leap Motion device was released in 2013, it was quite a disappointment. The sensor plus software often lost track of individual fingers, usually when the hand was rotated (even by a small angle). This greatly limited its practical use in applications. It was never clear whether this was a software problem or a limitation of the hardware.

Now Leap Motion has released new software that fixes this, which suggests it was a software problem all along. Look at this video. By the way, I've given it a try myself and the signals seem much, much more stable.

To download the beta version of the new software, go here: https://developer.leapmotion.com
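If you want to play with the tracking data yourself, the SDK ships Python bindings. A minimal polling sketch, assuming the Leap service is running and the SDK's Leap module is on your path:

```python
import time
import Leap  # Python bindings shipped with the Leap Motion SDK

controller = Leap.Controller()

while True:
    frame = controller.frame()  # most recent tracking frame
    for hand in frame.hands:
        p = hand.palm_position  # Leap.Vector, in millimeters
        print("hand at (%.0f, %.0f, %.0f) mm, %d fingers tracked" %
              (p.x, p.y, p.z, len(hand.fingers)))
    time.sleep(0.1)  # poll at ~10 Hz
```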

 

Virtual arms: Combining Oculus Rift with Myo

Two arms are projected into the Oculus Rift goggles and controlled by the Myo wristband controller (which senses muscle activity). Developed by Thalmic Labs, the makers of Myo.

Android Wear is coming…

While we're waiting for Google Glass, the smartwatch revolution might come first.

Let’s Get Physical

A current direction in human-computer interaction is concerned with bringing physical elements like real buttons, cubes, bricks, and rulers back into the world of UIs. This approach was coined “Tangible User Interfaces” (TUIs) by Ishii of MIT (Ishii and Ullmer 1997). The idea is to exploit the physical properties of the objects, because they suggest certain operations (round objects invite you to rotate them, buttons invite pressing) and because they provide a sensory experience (haptic feedback) that is sometimes painfully absent in current touch interfaces.

Here’s an example that is based on the original ideas of Ullmer and Ishii (metaDESK):

Augmented Urban Model from Katja Knecht on Vimeo.

So in TUIs you control digital information (e.g. a desktop) by manipulating physical objects. But what about the other way round? That is, if the digital information changes, how can the physical objects change with it? This opens up a whole new set of possibilities, but also of technical challenges (Ishii et al. 2012).
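To make the two coupling directions concrete, here is a toy sketch in Python; all names are illustrative and not taken from any real TUI framework:

```python
class DigitalModel:
    """Digital state with change notification."""
    def __init__(self):
        self.position = (0, 0)
        self._listeners = []

    def subscribe(self, callback):
        self._listeners.append(callback)

    def set_position(self, x, y):
        self.position = (x, y)
        for cb in self._listeners:
            cb(x, y)

class MotorStub:
    """Stand-in for actuation hardware (pins, magnets, motors)."""
    def drive_to(self, x, y):
        print("actuating object to (%s, %s)" % (x, y))

model = DigitalModel()

# Direction 2 (the hard one): a digital change physically moves the object.
model.subscribe(MotorStub().drive_to)

# Direction 1 (classic TUI): tracked physical movement updates the model.
def on_object_tracked(x, y):   # would be called by a camera/tracker
    model.set_position(x, y)

on_object_tracked(3, 4)  # simulated tracker event
```

The hard part in practice is of course not the software coupling but building actuators that can actually move and reshape objects, which is exactly what inFORM tackles.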

The latest incarnation of this idea is inFORM, again from Ishii’s Tangible Media research group at MIT (Follmer et al. 2013):

In the arts, the idea of bringing the digital back to the physical manifests itself in “kinetic sculptures”, i.e. sculptures that change over time. Here is one impressive example at Singapore’s Changi Airport:

“Kinetic Rain” Changi Airport Singapore from ART+COM on Vimeo.

Literature

Hiroshi Ishii and Brygg Ullmer (1997) Tangible bits: towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI '97). ACM, New York, NY, USA, 234-241.

Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge, and Hiroshi Ishii (2013) inFORM: dynamic physical affordances and constraints through shape and object actuation. In Proceedings of the 26th annual ACM symposium on User interface software and technology (UIST ’13). ACM, New York, NY, USA, 417-426.

Hiroshi Ishii, Dávid Lakatos, Leonardo Bonanni, and Jean-Baptiste Labrune. (2012) Radical atoms: beyond tangible bits, toward transformable materials. interactions 19, 1 (January 2012), 38-51.

Noise Cancellation Visualization and Touch Interface

A new noise cancellation device with an interesting visualization and touch interface.

See article on sciencedump


Copyright © 2017 Michael Kipp's Blog
