Last update: 11 Nov 2021

6.1 Introduction

Tangible user interfaces (TUIs) are interfaces where users place and move physical objects to access and change digital information. Hiroshi Ishii and Brygg Ullmer at MIT introduced this idea in the seminal paper Tangible Bits (Ishii and Ullmer 1997).

While computer screens can display anything, this flexibility can also be a downside: when unexpected things happen, users feel lost in a sea of artificial looks and behaviors. The basic idea of tangible interfaces is that one can exploit some important characteristics of physical objects to make human-computer interaction more intuitive, effective, and enjoyable:

  1. Affordances: A physical object can indicate by its shape how it should be used (e.g., a disc invites turning)
  2. Physical constraints: The physical construction can restrict possible actions, e.g. constraining motion to a single axis as in a physical slider
  3. Aesthetics: Physical instruments have a tradition of being specifically designed to please the eye or to fit into a certain design framework

One of the earliest tangible user interfaces was the Marble Answering Machine, developed by Durrell Bishop, a student at the Royal College of Art, in 1992.

6.1.1 Navigating with Physical Icons (1997)

One of the best demonstrations is still the metaDESK table (from 1997), where users can navigate a map of the MIT campus by placing small physical models of buildings on the table and moving them around.


Ishii's Tangible Media Group at MIT is still the best place to watch for innovations in this area.

6.1.2 reacTable for Music Generation (2003)

The reacTable is a round table with a tangible user interface to control music generation. Developed in 2003, it was presented at various conferences and festivals, e.g. at SIGGRAPH (Jordà et al. 2006), and is probably one of the most influential tangible interface projects outside Ishii's MIT group.


Members of the reacTable group also developed a computer-vision framework called reacTIVision and a protocol/API for tangible multitouch surfaces called TUIO.

6.2 Active Tangibles

The original concept of tangible user interfaces used "passive" tangibles. With such tangibles, a user can manipulate a digital "world", but what if the digital world changes? Can the corresponding tangible move? With passive tangibles this is not the case: the flow of change is unidirectional, from the physical to the digital side.

Therefore, a number of projects were concerned with making tangibles move. Another step is making tangibles transform (change shape).

6.2.1 PICO (2006/07)

The PICO project (Patten and Ishii 2007) used little round pucks as active tangibles that can be moved electromagnetically. Thus, changes on the digital side can be reflected by movement of the tangibles.

6.2.2 ZeroN (2011)

In ZeroN the movement of a tangible is taken to 3D by making a sphere levitate in mid-air using computer-controlled magnetic levitation (Lee et al. 2011).

6.2.3 inFORM (2013)

In the inFORM project, Follmer et al. (2013) presented a transformable input/output device based on a surface of movable pins. As a result, the device can not only display 3D shapes but can also be manipulated in various ways.

6.3 Future of Tangible Interfaces: Transformable Materials

Ishii et al. (2012) formulated a vision called Radical Atoms in which they see a future of shape-changing, transformable materials that would bring the flexibility of digital operations to the real world. Imagine having typical image-editing operations like snap-to-grid or re-coloring at your disposal when working on a piece of clay.

The authors visualized their ideas in the following video.

6.4 How to Implement TUIs

Implementing a tangible user interface means that physical objects must be recognized and tracked. This can be done with a camera and computer vision (supported by markers or colors). Alternatively, the objects themselves can contain sensors (acceleration, inertial, pressure, or light sensors) and transmit this information wirelessly or by cable.
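Whichever technology is used, what arrives at the application is typically the same small bundle of data per object: an identifier plus a pose. As a minimal illustration (the class and field names here are made up for this sketch, not taken from any particular library), this state could be modeled as follows:

    // Minimal illustration of the data a TUI tracker typically delivers
    // per recognized physical object (all names are hypothetical).
    public class TrackedObject {
      int   id;        // which marker/object was recognized
      float x, y;      // position on the surface, often normalized to 0..1
      float angle;     // rotation in radians
      long  timestamp; // time of the last pose update in milliseconds
    }

This is essentially the information that the TUIO protocol described in the next subsection transmits for each tracked object.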

6.4.1 Fiducial Marker Tracking with reacTIVision

Martin Kaltenbrunner developed a system called reacTIVision which provides exactly such recognition and tracking. See https://github.com/mkalten/reacTIVision or have a look at the paper by Kaltenbrunner and Bencina (2007).

Components

You need the following components to make this work:

  1. printed markers (on paper)
  2. software that tracks the markers with a camera
  3. software that receives the positions/orientations of the markers (your application)

You can find everything here: http://reactivision.sourceforge.net

Markers and Tracking App

Download the reacTIVision software, consisting of

  1. the markers as a PDF and
  2. the tracking application (an .exe or .app file)

Then try it out: Print out one page of markers, start the tracking application, and hold the markers in front of your notebook's camera. You should see green numbers (IDs) on top of the markers.

Client

The reacTIVision application sends the position/orientation of the tracked markers to any TUIO client that is listening on port 3333. You can simply download and install a client, e.g. for Processing, and start it. When your reacTIVision app is running and tracking some markers, you should see them appear in your TUIO client app. You can also find the client under the SourceForge link above.
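If you want to react to the marker data in your own code, you can also write the client yourself. Below is a minimal Processing sketch, assuming the TUIO client library for Processing is installed; it follows the callback style of the TuioDemo example that ships with the library (check the bundled examples for the authoritative API):

    import TUIO.*;

    TuioProcessing tuioClient;

    void setup() {
      size(640, 480);
      // Connects to the default TUIO port 3333 and registers the callbacks below.
      tuioClient = new TuioProcessing(this);
    }

    void draw() {
      background(255);
      // Draw each currently tracked fiducial marker as a rotated square with its ID.
      for (TuioObject tobj : tuioClient.getTuioObjectList()) {
        pushMatrix();
        translate(tobj.getScreenX(width), tobj.getScreenY(height));
        rotate(tobj.getAngle());
        fill(0, 160, 0);
        rect(-20, -20, 40, 40);
        fill(255);
        text(tobj.getSymbolID(), -5, 5);
        popMatrix();
      }
    }

    // Called by the library when a marker appears, moves, or disappears.
    void addTuioObject(TuioObject tobj) {
      println("added object " + tobj.getSymbolID());
    }

    void updateTuioObject(TuioObject tobj) {
      // Pose changes are picked up in draw() via getTuioObjectList().
    }

    void removeTuioObject(TuioObject tobj) {
      println("removed object " + tobj.getSymbolID());
    }

Finger touches arrive via analogous cursor callbacks (addTuioCursor etc.), which are omitted here for brevity.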

Note for the Processing client: You install a Processing library by moving the directory from the downloaded ZIP file (here called TUIO) into the libraries folder of your Processing sketchbook (a folder on your hard drive). If you are not sure where this is located, open your Processing preferences, where you can look up the path.

6.4.2 Touch Surface and Pins

The following project demonstrated how objects can be instrumented with pins that a touch surface can sense, making their position (and rotation) recognizable via the touch surface.

6.4.3 Printed Paper and Markers

This project is quite interesting because it uses plain paper, printed with patterns, to create tangible interactions.

6.5 References

Sean Follmer, Daniel Leithinger, Alex Olwal, Akimitsu Hogge, and Hiroshi Ishii (2013) InFORM: dynamic physical affordances and constraints through shape and object actuation. In: Proceedings of the 26th annual ACM symposium on User interface software and technology (UIST '13). ACM, New York, NY, USA, 417–426.

Hiroshi Ishii, Dávid Lakatos, Leonardo Bonanni, and Jean-Baptiste Labrune (2012) Radical atoms: beyond tangible bits, toward transformable materials. In: interactions 19, 1 (January + February 2012), 38–51.

Hiroshi Ishii and Brygg Ullmer (1997) Tangible bits: towards seamless interfaces between people, bits and atoms. In: Proceedings of the ACM SIGCHI Conference on Human factors in computing systems (CHI '97). ACM, New York, NY, USA, 234–241.

Sergi Jordà, Martin Kaltenbrunner, Günter Geiger, and Marcos Alonso (2006) The reacTable: a tangible tabletop musical instrument and collaborative workbench. In: ACM SIGGRAPH 2006 Sketches (SIGGRAPH '06). ACM, New York, NY, USA.

Martin Kaltenbrunner and Ross Bencina (2007) reacTIVision: a computer-vision framework for table-based tangible interaction. In: Proceedings of the 1st international conference on Tangible and embedded interaction (TEI '07). ACM, New York, NY, USA, 69–74.

Jinha Lee, Rehmi Post, and Hiroshi Ishii (2011) ZeroN: mid-air tangible interaction enabled by computer controlled magnetic levitation. In: Proceedings of the 24th annual ACM symposium on User interface software and technology (UIST '11). ACM, New York, NY, USA, 327–336.

James Patten and Hiroshi Ishii (2007) Mechanical constraints as computational constraints in tabletop tangible interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '07). ACM, New York, NY, USA, 809–818.