2024

Grasping 3D Space

Alexander Wirtz, Marcella Klauser
When looking up online tutorials for 3D software, chances are the first installment of the series will cover camera control and navigation. Given that humans navigate three-dimensional space constantly in the real world, it may seem surprising that this needs to be specially addressed. The problem is that in the real world we have a great number of joints and limbs at our disposal, whereas on a computer, input is generally limited to the two-dimensional plane of a mouse pad. These two degrees of freedom of course do not cover the six required for complete control over 3D movement: translation along the three axes plus rotation around each of them. To work around this, most programs offer control over two axes at a time, for example panning along the view plane or, at the press of a button, rotating around a virtual pivot point. Which button to press, however, is not standardized and varies from program to program, hence the need for the aforementioned video tutorials on camera control.

The idea behind this project was to create an alternative input method that allows simultaneous control over all six degrees of freedom. This is achieved by tracking a tangible object fitted with ArUco markers: its position and rotation are extracted in Python and applied to a Blender viewport to control the camera. This makes the interaction intuitive, as the virtual object on screen mirrors the movements of the tangible, which the user can handle freely. This very direct mapping is intended to help inexperienced users in particular.
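To illustrate the tracking side of such a pipeline, a minimal Python sketch using OpenCV's aruco module might look as follows. This is not the authors' code: it assumes OpenCV 4.7 or newer, placeholder camera intrinsics instead of a real calibration, and a single 4x4 marker with an assumed 5 cm side length, and it merely prints the recovered pose where the actual project would forward it to Blender.

    import cv2
    import numpy as np

    MARKER_SIZE = 0.05  # marker side length in metres (assumed value)

    # Placeholder intrinsics; a real setup would load values from a camera calibration.
    camera_matrix = np.array([[800.0, 0.0, 320.0],
                              [0.0, 800.0, 240.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)

    # 3D corners of the marker in its own frame, in the corner order
    # (top-left, top-right, bottom-right, bottom-left) that the detector returns.
    half = MARKER_SIZE / 2.0
    object_points = np.array([[-half,  half, 0.0],
                              [ half,  half, 0.0],
                              [ half, -half, 0.0],
                              [-half, -half, 0.0]], dtype=np.float32)

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = detector.detectMarkers(gray)
        if ids is not None:
            # Recover the full six-degree-of-freedom pose of the first marker.
            found, rvec, tvec = cv2.solvePnP(object_points,
                                             corners[0].reshape(4, 2),
                                             camera_matrix, dist_coeffs)
            if found:
                # In the actual project this pose would be forwarded to Blender;
                # here it is only printed.
                print("rotation:", rvec.ravel(), "translation:", tvec.ravel())
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()

On the Blender side, the recovered pose could then be applied to the viewport, for instance by setting view_rotation and view_location on the RegionView3D of the active 3D view; how exactly the project bridges the tracking script and Blender is not detailed here.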
Interaction: tangible
Technology: camera, fiducial markers