Mobile Augmented Reality

The group focuses on tools and techniques for mobile interfaces presenting computer-generated information in real time and registered with a 3D environment.

Research Priorities

Augmented Reality (AR) can provide a new class of user interface experiences, in particular for mobile computing. AR is still a young research field and hence strongly driven by basic research and experimental methods. The true appeal of AR lies precisely in its deployment as a mobile user interface, available anytime and anywhere.

In the field of localization and tracking, AR requires tracking of the camera pose and the poses of objects in the environment with high accuracy, six degrees of freedom and in real time. Tracking must also work on a global scale and preferably with no or only minimal infrastructure. The primary approaches to achieving this are computer vision techniques. We also consider fusion with other sensors, such as inertial measurement units, or GPS where available. While many techniques for tracking in AR have been investigated in recent years, no single method is generally applicable, and many methods that deliver good performance do so only under constraining assumptions. More critically, efficiency gains of orders of magnitude are needed before the most capable techniques become applicable on mobile devices.
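The sensor fusion mentioned above can be illustrated with a minimal sketch: a one-dimensional complementary filter that combines a fast but drifting gyroscope integration with a slower, drift-free vision-based orientation estimate. This is a textbook illustration under simplified assumptions (single rotation axis, hypothetical sensor values), not the group's actual tracking pipeline.

```python
# Minimal complementary filter sketch: fuse a drifting gyroscope
# integration with an absolute vision-based orientation (1-DoF only).
# All sensor values below are hypothetical.

def complementary_filter(angle, gyro_rate, vision_angle, dt, alpha=0.98):
    """One fusion step.

    angle        -- current fused orientation estimate (radians)
    gyro_rate    -- angular velocity reported by the IMU (rad/s)
    vision_angle -- absolute orientation from vision tracking (radians)
    alpha        -- trust placed in the gyro prediction (0..1)
    """
    predicted = angle + gyro_rate * dt  # fast, but accumulates drift
    # Blend in the absolute vision estimate to correct the drift.
    return alpha * predicted + (1 - alpha) * vision_angle

# Simulate 2 s at 100 Hz: gyro reports no motion, vision says 1.0 rad.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 vision_angle=1.0, dt=0.01)
print(round(angle, 3))
```

The estimate converges toward the vision measurement at a rate controlled by `alpha`; a higher `alpha` trusts the responsive gyro more, a lower one corrects drift faster.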

In the field of rendering and visualization, AR requires photorealistic compositing of computer-generated imagery with real video. In addition, we require situated visualization: the AR display should highlight context-sensitive important information, reveal occluded, imperceptible or invisible information, provide evidence of the 3D user interface widgets embedded in the real environment, and give feedback on the consequences of user interactions with such widgets. All of this must take the perceptual capabilities and limitations of the human user into account.
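One ingredient of situated visualization is deciding which annotations to show at all, so that the display highlights what is relevant without cluttering the view. The sketch below ranks candidate labels by an assumed score (importance weighted by inverse distance to the viewer) and keeps only the top few; the scoring function and data are hypothetical, not the group's published method.

```python
# Hypothetical label-filtering sketch for situated visualization:
# keep only the few annotations most relevant to the current viewpoint.

def select_labels(labels, viewer_pos, max_labels=3):
    """Rank labels by importance weighted by inverse viewer distance."""
    def score(label):
        dx = label["pos"][0] - viewer_pos[0]
        dy = label["pos"][1] - viewer_pos[1]
        dist = (dx * dx + dy * dy) ** 0.5
        return label["importance"] / (1.0 + dist)
    return sorted(labels, key=score, reverse=True)[:max_labels]

# Example scene annotations (positions in metres, made-up importances).
labels = [
    {"name": "exit",    "pos": (1, 0), "importance": 5.0},
    {"name": "cafe",    "pos": (9, 9), "importance": 2.0},
    {"name": "printer", "pos": (2, 1), "importance": 1.0},
    {"name": "office",  "pos": (0, 1), "importance": 3.0},
]
visible = select_labels(labels, viewer_pos=(0, 0), max_labels=2)
print([label["name"] for label in visible])  # → ['exit', 'office']
```

A real system would extend the score with occlusion, screen-space overlap and task context, but the structure stays the same: score, rank, threshold.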

In the field of interaction, AR is a three-dimensional medium. Given appropriate knowledge of the physical environment, we can build a spatial interface for interacting naturally with the real world. We are only now starting to explore the true power of such interaction, enabled by recent technical developments such as real-time tracking that make its deployment practical. Naturally, all claims about the usability of interaction methods must be evaluated with real users.
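A basic building block of such spatial interfaces is casting a pointing ray into the tracked environment, for example to place a virtual object on a real surface. The sketch below intersects a ray with a known plane; the coordinates are illustrative, and a real system would test against a full reconstructed scene model rather than a single plane.

```python
# Sketch of spatial interaction: intersect a pointing ray with a
# planar surface (e.g. a tracked floor or tabletop) to find where
# a virtual object should be placed. Coordinates are hypothetical.

def ray_plane_hit(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the 3D hit point of the ray on the plane, or None if the
    ray is parallel to the plane or points away from it."""
    denom = sum(d * n for d, n in zip(ray_dir, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray runs parallel to the plane
    diff = [p - o for p, o in zip(plane_point, ray_origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # intersection lies behind the viewer
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Point straight down from 1.5 m above the floor (the z = 0 plane).
hit = ray_plane_hit((0.0, 0.0, 1.5), (0.0, 0.0, -1.0),
                    (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(hit)  # → (0.0, 0.0, 0.0)
```

Selection of existing widgets works the same way, replacing the plane test with intersection tests against the widgets' bounding volumes.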


  • Natural feature tracking
  • Simultaneous localization and mapping
  • Outdoor localization
  • Situated visualization
  • Authoring technology for augmented reality
  • Human factors and interaction for augmented reality



Key Publications

Christoph Klug, Dieter Schmalstieg, Clemens Arth: Measuring Human-made Corner Structures With a Robotic Total Station using Support Points, Lines and Planes. Proc. International Conference on Computer Vision Theory and Applications (VISAPP), Porto, Portugal, February 2017.

Markus Tatzgern, Valeria Orso, Denis Kalkofen, Giulio Jacucci, Luciano Gamberini, Dieter Schmalstieg: Adaptive Information Density For Augmented Reality Displays. IEEE Virtual Reality (VR), 2016.

Peter Mohr, Bernhard Kerbl, Michael Donoser, Dieter Schmalstieg, Denis Kalkofen: Retargeting Technical Documentation to Augmented Reality. ACM Conference on Human Factors in Computing Systems (CHI), pages 3337-3346, 2015.

Thanh Nguyen, Gerhard Reitmayr, Dieter Schmalstieg: Structural Modeling from Depth Images. IEEE Transactions on Visualization and Computer Graphics, 2015.

Thanh Nguyen, Raphael Grasset, Dieter Schmalstieg, Gerhard Reitmayr: Interactive Syntactic Modeling With a Single-Point Laser Range Finder and Camera. Proc. IEEE ISMAR, 2014.

Thanh Nguyen, Gerhard Reitmayr: Calibrating Setups with a Single-Point Laser Range Finder and a Camera. Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2013.


Graz University of Technology
AVL List GmbH
