Mobile Augmented Reality


Overview and System Architecture

Our work on mobile visual search is an important step toward mobile augmented reality, where objects in the viewfinder of a handheld device are continuously recognized and annotated without the user ever pushing a button. We demonstrated a first real-time system at ISMAR 2009. The handheld performs motion estimation at 30 fps and automatically selects a query frame when low motion suggests user interest. The query frame is sent to a recognition server, and the result is returned within about a second. Using motion compensation, an annotation is then superimposed at the correct location with the correct pose, and it tracks the recognized object. Uplink speeds above 1 Mbps are required for an interactive experience; the current demo uses WLAN. We expect to demonstrate the system over Stanford's 4G WiMAX development network as soon as handsets become available in June.
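The control flow of this capture-and-query loop can be sketched as follows. This is a minimal illustration under stated assumptions, not the demo implementation: the camera source, the send_query() round trip, and the numeric thresholds are hypothetical, and OpenCV's sparse optical flow stands in for whatever motion estimator runs on the handset.

import cv2
import numpy as np

MOTION_THRESH = 1.5  # mean feature displacement (pixels) treated as "low motion" (assumed value)
STILL_FRAMES = 10    # consecutive low-motion frames before a query fires (assumed value)

def mean_motion(prev_gray, gray):
    """Mean feature displacement between two frames via pyramidal LK optical flow."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return 0.0
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = status.ravel() == 1
    if not good.any():
        return 0.0
    return float(np.linalg.norm((nxt - pts)[good], axis=2).mean())

def run(camera, send_query):
    """camera yields BGR frames at ~30 fps; send_query(frame) is a placeholder
    for the ~1 s round trip to the recognition server."""
    prev_gray, still = None, 0
    for frame in camera:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            still = still + 1 if mean_motion(prev_gray, gray) < MOTION_THRESH else 0
            if still == STILL_FRAMES:  # low motion suggests user interest
                send_query(frame)      # result is re-registered later using the
                                       # motion accumulated since this frame
        prev_gray = gray

Keeping track of the motion accumulated after the query frame is captured is what allows the returned annotation to be placed at the correct location and pose despite the roughly one-second server round trip.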

Fast Features for Mobile Devices

In our current research, we aim to develop novel feature descriptors that require roughly 100× less computation yet are equally suitable for real-time tracking and recognition. This would enable mobile augmented reality at a much lower uplink bitrate and would also help protect privacy, since no video frames would be sent to the server. Our initial results, which will appear at CVPR this June and at SPIE Optical Engineering and Applications this August, are encouraging. A first real-time implementation running on both the Symbian and Android platforms already performs unified tracking and recognition at video rates against a small database stored on the handset.
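To illustrate why compact descriptors make on-device recognition cheap, here is a minimal matching sketch under an assumed setup: each keypoint is described by a generic 256-bit (32-byte) binary descriptor, a stand-in for illustration rather than the descriptor from our papers. Matching then reduces to XOR plus popcount, far cheaper than the floating-point distances used with classic descriptors.

import numpy as np

def hamming_matches(query, database, max_dist=40):
    """query: (Q, 32) uint8 descriptors; database: (D, 32) uint8 descriptors.
    Returns (query_idx, db_idx) pairs whose nearest neighbor is within max_dist bits."""
    # 256-entry lookup table: popcount of every possible byte value.
    popcount = np.unpackbits(np.arange(256, dtype=np.uint8)[:, None],
                             axis=1).sum(axis=1)
    # XOR every query descriptor against every database descriptor,
    # then count the differing bits byte by byte.
    xor = query[:, None, :] ^ database[None, :, :]   # (Q, D, 32)
    dist = popcount[xor].sum(axis=2)                 # (Q, D) Hamming distances
    best = dist.argmin(axis=1)
    keep = dist[np.arange(len(query)), best] <= max_dist
    return np.flatnonzero(keep), best[keep]

On a handset, the same XOR-and-popcount inner loop maps to a handful of integer instructions per descriptor pair, which is what makes matching against a small on-device database feasible at video rates.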

[Figure: Mobile augmented reality system]

Demonstration

[Demo: Mobile Augmented Reality]

[Demo: Fast Features for Mobile Devices]

Publications