20 Hard Things About AR: Motion-to-Photon Latency

December 13, 2017

From a user's perspective, augmented reality may seem easy. Under the hood, however, a number of underlying technologies are necessary to create an integrated wearable AR experience. In this mini-series, we’re pulling back the curtain with DAQRI’s experts to provide you with a better understanding of what it takes to deliver Professional Grade AR™. We recently discussed the challenges of fluid tracking and of making the virtual content you see as real as possible. In this episode, we focus on the importance of having low motion-to-photon latency.


Motion-to-photon latency is one of the most important measurements in determining the quality of a computer vision system.

It measures the time it takes for light to enter a camera, get processed by a computer vision algorithm into a pose, have digital content rendered against that pose, and be displayed as augmented reality to the eye. That entire span, from light entering the camera to photons from the display reaching your eye, is the motion-to-photon latency, and,

“Reducing this latency is one of the most important, yet difficult problems in delivering Professional Grade AR.”
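The pipeline described above can be sketched as a simple sum of per-stage durations. This is a minimal illustration, not DAQRI's implementation; the stage names and millisecond values are hypothetical, chosen only to show how the end-to-end latency accumulates.

```python
from dataclasses import dataclass

@dataclass
class FrameTimings:
    """Per-frame stage durations in milliseconds (hypothetical values)."""
    capture_ms: float   # light hits the sensor -> image available to software
    tracking_ms: float  # computer vision algorithm processes the image into a pose
    render_ms: float    # digital content is rendered for that pose
    display_ms: float   # rendered frame is scanned out to the display

    def motion_to_photon_ms(self) -> float:
        # Motion-to-photon latency is the sum of every stage the frame
        # passes through between the camera and the eye.
        return (self.capture_ms + self.tracking_ms
                + self.render_ms + self.display_ms)

frame = FrameTimings(capture_ms=8.0, tracking_ms=5.0,
                     render_ms=4.0, display_ms=3.0)
print(frame.motion_to_photon_ms())  # 20.0
```

Shaving time off any one stage reduces the total, which is why the whole pipeline, from sensor to display, has to be engineered together.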

If the latency is too high, the content won’t line up well...things will look “off,” and your body may perceive motion that doesn’t match what you’re seeing. This mismatch plagued the early days of Virtual Reality, when many early adopters reported motion sickness and nausea.

Because we designed DAQRI Smart Glasses from the ground up, we’ve achieved industry-leading motion-to-photon latency. As a result, the negative feeling of motion sickness now seems like yesterday’s news, replaced by the very positive feeling of digital content appearing in the real world, in the right place. No matter where you look, your digital content just stays there, right where you need it...allowing you to wear the glasses for your whole shift so you can achieve more and do your work better.