LiDAR: paving the way to AR glasses

Apple announced a new iPad Pro this week and, looking beyond my own giddiness for the new keyboard with built-in trackpad, it has a new hardware component focused on augmented reality: a LiDAR scanner.

For all intents and purposes, LiDAR is a stylised portmanteau of light and radar, and refers to a 3D laser scanning technology: it fires pulses of light and times their reflections to measure distance. In short, it can scan and build a map of your immediate environment, a clear advantage for AR applications.

AR LiDAR demo from Apple iPad Pro announcement
Source: Apple
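
To make that more concrete, here's a minimal sketch (my own illustration, not Apple's sample code) of how an iPad app could tap the scanner through the scene-reconstruction API that shipped in ARKit 3.5 alongside this iPad Pro:

```swift
import UIKit
import ARKit

// A sketch of driving the LiDAR scanner via ARKit 3.5's scene reconstruction.
// Requires a LiDAR-equipped device; the view controller itself is illustrative.
class ScanViewController: UIViewController, ARSessionDelegate {
    let arView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
        arView.session.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Scene reconstruction is only supported on LiDAR hardware.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        arView.session.run(configuration)
    }

    // ARKit streams the scanned environment in as mesh anchors,
    // refining them continuously as the device moves around.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            print("New mesh patch with \(meshAnchor.geometry.vertices.count) vertices")
        }
    }
}
```

In other words, the app never has to interpret camera pixels itself; the scanner hands it a ready-made mesh of the room.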

iPad has often been the testing ground for new technologies: it received LTE connectivity before iPhone, and it adopted Apple's own silicon, the A-series chips, before iPhone, too. I have a feeling the inclusion of LiDAR on iPad is no different: this is a technology ultimately destined for a different product down the line, like AR glasses.

I hinted at this in a previous article:

The advantage of glasses over a phone is that your eyes can be the viewfinder, so a possible solution is using a depth-sensing camera only—recognition of the environment without being intrusive.

LiDAR is the depth-sensing tech I was after: it allows recognition of the environment without needing a classic camera and, as Jay Peters of The Verge notes, it works in real time.
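
That real-time quality shows up at the API level, too. As a hedged illustration, assuming a session like the sketch above, a small helper (the name placeAnchor is mine) can ray-cast against the live mesh and anchor content with no plane-detection delay:

```swift
import ARKit

// A hypothetical helper showing the real-time side: ray-cast from a screen
// point against the LiDAR-scanned surfaces and drop an anchor where it hits.
// Assumes `arView` is running a scene-reconstruction session like the one above.
func placeAnchor(at screenPoint: CGPoint, in arView: ARSCNView) {
    // Build a raycast query from the touch point toward any estimated surface.
    guard let query = arView.raycastQuery(from: screenPoint,
                                          allowing: .estimatedPlane,
                                          alignment: .any) else { return }

    // On LiDAR devices the mesh is available immediately, so the first result
    // usually lands on real geometry without waiting for plane detection.
    if let result = arView.session.raycast(query).first {
        let anchor = ARAnchor(name: "placed-object", transform: result.worldTransform)
        arView.session.add(anchor: anchor)
    }
}
```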

With LiDAR potentially being the perfect technology for AR glasses, one of the remaining puzzles is how to display the data it captures. In other words, how will the screen of a pair of AR glasses work: a prism projector, like Google Glass; a low-powered laser beamed straight onto the retina, like the Intel Vaunt; or something else?