AR Google Maps – Lessons for our Future Smart Glasses

At the Google I/O conference last spring, we were promised AR Google Maps – augmented reality directions on the one app we all find indispensable. It’s been a long wait, and it’s not over yet. But the augmented reality enhancement to Maps is finally rolling out to Local Guides. The rest of us may even see it by the next I/O conference (but don’t hold your breath).

AR Google Maps Guiding Fox

And what of that augmented reality fox in the original version? It looks like Google has opted for the more tried-and-true directional arrows. But give it a little time, and we’ll all personalize our AR walking/driving guides with the same attention we give to our phone cases.

Some of us will opt for Disney characters. Others will fixate on celebrities. I have some days when nothing short of a virtual Dante would be the appropriate guide.

The project to reinvent Maps with augmented reality is a sign of what’s to come in our smart glasses. And Google is already facing some of the challenges we’ll have to address when AR Wearables arrive on the market. Should the data be live all the time? How do you balance the AR elements with the environment around you? How do you turn it on and off?

AR Google Maps

First, a quick look at the new AR feature in Maps, courtesy of Engadget.

Ironically, the very environment that undermines Maps – the densely built urban core – turns out to be essential to the AR functionality in Google Maps. Spend any amount of time using GPS-based maps around tall buildings and you’ll find the little blue location dot wandering off as if it had a life of its own.

But there’s a wealth of data in Google Street View that can serve as AR position markers. Hold up your phone, and machine learning will match what the camera sees against that imagery, fixing your location from the surrounding buildings with a precision GPS alone could never deliver.
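Here’s a rough Python sketch of the idea – match descriptors from the camera frame against a database of geotagged imagery and average the best hits. The facade descriptors, coordinates, and cosine-similarity matcher are all invented for illustration; Google hasn’t published its actual pipeline.

```python
import numpy as np

# Hypothetical database: each entry pairs a feature descriptor
# (say, from a building facade in Street View) with the lat/lng
# where it was captured. A real system would store millions of these.
LANDMARKS = [
    (np.array([0.12, 0.87, 0.33]), (37.7793, -122.4192)),  # facade A
    (np.array([0.91, 0.05, 0.44]), (37.7796, -122.4188)),  # facade B
    (np.array([0.50, 0.50, 0.50]), (37.7801, -122.4179)),  # facade C
]

def locate(frame_descriptors, k=2):
    """Estimate position by matching camera-frame descriptors
    against the landmark database and averaging the k best hits."""
    scored = []
    for desc in frame_descriptors:
        for landmark_desc, latlng in LANDMARKS:
            # Cosine similarity as a stand-in for a real feature matcher.
            sim = float(np.dot(desc, landmark_desc) /
                        (np.linalg.norm(desc) * np.linalg.norm(landmark_desc)))
            scored.append((sim, latlng))
    best = sorted(scored, key=lambda s: s[0], reverse=True)[:k]
    lat = sum(p[0] for _, p in best) / len(best)
    lng = sum(p[1] for _, p in best) / len(best)
    return lat, lng

# One descriptor pulled from the current camera frame (made up).
frame = [np.array([0.90, 0.06, 0.45])]
print(locate(frame))  # -> a lat/lng between facades B and C
```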

There’s no better move in any tech development than to take your obstacles and turn them into assets.

The Challenges for AR Glasses

Wisely (more likely, out of fear of lawsuits), AR Google Maps will warn you to lower your phone a few moments after you hold it up to determine your location. Not only would you look foolish, but you’d end up bumping into people or objects . . . or worse. Instead of texting and walking, we’ll have mapping and walking. No doubt it will lead to more YouTube compilations (always a news media favorite) of cell phone fails.

Lower your phone, you fool. The warning message in AR Google Maps.
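Here’s a minimal Python sketch of what that warning logic amounts to – hold-up detection, a grace period, then the nudge. The five-second threshold and all the names are assumptions for illustration, not anything Google has documented.

```python
import time

RAISE_LIMIT_SECONDS = 5.0  # assumed grace period, not Google's value

class RaiseToLocate:
    """Tracks how long the phone has been held up for localization."""

    def __init__(self):
        self.raised_at = None

    def on_phone_raised(self):
        self.raised_at = time.monotonic()

    def on_phone_lowered(self):
        self.raised_at = None

    def should_warn(self):
        # True once the phone has been up past the grace period.
        return (self.raised_at is not None and
                time.monotonic() - self.raised_at > RAISE_LIMIT_SECONDS)

session = RaiseToLocate()
session.on_phone_raised()
# ... a few seconds of localization later ...
if session.should_warn():
    print("Lower your phone and follow the arrows.")
```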

But what happens when AR Glasses arrive? Laws against handheld phone use while driving become pointless. Forty-seven states have laws that prohibit texting while driving. And while smart glasses will help you keep your eyes on the road, they hardly guarantee your attention. If the screen below filled your entire field of view, would you notice the cars and pedestrians, or focus on the text and the floating multicolored dots?

AR Google Maps screen

With a phone, device management is relatively simple: put the phone down, and you’re back to standard Maps. It’s much more problematic when AR is embedded in your eyewear. Should your smart glasses shut off after a few moments? Fine, except for the times when you need persistent augmented data – say, an entertainment or learning experience.

And here’s where design issues become incredibly complex. For mapping purposes, you may actually want a mix of persistent and nonpersistent data. Having routing and turn indications always available is useful. Having details on every coffee house when you only want the downtown Blue Bottle is pointless. Smart glasses will need to be smart, offering information on a need-to-know basis depending on the task – something like the filtering sketched below.
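A toy version of that filtering might look like the Python below. The Overlay type, the task names, and the persistence rules are all hypothetical – the point is simply that routing cues stay on while everything else has to earn its place on the screen.

```python
from dataclasses import dataclass

@dataclass
class Overlay:
    label: str
    category: str      # e.g., "route" or "poi"
    persistent: bool   # always on screen vs. shown on demand

def visible_overlays(overlays, task, destination=None):
    """Keep routing cues always visible; surface points of interest
    only when they match the user's current destination."""
    shown = []
    for o in overlays:
        if o.persistent and o.category == "route":
            shown.append(o)   # turn indications: always available
        elif task == "navigate" and o.label == destination:
            shown.append(o)   # only the place you actually asked for
    return shown

overlays = [
    Overlay("Turn left on Market St", "route", persistent=True),
    Overlay("Blue Bottle Coffee", "poi", persistent=False),
    Overlay("Generic Coffee #7", "poi", persistent=False),
]

# Shows the turn arrow and Blue Bottle; the generic coffee house stays hidden.
print(visible_overlays(overlays, task="navigate",
                       destination="Blue Bottle Coffee"))
```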

Since we can’t put smart glasses down, we’ll be using voice commands or tapping the frames (though a ring controller is another possibility).

The challenge of designing usable and fashion-friendly smart glasses is only the beginning. Vuzix has been at it for years. ODG failed spectacularly. Nreal is the new kid on the block.

But as AR Google Maps reveals, the more difficult challenge will be how we manage the firehose of information available to us when it is literally right in front of our eyes.