
Live View is an extremely useful new Google Maps feature that guides you with augmented reality directions overlaid on the real world, and Google has just started rolling it out to more users. It's the best new addition to the app in years, but be warned: it'll probably creep out passers-by.

I tried the feature out on a Google Pixel 3 this morning, and though it wasn't quite as slick as I'd hoped (that's understandable, as Google says it's still in development), its potential is obvious.

In order to use Live View you need to hold your phone vertically, so that Maps can analyse and (hopefully) recognise your immediate surroundings. Live View requires access to your camera, and as such it only works in good light conditions.

The main issue was that Maps took around half a minute to figure out exactly where I was through my phone's camera. I had to spin around a few times and wander to and fro to give it a helping hand. Once it did determine my exact location, though, the AR directions worked perfectly.

While I was using Live View, I could see the people I was walking towards visibly recoil when they realised I was using my phone's camera. As far as they would have known, I was filming them.
