Google Unveil AR Visual Navigation
Google integrate Maps with the smartphone camera to create a Visual Positioning System that uses AR.
The Google I/O keynote showcased a number of new and improved technologies, including new features for the Google Assistant. One of the reveals during the first-day keynote was a new augmented reality (AR) navigation tool for Google Maps.
During the first-day keynote, Aparna Chennapragada spoke about new features for Google Maps, discussing how users' requirements for the service have changed and how much more is now expected of it.
To meet these changing needs, the Google Maps team have been working to integrate Google Maps with the smartphone camera. To illustrate how this would work, Chennapragada gave a real-life example: imagine exiting a train or subway station on your way to an appointment. Google Maps says to head south on High Street, but how do you know which way south is? And in an unfamiliar location, how do you know which street is High Street? This is where the camera and AR integration comes in.
Instead of a top-down map, users will be able to see the street in front of them through the camera, with an AR overlay arrow showing the direction to head and the distance to go. A map view sits just below, so users can double-check that the two match up. The Google Maps team have even been experimenting with an animated guide character to follow, such as the animated fox shown briefly in the demo.
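As an illustration of the basic geometry behind such an overlay (this is not Google's implementation, just a minimal sketch), the distance and compass direction from the user to the next waypoint can be derived from two latitude/longitude pairs:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial compass bearing (degrees)
    from point 1 to point 2, using the haversine formula."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    # Haversine formula for distance along the Earth's surface
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * r * math.asin(math.sqrt(a))
    # Initial bearing, normalised to 0-360 degrees (0 = north, 90 = east)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing
```

An AR arrow would then be drawn rotated by the difference between this bearing and the device's own compass heading.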
In addition, the Maps and camera integration can show users which shops, landmarks, hotels and restaurants are nearby, by anchoring information from Maps to the correct buildings, making it easier for users to find the location they are searching for.
GPS alone lacks the precision needed to make this possible, so Google have been working on a new system, referred to as VPS, or Visual Positioning System, which can estimate a more precise position and orientation. VPS uses visual features of the surrounding environment to pin down a precise location.
Further news from Google I/O will continue to be reported here on VRFocus.