At Google I/O 2017, the tech giant revealed its impressive new Google Lens app. This advanced Augmented Reality tool can recognise and understand whatever you’re looking at and offer any essential information you require. Here’s all you need to know about what Google Lens is capable of.
One of the most intriguing announcements at Google I/O 2017 was the reveal of Google Lens. We’re not massive fans of AR apps, as we’re yet to find one that’s accurate or particularly helpful. However, Google Lens already looks to be the first AR experience that could change our minds.
Google Lens will be built into the Google Assistant AI, which can be used on the latest Android phones as well as iPhones. The idea is that you simply point your camera at your surroundings and Lens can identify what you’re looking at and use its smarts to work out what you need, be it real-time information or some other service.
Here’s what we know so far about Google Lens, and how it’s set to change the way we use our mobiles.
Impressive object recognition and real-time information
Existing Augmented Reality apps can kind of recognise some real-world objects when you point your phone’s camera at them. This usually offers up simple information on a manufactured product, such as the online price and a basic description.
However, Google Lens goes a clear step beyond this by recognising a much greater range of objects, including natural stuff. For instance, focus on a flower and ask Google Lens what it is and you’ll be told the species and given any information you require, pulled straight from the web. Handy if you have any allergies.
Presumably this feature will work for pretty much any object you can think of. For instance, on your hols you can point your smartphone camera at landmarks, zoo animals and other interesting things and immediately read up on whatever you’re looking at. Been served up a random dish and want to know what you’re about to eat? Hopefully Google Lens will be able to tell you exactly what’s on your plate.
As well as simple Wikipedia descriptions and the rest, Google Lens will also offer up reviews of any shops, restaurants and other services that you aim your camera at. You’ll be able to see the star ratings and reviews right there on your phone screen.
Automatic assistance in Google Assistant
As well as delivering vital info on your surroundings, Google Lens can also do smarter stuff. For instance, aim your phone’s snapper at a sticker featuring a WiFi network ID and password and Lens will identify it as such. You will then be offered the option of connecting to that network automatically. No fiddling around entering those details manually.
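Google hasn’t said exactly how Lens reads those stickers, but Wi-Fi credentials are commonly printed as QR codes using the de facto `WIFI:` payload (for example `WIFI:T:WPA;S:MyNetwork;P:secret;;`). As a rough illustration of the parsing step only, here’s a minimal sketch; the payload format is the common QR convention, not anything Google has published, and this simple version ignores escaped characters:

```python
def parse_wifi_payload(payload: str) -> dict:
    """Parse the de facto WIFI: QR payload into its fields.

    Expected shape: WIFI:T:<auth>;S:<ssid>;P:<password>;;
    """
    if not payload.startswith("WIFI:"):
        raise ValueError("not a WIFI: payload")
    fields = {}
    # Split on semicolons; escaping of ; and : is ignored in this sketch.
    for part in payload[len("WIFI:"):].split(";"):
        if ":" in part:
            key, _, value = part.partition(":")
            fields[key] = value
    # T = auth type, S = network name (SSID), P = password
    return {
        "auth": fields.get("T", ""),
        "ssid": fields.get("S", ""),
        "password": fields.get("P", ""),
    }
```

With the SSID and password extracted like this, the remaining step of actually joining the network is handed off to the phone’s operating system.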
Likewise, you can point your camera at a billboard for an upcoming concert as you’re passing a theatre. Google Lens will figure this out and offer up a number of options. You can save the concert in your Google Calendar, hunt for tickets online, play music from the featured band via an online streaming service and so on.
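The “save it to your Google Calendar” step ultimately comes down to building a calendar event, and the standard interchange format for that is iCalendar (.ics, defined in RFC 5545). A minimal sketch of what that entry might look like, with the band name, venue and times as made-up placeholders rather than anything Lens actually outputs:

```python
from datetime import datetime

def concert_to_ics(summary: str, location: str,
                   start: datetime, end: datetime) -> str:
    """Build a minimal iCalendar (RFC 5545) event for a concert."""
    fmt = "%Y%m%dT%H%M%S"
    # iCalendar lines are separated by CRLF per the spec.
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "BEGIN:VEVENT",
        f"SUMMARY:{summary}",
        f"LOCATION:{location}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])
```

Any calendar app, Google Calendar included, can import an entry like this, which is why a single recognised billboard can fan out into so many follow-up actions.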
Integration into Google Photos
If you want to get information on photos you’ve already taken, good news. Google Lens will be fully integrated into the Photos app as well, offering smart info on stuff already in your albums. This includes the smart automation too; for instance, you can automatically dial a phone number from a photo you took of a slip of paper.
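Google hasn’t described the pipeline behind that trick, but once text has been lifted out of a photo by optical character recognition, spotting a dialable number is a simple pattern match. A simplified sketch of that last step; the regular expression here is a rough assumption for illustration, not Google’s actual recogniser:

```python
import re

# Matches an optional leading + followed by digits with common
# separators (spaces, dots, dashes, brackets) between them.
PHONE_PATTERN = re.compile(r"\+?\d[\d\s().-]{8,16}\d")

def find_phone_numbers(ocr_text: str) -> list:
    """Pull plausible phone numbers out of OCR'd text,
    stripping separators so the result is ready to dial."""
    return [re.sub(r"[\s().-]", "", match)
            for match in PHONE_PATTERN.findall(ocr_text)]
```

For example, `find_phone_numbers("Call Dave on 020 7946 0958 about the sofa")` returns `["02079460958"]`, which the phone’s dialler can then use directly.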
When is Google Lens coming to my phone?
Google hasn’t announced a specific release date for Google Lens just yet, but you can expect it to arrive in a future update to the Google Assistant. This should appear later in 2017, hopefully in the next few months.