Android’s personal search experience, Google Now, is about to get a whole lot smarter with the unveiling of Now on Tap.
Google’s Sundar Pichai dropped the term ‘machine learning’ on more than one occasion during today’s opening keynote at Google I/O in San Francisco, and when we were shown Now on Tap, we glimpsed the tip of the iceberg with regards to what’s possible when the data at Google’s fingertips is put to good use.
Now on Tap offers a more proactive Google Now experience, supplying answers to questions based on the user’s context, understanding more complicated queries, and more.
Aparna Chennapragada, who heads up Google Now, broke down the key components that govern both Google Now’s, and in turn Now on Tap’s, success. To help the user effectively, Now needs to be able to understand the context of the user and the queries being put to it. Providing answers in a proactive manner is also a fundamental part of the experience, such as daily weather or your favourite team’s current sports score.
Next is context – Now apparently has access to detailed information on over 100 million places, and we’re not just talking location information here; think opening times, when they get busy, what sort of things the user might be looking for when they’re there, and so on. The data Google holds that answers these sorts of questions, even when you’re not asking them, is contained within what the company calls its knowledge graph.
The third and most important component of this new Now experience concerns the answers it offers up. The pool of ‘entities’, as Chennapragada called them, is in the region of one billion, and these can be anything from places to dinner recipes.
The demo we saw on stage at I/O featured an Android device running Spotify, playing a track by the artist Skrillex. The user was able to ask his device, “OK, Google. What’s his real name?”, rather than stating, “OK, Google. What’s Skrillex’s real name?”
Now on Tap is intelligent enough to understand context: it picked up that the user was referring to the artist he was listening to at the time. It can recommend petrol stations when it knows how far you have to drive, and it even offers up relevant information based on messages, like the location of your local dry cleaners when your other half forgot to pick up the laundry on the way home.
To feed Now on Tap the context-sensitive data it needs to remain vigilant to a user’s requirements, Google has launched a pilot service with over 100 partners, and that number will undoubtedly increase once Android M takes its final form come this autumn. It’s an impressive side note that Now on Tap doesn’t require additional work from app developers to function; it can simply understand what the user is interacting with.
We’re intrigued to see what else this new extension of Google Now is capable of, even if its ability to scan the apps you’re using for information is sort of creepy when you think about it.