Last month at the I/O developer conference, Google revealed some of its upcoming search technologies, and one that stood out to us is the plan to let users search through their smartphone camera.
‘Google Lens’ will aim to use advances in both AI and computer vision to enable us to search based on the things we actually see. Not only will it be able to detect what we are looking at, it will be able to offer actions based on that knowledge.
Google CEO Sundar Pichai went on to give some interesting examples of how they envisage Google Lens working:
A basic example would be pointing your smartphone camera at an object, say a flower, and Google using its computer vision technology to identify and tell us exactly what it is.
To go one step further, Google Lens would then use AI to determine how it can use this information to help us further. To use the flower example again, it would then be able to tell us about nearby florists in the area where we might be able to purchase said flowers.
Google Lens would also have the ability to provide us with information about venues based on a picture. For example, take a picture of a restaurant and immediately get information such as opening hours, reviews, and anything else Google holds on that particular venue. This could be an important development for local businesses, and reinforces the need for a presence on Google – both business listings and organic search.
One of the most impressive features of Google Lens would be its ability to understand which action we might want to take based purely on the photo. We’ve all been there – taking a photo of the back of our router to get the Wi-Fi password. Well, imagine Google instantly assuming that you’ll want to connect to that Wi-Fi network based on the image, and then offering this option. Impressive stuff that really goes to show how advanced Google is becoming.
Lens will also integrate nicely with Google Home (powered by Google Assistant). Photos can be added into the ‘conversation’, for example snapping a picture of a gig poster and telling Assistant to add the date to your diary, or asking it to translate a foreign sign.
WILL IT AFFECT THE SEO WORLD?
It’s hard to say at this stage, but as mentioned it could be a great opportunity for local businesses to make sure their Google business listings are up to scratch. This would mean encouraging reviews and adding thorough business information, including images, so that Google can provide users with as much knowledge as possible.
We can also assume that with Google advancing in the world of image recognition, image optimisation will become even more important – think alt attributes, file names, image sitemaps and so on. E-commerce sites especially could benefit from Google Lens – if we see a product or item of clothing we like the look of, Lens should be able to tell us where the item is from and how much it costs, and essentially direct us to the right place to purchase.
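To make that image optimisation concrete, here is a rough sketch of what well-optimised product imagery might look like – a descriptive file name and alt attribute on the page, plus an entry in an image sitemap (all file names, URLs and product details below are purely illustrative):

```xml
<!-- On-page: descriptive file name and alt text give Google context -->
<img src="/images/red-floral-summer-dress.jpg"
     alt="Red floral print summer dress with short sleeves">

<!-- Image sitemap: the same image declared via Google's
     sitemap-image extension so it can be crawled and indexed -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/products/red-floral-summer-dress</loc>
    <image:image>
      <image:loc>https://www.example.com/images/red-floral-summer-dress.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

The idea is simply that the more descriptive signals an image carries, the easier it is for Google’s image recognition to match it to a real-world product.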
As usual, Google has been suitably vague about when the technology will become available to the public, but as more is revealed about this exciting new development we’ll be sure to keep you in the loop. What we do know is that Google is remaining well and truly ahead of the search curve, and we can’t wait to try it out!
See Sundar Pichai’s full presentation here: