Google’s recent I/O presentation was something of a showstopper. The company pulled out the big guns for the event, revealing its plans to launch a Pixel Watch and Pixel Buds Pro, among other things. However, perhaps one of the most exciting things detailed during the presentation was how Google plans to innovate the way people use their phone’s camera.
During the presentation, Google relayed how it plans to turn cell phone cameras into search engines. It’s actually quite a novel idea. Google is calling its innovation scene explorer. In theory, with scene explorer, a person will be able to use their phone to scan a shelf full of products and have the phone’s camera find the exact product they are looking for.
Project lead Prabhakar Raghavan explained that scene explorer is an extension of the Google Lens app. The Lens app, like scene explorer, is a visual search tool. With the existing Lens app, a Google user can upload a photo, text, or QR code and elicit a search query from it. Where scene explorer differs from this existing technology, however, is that it will be able to perform visual searches in real time. “Scene explorer is a powerful ability in our devices’ ability to understand the world the way we do, to see relevant information overlaid in the context of the world all around us,” Raghavan said.
What’s more, Google’s scene explorer will not only be able to perform basic searches in real time, but it will also be able to execute exceedingly targeted searches that could save someone a lot of time. For instance, if a person wants to find snacks made in nut-free facilities due to an allergy, scene explorer will be able to identify the exact products that fit that parameter in a fraction of the time it would take a person to carefully read every product label. “This is like having a supercharged Control-F [find shortcut] for the world all around you,” Raghavan explained.
Raghavan went on to detail more of scene explorer’s intended functionality. He highlighted that a person will be able to combine visuals and text in real time to find products and even cuisines. For instance, he gave the example of a plate of food. If someone sees a plate of food, say in a magazine or on TV, they can have scene explorer use that visual and combine it with the words “near me.” The thought is that scene explorer will then be able to not only identify the cuisine in question but also find nearby restaurants that serve that very dish.
Overall, Google’s new search innovation previews an exciting time for the company. In the future, perhaps scene explorer could even be integrated into Google’s recently announced upcoming AR glasses so that individuals can leverage its search functionality without ever having to lift a finger. Ultimately, scene explorer’s full potential is immense, and it will certainly be exciting to see the direction that this revolutionary idea takes in the years to come. The world is teetering on a new technological precipice, and Google has decided to grab its portion of the reins and run with it.