At its I/O 2022 consumer keynote on Wednesday, Google announced improvements to visual search on its search engine. The company revealed that it will expand its multisearch feature to let users see local results. Google also said it is working on a new scene exploration feature that surfaces insights about multiple objects as users pan their camera across a scene. The feature will build on multisearch, and the company is yet to reveal when it will be available to users.
At Google I/O 2022, the company revealed that it would further expand multisearch, the Google Lens feature that lets users search with images and text at the same time. Users will be able to search for results "near me" to find options at a local retailer or restaurant, based on the photo they have clicked and their search term. Google says local information in multisearch will be available to all users globally later this year in English, with support for other languages to be added in the future.
Introduced last month, multisearch is a feature touted by Google as one of its “most significant upgrades” to the search engine in several years. Multisearch allows users to click an image of an object or product — such as a dress or home décor — then swipe up to add text for a ‘combined’ search query. Users can click an image of an orange dress, then add the query ‘green’ or ‘blue’ to search for similar products in another colour, or click an image of a house plant and add a query for ‘care instructions’.
To find local results using multisearch, the company says it scans millions of images and reviews posted on web pages and contributed by the Maps community to surface results from nearby locations. The feature, which relies on machine learning, can be used to find out which restaurant near you serves a particular dish, or which local retailer stocks a product, according to Google.
The company is also working on expanding the multisearch feature on Google Lens with a new capability called 'scene exploration'. Google says that users will be able to use multisearch to pan their camera and see information about "multiple objects in a wider scene." For example, shoppers could scan an entire shelf of products and see insights in an overlay on their screen.
Google says that it plans to bring scene exploration to multisearch in the future, but has not revealed which regions will have access to the feature or which languages will be supported. “Scene exploration is a powerful breakthrough in our devices’ ability to understand the world the way we do – so you can easily find what you’re looking for,” said Prabhakar Raghavan, Senior Vice President at Google.
In another announcement related to its search product on Wednesday, Google said it will add the ability to request the removal of phone numbers, home addresses, and email addresses via the Google app in the coming months. The company announced last month that it was expanding its policies on removing personal information from search results to all users.