
One of the most common dilemmas Google users face is searching for items they want online but can't describe. Often, they don't even know the exact name of the item they are looking for.

Luckily, Google has come up with a way to solve this problem so that people can find what they want more easily.

Google Lens Multisearch Feature

In an exclusive interview with The Verge, Google Search's product manager Belinda Zeng revealed how the new Google feature works.

On Apr. 7, Google launched a US-only beta of the Google Lens Multisearch feature that it teased in September 2021 at the Google Search On event. 

While the public has so far seen only a rough demo of the feature, users won't have to wait long to try it themselves, as it is rolling out in the Google app on iOS and Android, according to Engadget.

While the feature is mostly aimed at shopping to start, since that was one of the most common requests, Zeng and Google Search director Lou Wang suggested it could do much more.

Also Read: 'How to Buy Squid Game Token' Searches on Google Still Trends-Binance Investigates Crypto Scam

Wang said that users could get help fixing a broken item in front of them, even without the words to describe it, by snapping a photo of it and typing "how to fix."

In fact, it might already work with some broken bicycles, according to Zeng. She told The Verge that she also learned about styling nails by screenshotting pictures of nails on Instagram and then typing the word "tutorial" to get the kind of video results that did not automatically come up on social media.

You may also be able to take a picture of a plant or an animal and get complete instructions on how to take care of it.

Wang said that Google wants to help people understand questions naturally. He explained how multisearch would expand to more videos, images, and even the kinds of answers you might find in a traditional Google text search.

How the Feature Works

The intent of the feature is to put everyone on an even footing. Instead of partnering with specific shops or even limiting video results to Google-owned YouTube, Wang said the feature would surface results from any platform Google can index from the web.

The feature won't work with everything, though, just as the Google voice assistant does not work with everything. This is because there are infinite possible requests, and Google is still figuring out user intent.

For example, if you use the feature to look for a certain brand of notebook, you need to make sure it gets a clear view of the notebook; otherwise, it will just assume that you want more notebooks.

Google is hoping that AI models can usher in a new era of search, but there are still plenty of open questions about whether context can take it there.

According to Gizmodo, Google has not revealed if it plans to roll out the feature worldwide, but it may expand the feature to other users if the US beta version does well.

In 2021, TechTimes shared how users can get good search results on Google.

Related Article: Google Search Hacks: How to Use Quotation Marks, Hyphens, and More to Find Things on Google

This article is owned by Tech Times

Written by Sophie Webster

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.