Google is adding a powerful new capability to its already-formidable Google Search platform. The company will begin surfacing search results in augmented reality, it announced at its I/O developer conference.

The best way to understand the world, according to CEO Sundar Pichai, is to see it visually. He added that the idea is to bring visual information directly into Search by letting users view results through their phone's camera.

Google Search Gets The Augmented Reality Treatment

To power this, Google is combining computer vision and augmented reality to turn the phone into a more dynamic search tool. For example, say a person searches for a scientific concept, such as how a planet rotates on its axis. Rather than just showing text, Google will deliver an interactive 3D model that demonstrates the concept, and because the model is rendered in AR, the user can view it in real time, placed in the real world through the camera.

But the feature will also work beyond education. Shoppers stand to become better informed, since they'll be able to view, say, a pair of shoes from every angle in AR. Of course, this will only work with supported brands, such as New Balance.

As CNET reports, 3D AR objects will begin surfacing in search results later this year, and developers will be able to add support for their own objects by adding just "a few lines of code." Participating brands include NASA, Samsung, Target, Volvo, and, as mentioned, New Balance, but more are expected to join Google's AR search bandwagon down the line.
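What "a few lines of code" might look like in practice wasn't spelled out, but one plausible route is Google's open-source <model-viewer> web component (the @google/model-viewer npm package), which lets a web page embed an AR-viewable 3D model. The sketch below is purely illustrative: the asset name and setup are assumptions, not Google's confirmed integration path for Search.

```typescript
// Hypothetical sketch: embedding an AR-viewable 3D model on a web page
// with Google's open-source <model-viewer> web component.
// Assumptions: a bundled browser environment and a glTF asset named
// "sneaker.glb" (both illustrative, not taken from the announcement).
import '@google/model-viewer'; // registers the <model-viewer> custom element

const viewer = document.createElement('model-viewer');
viewer.setAttribute('src', 'sneaker.glb');             // 3D asset in glTF/GLB format
viewer.setAttribute('alt', 'A 3D model of a sneaker'); // accessibility description
viewer.setAttribute('ar', '');                         // allow viewing the model in AR
viewer.setAttribute('camera-controls', '');            // let the user rotate and zoom
document.body.appendChild(viewer);
```

In plain markup terms, that amounts to a single <model-viewer src="sneaker.glb" ar> tag, which is roughly the scale of effort the announcement implies.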

Google has been toying with AR for years, having launched its ARCore platform in 2017. Most of the results so far have been playful rather than genuinely functional, Playmoji being one example. This new move could kick-start the company's transition toward more practical uses for AR. Google has also begun testing turn-by-turn augmented reality directions in Google Maps, a feature announced at last year's I/O keynote. Meanwhile, apps such as Wayfair already let people preview furniture in AR.

Google Lens

Google Lens is also getting an upgrade. The tool, which uses artificial intelligence and machine learning to recognize objects and text, is getting even smarter. Now, if a user points their camera at a restaurant menu, Google Lens will highlight that restaurant's most popular dishes. It will also translate foreign-language signage in real time.
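For a rough sense of the kind of computer vision such features build on (not Lens's actual, unpublished internals), the hedged sketch below uses the Google Cloud Vision API's Node.js client, a publicly available analogue, to pull text out of a photo such as a menu or sign. The file name and credential setup are assumptions.

```typescript
// Illustrative only: server-side text detection with the Google Cloud
// Vision API's Node.js client (an analogous public API, not what Google
// Lens runs on-device). Assumes Google Cloud credentials are configured
// and that a local photo named "menu.jpg" exists.
import { ImageAnnotatorClient } from '@google-cloud/vision';

async function readSignOrMenu(imagePath: string): Promise<string> {
  const client = new ImageAnnotatorClient();
  // textDetection performs OCR; the first annotation holds the full detected text.
  const [result] = await client.textDetection(imagePath);
  return result.textAnnotations?.[0]?.description ?? '';
}

readSignOrMenu('menu.jpg').then((text) => console.log(text));
```

Text detected this way could then be handed to a translation service, which is, broadly speaking, the pipeline a feature like real-time sign translation implies, performed in a single step on the phone.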
