The Ray-Ban Meta smart glasses are set to receive significant upgrades with the integration of AI-powered visual search features. Meta's AI assistant, embedded in the glasses, will now offer real-time information, a substantial enhancement over its previous capabilities.

(Photo: Justin Sullivan/Getty Images) Ray-Ban Smart Glasses on display during a media preview at the new Meta Store in Burlingame, California, on May 4, 2022, ahead of the opening of Meta's first physical retail store on May 9.

Upgrading Ray-Ban Meta Smart Glasses 

Ray-Ban Meta smart glasses are on the verge of a significant upgrade, courtesy of advancements in the social network's AI assistant. Meta is adding support for real-time information to the onboard assistant and beginning tests of new "multimodal" capabilities.

Engadget reported that this enhancement enables the AI assistant to respond to questions based on the user's environment. 

Meta CTO Andrew Bosworth announced that all Meta smart glasses in the United States will now have access to real-time information, powered in part by Bing.

The latest updates could significantly improve the practicality of Meta AI, addressing the perception that it was something of a gimmick, a point raised in Engadget's initial review of the otherwise commendable smart glasses.


Until now, Meta AI had a "knowledge cutoff" of December 2022, restricting its ability to answer questions about current events, game scores, traffic conditions, and other on-the-go essentials.

In a separate development, WWD reported that Meta has begun testing one of its assistant's more intriguing capabilities, known as "multimodal AI."

First showcased at Connect, these features allow Meta AI to answer contextual questions about the user's surroundings, responding to queries based on what the wearer is seeing through the smart glasses.

Early Access for Beta Version

Widespread access to the new multimodal functionality may still be a while away for most smart glasses users. According to Bosworth, the early access beta will initially be limited to a small number of people in the United States who opt in, with broader availability expected in 2024.

CNET reported that Mark Zuckerberg has offered a glimpse of the possibilities in a few shared videos showcasing the new capabilities. The clips suggest that users can trigger the feature with commands like "Hey Meta, look and tell me."

During a recent demonstration, Zuckerberg offered a firsthand look at Meta AI's expanding capabilities. In one notable example, he presented a shirt to the assistant and asked it to recommend matching pants.

The scenario illustrates how Meta AI could be woven into everyday decision-making, particularly around personal style and fashion choices, a practical and user-friendly application of the technology.

Beyond these examples, Bosworth, in a video shared on Threads, outlined the assistant's broader functionality: users can ask the AI about their immediate surroundings, extending its utility to real-time information retrieval and environmental awareness.


Written by Inno Flores
