The Ray-Ban Meta smart glasses are set to receive significant upgrades thanks to improvements in Meta's AI assistant. The company is now rolling out real-time information access for the onboard assistant and testing new "multimodal" capabilities that let it answer questions based on the user's environment.
Previously, Meta AI had a knowledge cutoff of December 2022, meaning it couldn't provide information on current events, game scores, traffic conditions, and other real-time queries. However, Meta CTO Andrew Bosworth announced that all Meta smart glasses in the United States will now have access to real-time information, powered in part by Bing.
Meta is also testing the assistant's "multimodal AI" capabilities, which allow it to answer contextual questions about the user's surroundings based on what they are looking at through the glasses.
These updates aim to make Meta AI feel more useful and less gimmicky, addressing earlier criticisms of the smart glasses. However, access to the new multimodal functionality will be limited at first, with a broader rollout expected in 2024.
Videos of the new capabilities have been shared, demonstrating how users engage with the feature through voice commands. For example, a user can ask Meta AI to look at an object and provide information or suggestions, such as outfit recommendations based on a specific item of clothing.
In a post on Threads, Bosworth also said that users will be able to ask Meta AI about their immediate surroundings and for help with creative tasks, such as writing captions for photos they have taken.