Ray-Ban Meta Glasses Now Feature Real-Time Video Queries and Enhanced AI Capabilities
Users can now ask the Ray-Ban Meta Glasses to identify landmarks or famous spots in real time while exploring new locations.
Meta Platforms unveiled significant updates to its Ray-Ban Meta Glasses at its Connect event, adding features aimed at making real-time interaction with the glasses easier. One standout addition is video queries, which let wearers ask questions about the landmarks or places they are looking at as they see them. The real-time query function is part of Meta’s push to build more advanced AI capabilities into everyday wearable tech.
Enhanced Ray-Ban Meta Glasses with AI
In a blog post, Meta announced new commands and easier ways to interact with Meta AI. The glasses respond to a “Hey Meta” wake-up phrase, streamlining conversations with the AI assistant, and users no longer need to preface questions about what they are seeing with the “look and” command. Together, these changes make interacting with the glasses more seamless, whether you’re exploring a city or having a general Q&A with Meta AI.
New AI Capabilities for Real-Time Exploration
One of the most notable updates is the ability to query landmarks and famous locations in real time. Wearers can ask the glasses to identify notable places as they come into view, making exploration more interactive and putting immediate information at the fingertips of travelers and anyone discovering a new environment.
Reminders and Memory Functionality
Meta is also adding a memory function to the Ray-Ban Meta Glasses, letting users set voice-prompted reminders. For instance, a wearer can ask the glasses to remember where they parked their car, and the glasses will recall that information later. This could prove especially useful in busy places like airports or shopping malls.
Integration with Meta AI
Meta AI’s integration into the Ray-Ban glasses isn’t limited to real-time queries. Users can now hold more natural conversations with the AI assistant, including follow-up questions without having to repeat the wake-up command. This makes the interaction more fluid and conversational, further reducing the need for repetitive commands and enhancing the overall user experience.
Expanded Voice Commands
The glasses also support additional voice commands, such as setting reminders or running searches. These voice-controlled features extend the hands-free experience beyond questions about the wearer’s surroundings, expanding the glasses’ utility into more practical, everyday tasks.
With these updates, Meta is positioning its Ray-Ban Meta Glasses as a more integrated tool for everyday use, combining style with smart functionality. These features are likely to appeal to users seeking a more interactive and convenient way to explore the world around them.