Tuesday, March 24, 2026


Meta brings Gemini-like voice commands to Ray-Ban and Oakley glasses

Meta has officially begun rolling out its v23 software update, which introduces a “Gemini-like” conversational experience to its Ray-Ban Meta and Oakley Meta HSTN smart glasses.

The update, which first hit the “Early Access” program on March 18, 2026, transforms the way users interact with Meta AI, moving from rigid, one-off commands to a fluid, continuous dialogue system similar to Google’s Gemini Live or OpenAI’s Advanced Voice Mode.


1. Natural Conversations (The “Gemini” Experience)

The standout feature of v23 is the shift toward “more natural” conversations. This upgrade removes several friction points from the hands-free experience:

  • Continuous Listening: You no longer need to say “Hey Meta” before every follow-up question. Once a conversation is started, the glasses remain active to hear your next thought.
  • Interruptibility: Just like Gemini Live, you can now interrupt Meta AI while it is speaking. If the AI is giving too much detail or you want to pivot to a new topic, you can simply speak over it to redirect the conversation.
  • Contextual Memory: The AI can now better maintain context across multiple turns, allowing for complex multi-step tasks like planning a workout or troubleshooting a recipe while you’re actually doing it.
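The session behavior described above (a wake word to open the conversation, follow-ups without one, and context carried across turns) can be sketched in a few lines. This is a purely illustrative model: the class, method names, and timeout are assumptions, not anything from Meta's actual software.

```python
import time

class ConversationSession:
    """Hypothetical model of a wake-word-free dialogue session."""

    WAKE_WORD = "hey meta"
    IDLE_TIMEOUT_S = 30  # assumed: session closes after this much silence

    def __init__(self):
        self.active = False
        self.history = []          # contextual memory across turns
        self.last_heard = time.monotonic()

    def hear(self, utterance: str):
        """Route a transcribed utterance; returns a reply, or None if ignored."""
        now = time.monotonic()
        if self.active and now - self.last_heard > self.IDLE_TIMEOUT_S:
            self.active = False    # session expired; wake word required again

        text = utterance.lower().strip()
        if not self.active:
            if not text.startswith(self.WAKE_WORD):
                return None        # ambient speech: no wake word, no session
            text = text[len(self.WAKE_WORD):].strip(" ,")
            self.active = True     # session opens; follow-ups need no wake word

        self.last_heard = now
        self.history.append(("user", text))
        reply = self._respond(text)
        self.history.append(("assistant", reply))
        return reply

    def _respond(self, text: str) -> str:
        # Placeholder for the model call, which would see self.history;
        # interruptibility would cancel this step when new speech arrives.
        return f"(answer to {text!r}, with {len(self.history)} turns of context)"
```

In this sketch, "interruptibility" would amount to cancelling `_respond` mid-stream when new user audio is detected, then treating that audio as the next turn.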

2. Expansion into Oakley & Snow Sports

The update also marks a major push into the “active” segment, formally integrating the Oakley Meta HSTN into the high-end AI feature set.

  • Snow Sports Tracking: Meta has partnered with Garmin to bring real-time stats to the glasses. When skiing or snowboarding, users can ask: “Hey Meta, what was my max speed on that last run?” or “How many vertical feet have I descended today?”
  • Automatic Capture: The AI can now “autocapture” key moments during high-intensity sports, such as a steep descent or a jump, by detecting “significant moments” via the connected Garmin sensors.
  • Resort Intelligence: Users can get hands-free updates on lift statuses, trail grooming, and weather conditions for major resorts directly through their lenses.
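The "significant moments" trigger described above is, at its core, threshold detection on a sensor stream. The sketch below shows one plausible shape for it; the field names and threshold values are assumptions for illustration, not Garmin's or Meta's actual API.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    speed_kmh: float      # instantaneous speed from the paired watch
    vertical_mps: float   # vertical descent rate, metres per second
    airborne_ms: int      # time with no ground contact (jump detection)

def is_significant(sample: SensorSample,
                   speed_thresh: float = 45.0,
                   drop_thresh: float = 8.0,
                   air_thresh: int = 400) -> bool:
    """Flag a moment worth capturing when any metric crosses its threshold."""
    return (sample.speed_kmh >= speed_thresh
            or sample.vertical_mps >= drop_thresh
            or sample.airborne_ms >= air_thresh)

def autocapture(stream):
    """Yield the samples that would start a clip."""
    for sample in stream:
        if is_significant(sample):
            yield sample
```

A real implementation would also debounce the trigger (so one jump starts one clip, not dozens), but the decision itself is just this kind of per-sample test.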

3. Real-Time Translation: Early Access Expansion

Meta is aggressively expanding its Live Translation capabilities, which provide real-time audio translation directly through the glasses’ open-ear speakers.

New support in the v23 update:

  • New Languages: Hindi, Arabic, Russian, Swedish, and Finnish
  • Download-Free: No need to download language packs; translations happen in the cloud.
  • Availability: US and Canada (Early Access Program).

The “Neural Wristband” & Future Hardware

While v23 enhances existing frames, it serves as a bridge to the Meta Ray-Ban Display (codenamed “Hypernova”), which is expected to ship later this year.

  1. In-Lens Display: The upcoming model will feature a private heads-up display (HUD).
  2. Neural Wristband: Instead of voice-only controls, users will use a surface electromyography (sEMG) wristband to navigate menus and “type” messages using subtle finger movements, all while the AI listens and responds in the background.
  3. “Super Sensing”: Leaked reports suggest 2026 models will support “always-on” AI that can run for hours, allowing the glasses to remind you of things you’ve seen (like where you left your keys) throughout the day.

“By making the AI interruptible and removing the ‘Wake Word’ requirement for follow-ups, Meta is trying to make its glasses feel less like a gadget and more like a second brain,” noted one reviewer from Tom’s Guide.
