Wednesday, March 18, 2026

Spotify to bring its services to smart glasses

In a series of back-to-back announcements on March 17-18, 2026, Spotify signaled its intent to become the dominant audio layer for the wearable revolution. From AI-driven “visual music discovery” on Meta’s smart glasses to a new dedicated interface for XR (Extended Reality) eyewear, the streaming giant is moving beyond the smartphone screen and directly into the user’s line of sight.

Meta Ray-Ban: “Soundtrack Your World”

In partnership with Meta, Spotify has launched its first “multimodal AI music experience” via the v21 software update for Ray-Ban Meta and Oakley Meta glasses.

  • Visual Search for Music: By using the glasses’ built-in cameras and Meta AI, users can now say, “Hey Meta, play a song to match this view.” The AI analyzes the scene—whether it’s a sunset, a workout, or a festive party—and generates a custom Spotify playlist tailored to the visual context and the user’s personal taste.
  • Conversation Focus: While not a Spotify-exclusive feature, the update includes “Conversation Focus,” which isolates and amplifies the person you’re speaking to, allowing your Spotify music to play in the background without drowning out human interaction.
  • Regional Rollout: The feature is currently available in English across 19 countries, including India, the US, UK, and Canada.

The XR “Beta” Leak: Now Playing & Lyrics

While the Meta partnership is audio-first, a recent APK teardown of the Spotify Android beta (reported by Android Authority and TechRadar on March 17, 2026) has revealed a new visual layer specifically designed for XR glasses (like those coming from Google and Samsung).

  • Glanceable Displays: The code points to a “Now Playing” tile that floats in your line of sight, allowing you to see track details without reaching for your phone.
  • In-View Lyrics: A dedicated “Lyrics” screen is in development, which would sync with the music in real time on the glasses’ display—essentially turning your commute or workout into a “solo karaoke” session.
  • Phone-Tethered Architecture: Following Google’s Android XR guidelines, the app doesn’t run fully on the glasses. Instead, the phone handles the processing and “projects” the interface to the eyewear to save battery and reduce heat.
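To illustrate the phone-tethered split described above, here is a deliberately simplified sketch: the phone owns playback state and sends only a small, glanceable payload to the glasses, which merely render it. All names and the wire format are hypothetical; the actual Android XR projection protocol is not public.

```python
import dataclasses
import json

@dataclasses.dataclass
class NowPlayingTile:
    """Glanceable state computed on the phone (hypothetical structure)."""
    track: str
    artist: str
    progress_ms: int
    duration_ms: int

    def to_wire(self) -> bytes:
        # A compact JSON payload keeps the radio link cheap, which is the
        # point of the architecture: less work on the glasses means less
        # battery drain and heat.
        return json.dumps(dataclasses.asdict(self)).encode("utf-8")

    @classmethod
    def from_wire(cls, payload: bytes) -> "NowPlayingTile":
        return cls(**json.loads(payload.decode("utf-8")))

def render_on_glasses(tile: NowPlayingTile) -> str:
    # The glasses side only formats and displays; no audio decoding,
    # no recommendation logic runs on the eyewear itself.
    pct = 100 * tile.progress_ms // tile.duration_ms
    return f"{tile.track} - {tile.artist} ({pct}%)"

# Phone side: build the tile and "project" it over the link.
tile = NowPlayingTile("Example Track", "Example Artist", 60_000, 240_000)
payload = tile.to_wire()

# Glasses side: decode and render the received payload.
print(render_on_glasses(NowPlayingTile.from_wire(payload)))
```

The design choice this models is the one the teardown describes: the heavy lifting (playback, search, personalization) stays on the phone, and the eyewear receives only display-ready state.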

Comparison of Smart Glasses Support

Feature         Meta Ray-Ban (Current)          XR Glasses (In Beta)
Interface       Voice & Gesture Only            Visual Overlay (HUD)
Visual Input    AI “sees” view to play music    Displays lyrics & track info
Navigation      Basic (Play/Pause/Skip)         Glanceable Song Details
Hardware        Ray-Ban/Oakley Meta             Google, Samsung, Xreal

Why This Matters: The Battle for “Context”

Spotify is moving early to ensure it isn’t sidelined by hardware manufacturers’ own music services. By integrating visual context (on Meta) and glanceable data (on XR), Spotify is attempting to make its algorithm feel less like a “search bar” and more like an “environmental layer.”

As these wearables transition from accessories to daily necessities, being the “default” soundtrack for your vision is a key part of Spotify’s 2026 growth strategy—particularly as it competes with Apple, which is expected to deepen its own Music integration for the Vision Pro and rumored “Apple Glass.”
