I've had the Samsung Galaxy XR on my face for three months now, and here's what keeps nagging at me: this thing feels less like a new headset and more like a platform declaration. At $1,799, it's roughly half what Apple charges for the Vision Pro, and while it still has that first-gen roughness around the edges, the software story underneath it is what has me paying attention. Google isn't just shipping a headset operating system. They're running the Android playbook all over again, this time for spatial computing.
And after watching their MWC 2026 demo, where a pair of prototype glasses pulled up turn-by-turn directions just by looking at a poster of Camp Nou, I'm starting to think the playbook might actually work.
Five devices, one platform, one year
The raw numbers tell the story better than any pitch deck. At least five Android XR devices are expected to ship in 2026: the Galaxy XR headset that's already out, flat-AR display glasses from Samsung and from Xreal (whose Project Aura is a joint effort with Google and Qualcomm), and non-display AI glasses from Warby Parker and Gentle Monster. That's headsets, tethered AR glasses, and fashion-forward AI frames all running the same underlying platform.
Hugo Swart, Google's Android XR Ecosystem Lead, laid this out at the AR/VR/MR 2026 conference. The same Android XR software, he said, will run across the full range: camera-passthrough headsets, wired XR glasses, wireless glasses, and AI-only frames with no display at all. He called the Galaxy XR "the powerhouse of our ecosystem" and described two distinct types of AI glasses Google is developing: ones with only cameras, mics, and speakers, and display-equipped models with smaller-FOV waveguides for what he called "glanceable content."
This multi-form-factor approach is what separates Android XR from everything else on the market right now. Meta's Horizon OS runs on Quest headsets. Apple's visionOS runs on the Vision Pro. Android XR is designed to scale across hardware categories the way Android scaled across phone manufacturers.
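You can see what that scaling looks like in code. In the developer preview of Google's Jetpack XR SDK, a single Compose code path asks the platform at runtime what the hardware underneath can actually do, and falls back to a flat layout when spatial UI isn't available. Here's a minimal sketch assuming the preview's LocalSpatialCapabilities API; these are preview-era names that could shift before a stable release:

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.xr.compose.platform.LocalSpatialCapabilities

@Composable
fun AppRoot() {
    // One app, one UI tree; the platform reports what this device can do.
    if (LocalSpatialCapabilities.current.isSpatialUiEnabled) {
        SpatializedUi() // spatial UI available: float panels in the user's space
    } else {
        FlatUi() // everything else: the same app in a conventional 2D window
    }
}

// Stand-in composables for the two presentations of the same app.
@Composable
fun SpatializedUi() { Text("Spatial layout") }

@Composable
fun FlatUi() { Text("Flat layout") }
```

That's the whole pitch in a dozen lines: partners vary the hardware, and a capability check absorbs the difference.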
What Google showed at MWC actually made sense
I've seen a lot of smart glasses demos over the years. Most of them are solutions looking for problems. The Android XR prototypes Google showed at MWC 2026 felt different, and I think it's because they focused on friction points people actually experience.
The standout demo: you look at a photo of Barcelona's Camp Nou stadium, ask Gemini to "navigate here," and turn-by-turn directions appear in the lens, white text in the center with a familiar Maps-style route visualization when you glance down. No pulling a phone out of your pocket. No fumbling with screens while walking. The glasses recognized what you were looking at and responded with contextually useful information.
The hardware itself resembles Ray-Ban Wayfarers. A full-color waveguide display sits in the right lens, keeping information in your peripheral vision without blocking your view. Touch the right temple to activate Gemini Live, tap the bridge to pause it. Hands-on reports say you'd learn the controls in minutes.
These are tethered prototypes, pairing with an Android phone for processing power. That's a deliberate architectural choice, not a limitation. It means a smaller battery, lighter weight, and access to your phone's data and AI capabilities without needing another standalone device. Google learned from the Google Glass era that cramming everything into the glasses themselves was the wrong approach.
The developer ecosystem gap (and why it matters most)
Here's where I have to be honest about the state of things. Platform ambition is great, but platforms live or die on developer ecosystems. And right now, Meta has the lead.
Meta sold nearly 2 million Ray-Ban smart glasses and built a real installed base. They're providing a Wearable Device Access Toolkit for outside developers. At GDC 2026, Meta showcased new tools for speeding up builds and optimizing store discovery on Horizon OS. React Native now officially supports Meta Quest, which lowers the barrier for web developers jumping into XR. Meta's app store has years of content and a developer community that knows how to ship and monetize.
Android XR is starting behind on content. The Galaxy XR can run standard Android mobile apps, which gives it a massive library out of the gate, but "phone apps on a headset" has never been a compelling experience. What the platform needs are native spatial apps built specifically for XR, and that takes time, developer tools, and a reason for developers to show up.
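To be fair to Google, the jump from phone app to native spatial app is less of a rewrite than you might expect. In the Jetpack XR developer preview, an existing Compose screen gets lifted into a free-floating, user-movable panel just by wrapping it. Here's a sketch against that preview (again, Subspace, SpatialPanel, and the SubspaceModifier names are pre-release and may change):

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatialHome() {
    // Subspace opens a 3D volume for the app to draw into.
    Subspace {
        // SpatialPanel floats ordinary 2D Compose content inside that volume.
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)  // panel dimensions in the user's space
                .height(640.dp)
                .movable()       // user can grab and reposition the panel
                .resizable()     // and resize it, like a window hanging in mid-air
        ) {
            // Any existing Compose screen drops in here unchanged.
            Text("Hello, Android XR")
        }
    }
}
```

If spatializing an app really is that incremental, Google's bet on the Android long tail showing up starts to look less like wishful thinking.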
Google's bet is that the same network effects that built Android's phone ecosystem will repeat here: more hardware partners means more users, more users means more developers, more developers means better apps. According to AR Insider's analysis, Android XR's open model lets manufacturers focus on hardware innovation while Google handles the OS, which could unlock scale across the sector. Samsung's Galaxy XR is projected to sell over 100,000 units in 2026, and with Xreal, Warby Parker, and Gentle Monster all shipping Android XR devices, the installed base could grow faster than any single manufacturer could achieve alone.
Where the competition stands
The three-way OS race is now official. According to THE ELEC's analysis, the AR glasses operating system competition will narrow to Google (Android XR), Meta (Horizon OS), and Apple (visionOS), with the outcome shaped by developer platforms, AI integration, and spatial computing engines.
Meta is in a peculiar position. They built a successful product and then delayed their next big move. Their full AR glasses, codenamed Phoenix, are reportedly pushed to 2027. That leaves a window. And here's the irony: Horizon OS is itself built on the Android Open Source Project. Meta built its XR ecosystem on Google's mobile foundation, and now Google is coming to compete directly in XR with a more open version of that same foundation.
Apple is doing Apple things with Vision Pro, keeping everything locked down and premium. The $3,499 price point puts it in a different market segment entirely.
On the hardware side, Samsung's upcoming smart glasses will run Qualcomm's AR1 platform, purpose-built for all-day wearable use rather than repurposed headset silicon. The AR1+ Gen 1 variant reportedly achieves a 28% size reduction while enabling on-device AI processing. Samsung's EVP of Mobile Experiences Seong Cho described the initiative as being in the "execution phase," targeting rich multimodal AI experiences.
What needs to improve
I want this to work. I've been rooting for spatial computing since I strapped on a DK1 and nearly fell off my chair. But wanting it to work and pretending it already does are different things.
The Galaxy XR still doesn't include controllers in the box, which Android Central's review flagged as a real barrier to recommending it. The XR app ecosystem is thin compared to Quest's. Google's AI glasses are prototypes, not products, and the tethered design means you need a compatible phone. Privacy questions around always-on cameras and AI-powered visual recognition are real and largely unanswered. Google's MWC demo included an AI feature that could digitally alter how people appear in real time, which raises obvious consent issues nobody has solved.
And the Android smartphone analogy has limits. Smartphones already had a proven use case (making phone calls) when Android arrived. Spatial computing is still establishing what daily use actually looks like for most people. The "why should I wear this" question hasn't been answered for glasses the way it was answered for phones.
What's next
The second half of 2026 is when this gets real. Samsung's smart glasses are expected to ship. Project Aura, the Xreal collaboration with Google and Qualcomm, has developer kits reportedly coming soon. At MWC, the Aura prototype showed off a 70-degree field of view using Sony's micro-OLED display and Xreal's optical engine.
Google's strategy is clear: own the platform layer, let partners fight over hardware, and use Gemini AI as the differentiator that makes the whole thing useful. It's the Android model, updated for a world where the interface isn't a touchscreen but the space around you.
Whether that strategy works depends on whether developers build for it. Right now, Meta's ecosystem is more mature, Apple's hardware is more polished, and Google's platform is the most ambitious. Ambitious doesn't always win. But I've watched enough platform wars to know that the open ecosystem usually catches up, and when it does, it tends to pull away. Android XR has the right architecture for scale. What it needs now is time, patience, and apps worth wearing a headset or glasses for.
Ren Wilder covers mixed reality for The Daily Vibe.