Android XR Glasses: A New Era of Wearable Computing – Live Demo at Google I/O 2025

🧑‍🎤 Presenter Script / Voiceover:

“Good [morning/afternoon], everyone!
Welcome back to the future of computing — where digital and physical worlds come together seamlessly. Today, we’re incredibly excited to show you a live demonstration of our next big leap in wearable technology: Android XR Glasses.”

“These sleek, lightweight glasses are powered by a custom version of Android — optimized for extended reality experiences across augmented, virtual, and mixed reality environments.”

🔍 Demo Introduction:

“Let’s jump into the demo. Here’s what you’ll see today:

  • Seamless integration with your Android ecosystem

  • Real-time contextual AR overlays

  • Voice, gesture, and eye-tracking inputs

  • Native apps and third-party integrations built using the new Android XR SDK

  • And yes — it runs on Android, so developers can build for it just like any other device.”

🖥️ Scene 1: Setup & Interface

“First, I’ll power on the glasses. With a simple tap on the frame, they boot up instantly and pair automatically with my Pixel phone via UWB and Bluetooth LE Audio.”

“Once connected, I’m greeted with a spatial launcher — floating widgets, holographic notifications, and a familiar Android-style home view projected into my field of vision.”

“This is Android — reimagined for spatial computing.”

🕹️ Scene 2: Real-World Use Case – Smart Navigation

“Now, let me walk you through a real-world scenario. Imagine I’m walking through a city and need directions to a meeting.”

[Presenter looks around. The glasses overlay an arrow pointing down the street.]

“As I move, the system uses on-device AI and visual positioning to provide turn-by-turn navigation directly in my line of sight — no need to look down at my phone.”

“I can also ask, ‘Hey Google, how far am I from Union Square?’ and get a voice response with a visual overlay.”

📱 Scene 3: Messaging & Communication

“What if I receive a message while walking? Instead of pulling out my phone…”

[Incoming notification appears in front of presenter. They glance at it.]

“I can read the message, reply via dictation, or even answer a video call using the built-in camera and spatial audio.”

“And thanks to advanced beamforming mics, background noise is filtered out — making conversations crystal clear.”

🧠 Scene 4: Developer Experience & App Ecosystem

“Behind the scenes, these glasses run a powerful, customized version of Android 15, with full support for Kotlin, Jetpack Compose, and the brand-new Android XR SDK.”

“Developers can now create immersive apps that respond to head pose, hand gestures, and environmental anchors — all using familiar tools like Android Studio.”
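For developers following along, a minimal sketch of a floating spatial panel built with the SDK’s Compose support might look like the snippet below. The Subspace, SpatialPanel, and SubspaceModifier names follow the Jetpack Compose for XR developer preview and are assumptions here; the shipped Android XR SDK may differ.

```kotlin
// Minimal sketch: a movable, resizable spatial panel rendered with Jetpack Compose.
// The androidx.xr.compose imports follow the developer-preview artifacts and are
// assumptions; verify names against the released Android XR SDK.
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun NavigationCard(onStart: () -> Unit) {
    // Subspace hosts spatial content; SpatialPanel places a 2D Compose surface
    // in the wearer's field of view as a movable, resizable panel.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(512.dp)
                .height(320.dp)
                .movable()
                .resizable()
        ) {
            Column(modifier = Modifier.padding(16.dp)) {
                Text("Meeting at Union Square – 12 min walk")
                Button(onClick = onStart) { Text("Start navigation") }
            }
        }
    }
}
```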

“Let’s take a look at a sample app: a 3D modeling tool that lets me place objects in space and manipulate them with hand gestures.”

[Presenter gestures mid-air, manipulating a 3D model of a house.]

“This is just one example of how developers can bring their creativity to life on this platform.”
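To illustrate the kind of logic behind that gesture manipulation, the sketch below maps per-frame pinch and hand-rotation deltas onto a model’s scale and yaw. The GestureFrame input type is hypothetical rather than part of the Android XR SDK; it stands in for whatever hand-tracking data the platform delivers each frame.

```kotlin
// Illustrative only: how a stream of hand-tracking frames might drive a model's
// transform. GestureFrame is a hypothetical stand-in for real tracking data.
data class GestureFrame(
    val pinchDistanceMeters: Float,  // distance between thumb and index fingertips
    val handYawRadians: Float        // hand rotation around the vertical axis
)

data class ModelTransform(
    var scale: Float = 1f,
    var yawRadians: Float = 0f
)

class GestureMapper(private val model: ModelTransform) {
    private var lastFrame: GestureFrame? = null

    // Apply per-frame deltas: widening the pinch scales the model up,
    // rotating the hand rotates the model by the same angular delta.
    fun onFrame(frame: GestureFrame) {
        lastFrame?.let { prev ->
            if (prev.pinchDistanceMeters > 0f) {
                model.scale *= frame.pinchDistanceMeters / prev.pinchDistanceMeters
            }
            model.yawRadians += frame.handYawRadians - prev.handYawRadians
        }
        model.scale = model.scale.coerceIn(0.1f, 10f)  // keep the model a sane size
        lastFrame = frame
    }
}
```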

🌐 Scene 5: Multi-User & Shared Experiences

“But we didn’t stop there. We’ve also built in support for multi-user shared spaces.”

“So two people wearing Android XR Glasses can collaborate in real time — whether it’s reviewing a design, playing a game, or even sharing a movie in a virtual theater.”

[Demo shows two users looking at the same object in space, interacting with it simultaneously.]

🏁 Closing Remarks:

“This is just the beginning. Android XR Glasses represent a bold step forward in personal computing — blending utility, privacy, and immersion in a form factor that fits into your everyday life.”

“We can’t wait to put this into the hands of developers and users alike later this year.”

“Thank you — and stay tuned for more updates during I/O!”

💡 Tips for Presenters:

  • Use real-time interaction rather than pre-recorded footage.

  • Highlight developer accessibility — this is key for adoption.

  • Show accessibility features, privacy controls, and battery life estimates.

  • Consider a live Q&A after the demo to address audience questions.


