I tried Google’s prototype smart glasses and it almost made me forget about my phone - CNN
I tried Google’s prototype smart glasses—here’s what stood out and why they might keep your phone in your pocket more often.
The big idea
Google is betting that AI-powered glasses are the next computing platform, shifting quick-look tasks—directions, calls, object info, translations—off your phone and into your line of sight. The glasses run on Android XR with Gemini AI and are slated for launch next year.
What they can do now
- Hands-free navigation: An arrow appears near your sightline, with a glance-down map for context—no constant phone checks.
- Real‑time answers: Ask Gemini about what you’re looking at (e.g., “Are these peppers spicy?” or “Do I need to read earlier books in this series?”).
- Calls, photos, and translations: Interactions happen by voice and subtle glance cues.
- On‑device creativity: A single voice command transformed a room photo into a North Pole scene via Google’s Nano Banana AI model—impressive, though it raises image-authenticity concerns.
What’s changed since earlier demos
- Faster visual understanding and image editing.
- More polished navigation cues.
- Expanded Gemini integrations for look-and-ask moments.
Privacy and social acceptance
- A visible indicator light shows when the camera or AI image editing is active.
- Users can delete prompts and activity in the Gemini app.
- Google says social acceptance is mission‑critical after lessons from Google Glass; discretion, transparency, and clear cues are built in.
Will they replace your phone?
Not likely—at least not soon. In testing, overlapping voice cues led to awkward stops and starts (talking before Gemini was listening, interrupting responses). Glasses excel at quick, contextual tasks but don’t yet match the broad, private utility of phones.
Models, compatibility, and partners
- Two initial versions: one with a display, one audio‑only.
- iPhone compatibility alongside Android support.
- Fashion partners: Warby Parker and Gentle Monster; pricing and launch dates still unannounced.
- A more advanced dual‑screen model is in development for richer graphics.
Ecosystem strategy
- Android XR is a platform for other makers, not just Google hardware.
- Early partners include Samsung and Xreal, signaling a broader wave of AI glasses and headsets.
Competitive landscape
- Meta’s Ray‑Ban glasses have seen strong demand, showing appetite for stylish, useful wearables.
- The challenge remains turning prototypes into everyday products without the pitfalls that sank earlier smart glasses and many VR headsets.
Bottom line
Google’s Gemini‑powered glasses feel like a meaningful step toward ambient, glanceable computing. They outperform phones at certain tasks—navigation, translations, and object queries—while highlighting ongoing hurdles around social cues, privacy, and polish. If Google nails acceptance and utility, smart glasses could become the next mainstream interface layered over the world around you.
Source: https://edition.cnn.com/2025/12/08/tech/google-glasses-ai-gemini