Forward-looking: The race to define the future of wearable technology is heating up, with smart glasses emerging as the next major frontier. While Meta's Ray-Ban collaboration has already made waves, tech giants like Apple, Samsung, and Google are fast-tracking their own projects. The latest development comes from Google, which recently gave the public its most substantial look yet at Android XR-powered smart glasses during a live demonstration at the TED2025 conference.
Until now, Google's Android XR glasses had appeared only in teaser videos and hands-on previews shared with select publications. Those early glimpses hinted at the potential of integrating artificial intelligence into everyday eyewear but left open questions about real-world performance. That changed when Google's Android XR lead, Shahram Izadi, took the TED stage – joined by Nishtha Bhatia – to show the prototype glasses in action.
The live demo showcased several features that set these glasses apart from previous smart eyewear efforts. At first glance, the device resembles an ordinary pair of glasses. However, it is packed with advanced technology, including a miniaturized camera, microphone, speakers, and a high-resolution color display embedded directly in the lens.
The glasses are designed to be lightweight and discreet, with support for prescription lenses. They can also connect to a smartphone to leverage its processing power and access a wide range of apps.
Izadi kicked off the demo by using the glasses to display his speaker notes on stage, illustrating a practical, everyday use case. The real highlight, however, was the integration of Google's Gemini AI assistant. In a series of live interactions, Bhatia showed how Gemini could generate a haiku on demand, recall the title of a book glimpsed only moments earlier, and locate a misplaced hotel key card – all through simple voice commands and real-time visual processing.
But the glasses' capabilities extend well beyond these parlor tricks. The demo also included on-the-fly translation: a sign was translated from English into Persian, then seamlessly switched to Hindi when Bhatia addressed Gemini in that language – without any manual settings change.
Other features demonstrated included visual explanations of diagrams, contextual object recognition – such as identifying a music album and offering to play a song from it – and heads-up navigation with a 3D map overlay projected directly into the wearer's field of view.
Unveiled last December, the Android XR platform – developed in collaboration with Samsung and Qualcomm – is designed as an open, unified operating system for extended reality devices. It brings familiar Google apps into immersive environments: YouTube and Google TV on a virtual big screen, Google Photos in 3D, immersive Google Maps, and Chrome with multiple floating windows. Users can interact with apps through hand gestures, voice commands, and visual cues. The platform is also compatible with existing Android apps, ensuring a robust ecosystem from the start.
Meanwhile, Samsung is preparing to launch its own smart glasses, codenamed Haean, later this year. The Haean glasses are reportedly designed for comfort and subtlety, resembling regular sunglasses while incorporating gesture-based controls via cameras and sensors.
While the final specifications are still being settled, the glasses are expected to feature integrated cameras, a lightweight frame, and possibly Qualcomm's Snapdragon XR2 Plus Gen 2 chip. Additional features under consideration include video recording, music playback, and voice calling.