At TED2025 in Vancouver, Google revealed Android XR live for the first time: its new platform for extended reality, shown running on a pair of prototype smart glasses. Leading the presentation on stage was Shahram Izadi, head of Google's AR/XR division, flanked by Nishtha Bhatia in a demo that left the audience speechless.

Android XR: the Android ecosystem enters the world of extended reality

Android XR is the next major evolution of the Android operating system, designed to take the digital experience beyond the smartphone screen. This time Google is betting everything on wearable devices such as smart glasses and augmented and virtual reality headsets: devices that let you see digital information overlaid on the real world, directly in front of your eyes.

Just as Android has adapted over time to new devices, from phones to smartwatches with Wear OS, to TVs, to cars with Android Auto, it is now ready to leap into the world of extended reality (XR). This means being able to interact with apps, content and services in an even more direct, simple and natural way, without even having to take the phone out of your pocket.

The real novelty is the integration with Gemini, Google's new-generation AI assistant. Thanks to this artificial intelligence, Android XR becomes an even smarter and more proactive system, capable of interpreting what the wearer sees, listening to voice requests and providing answers in real time. During the demonstrations, the system showed it could translate signs instantly, recognize objects in the environment, locate specific items in a room and provide useful contextual information during everyday activities such as getting around, studying or working.

In a nutshell, Android XR was created to bring technology even closer to us: always just a glance away and always ready to help, with no need to touch a screen. It is a taste of what our future could be, a world in which the digital blends seamlessly with reality.

During the demo, Android XR showed off a series of surprising features. The glasses' lenses, which also support prescription correction, displayed the speaker's notes in real time. But the real protagonist was Gemini, Google's assistant, natively integrated into the platform. Among the capabilities shown:

  • Instant creation of content on request;
  • Advanced visual recognition;
  • Simultaneous translation of signs into different languages, switching automatically to match the conversation;
  • Object recognition;
  • Immersive navigation with 3D directions and real-time maps.

Many of these functions derive from Project Astra, Google's multimodal initiative that allows the device to "see", understand and interact with the surrounding world contextually.

Android XR: a concrete vision of a new way of experiencing technology

What appeared on the TED2025 stage was not just a technical demonstration, but a real preview of how our daily relationship with technology could change. With Android XR, the digital no longer stays locked inside a screen: it comes out, overlays the real world and becomes part of everyday life.

Artificial intelligence, through Gemini, no longer limits itself to responding to commands: it observes, understands the context and acts immediately. It provides information, translates, recognizes objects and accompanies you through daily activities without your needing to touch anything.

The full video also includes a demonstration with an XR headset based on Samsung's Project Moohan, expected by the end of the year. Here Android XR shows its full potential in virtual environments as well. Everything suggests that Android XR is not just a technological novelty, but the first concrete step toward a new way of experiencing technology.

https://www.youtube.com/watch?v=gelclxpg4j0