What you need to know
- Google announced that it has opened the “trusted tester” waitlist for Project Astra, its AR venture, to interested participants.
- The company detailed that users can engage with Project Astra via mobile (Android) and “prototype glasses.”
- It seems there will be an experimental Astra app that lets users open their camera or share their screen for quick information on what they see.
Google’s “vision for the future of AR” entered an exciting new phase of development today (Dec 11).
On its DeepMind page for Project Astra, Google announced that the “trusted tester” waitlist for its latest AR venture is now open. The company states that those signing up today (Dec 11) can explore the “future capabilities” of its “universal AI assistant.” Signing up requires users to fill out a brief form with their name, location, age, and Android phone model.
Additionally, Google will ask if you are interested in “using prototype glasses.”
On the phone side, it seems accepted participants in this early test will download a Project Astra Android app. A short demo video shows that users can open the app and activate the camera within it to point at something they’re interested in. Google states Astra is “multilingual,” meaning users can speak to the AI model in their preferred language and receive information back.
The demo shows a user asking about a string of Chinese lanterns (in French). Astra responds both aloud and in written text with (hopefully) relevant and useful information about the query.
Moreover, Google adds that users can “share their screen” with Astra to gain even more assistance.
The AI model is said to leverage Google apps like Maps, Lens, and Search to fulfill a user’s query. This is reportedly a part of its “enhanced understanding.” A subsequent post shows that Astra is capable of “contextual understanding” for “clear, detailed answers, and it can explain the thinking and context behind” its answers. Moreover, Google has added a memory bank to Astra, meaning the AI should remember key details from your previous encounters.
This seems similar to “Saved Info” in Gemini, which Google rolled out in mid-November.
The other side of this sign-up is Google’s mention of prototype AR glasses. Its announcement splash page only highlights that Astra can “integrate seamlessly with prototype glasses to see the world as you see it – for an even more immersive and helpful experience.”
We’ve been expecting AR glasses from Google ever since I/O 2024, when it showcased Project Astra. The demo showed how interactive and engaging Project Astra is (likely) designed to be. When you wear a pair of AR glasses, the device will “record” your surroundings in real time, so when you ask a question, the AI can answer it.
It’s worth mentioning that Project Astra is an AI routine built on Gemini to give it that “real person-like” feel during conversations. Google also showed that Astra can remember the location of objects, like where you last set down your AR glasses.