AI And The Future Of AR Glasses

A tech revolution changes the world in unpredictable ways.

Our first exposure to AI has been chatbots. Very clever chatbots! They’re getting better all the time at conversing as if they were human. They rewrite emails, summarize documents, and write computer code. Those are impressive tricks, but I get it: you’re not convinced yet that AI deserves all the hype.

Want an example of how AI might power a technical breakthrough?

AI may bring augmented reality to the world.

I recently read an interview with Caitlin Kalinowski, head of AR Glasses Hardware at Meta. If she’s right – and she’s very smart and very experienced – then AI may play a key role in jumpstarting development of augmented reality glasses.

Background

Let’s tick off some of the points that I’ve been making for years.

Virtual reality devices that cover the eyes like Apple Vision Pro and Meta Quest 3 are niche devices. They will never be used by most people and they’re not designed to be. They provide a nice income stream, but their more important purpose is to familiarize people with VR/AR and to partially justify the expense of the technical work required for AR glasses, which are the real goal.

True AR glasses will resemble normal glasses. We will see the real world, just like regular glasses, but with 3D overlays that interact seamlessly with the real world. That will open up a new world of educational, entertainment, and professional applications – a transcript or translation of a conversation, a flag over a business displaying info or coupons, an arrow pointing in the direction of your friend, a game character to interact with, or any of a thousand other things.

When AR glasses are perfected and people understand the possibilities, they will be widely adopted. AR devices will be as impactful as the iPhone was 17 years ago. Don’t believe me? Tim Cook has been saying this for a long time, and he’s a smart guy, right?

Miniaturizing everything necessary for AR to fit into lightweight eyeglass frames is the most difficult engineering challenge we have ever faced. AR glasses require high-resolution displays; sensors and cameras for location and object recognition; processing power; network connectivity; speakers; and batteries to power it all. After spending billions of dollars, Apple Vision Pro represents the state of the art in 2024 – a pound and a half of electronics on your head, plus a separate battery pack in a pocket.

The timing for AR glasses is uncertain because they require unpredictable technical breakthroughs.

The effect of AI on AR development

Unexpected breakthroughs using AI have already helped Meta’s AR glasses team design smaller cameras and extend battery life.

Android Central got an exclusive interview with Caitlin Kalinowski, who has two decades of product design experience at Apple and Meta. She has worked on every Oculus VR hardware product, including the Oculus Rift, Oculus Go, and Oculus Quest headsets.

Now she’s the hardware leader of Project Nazare, Meta’s AR glasses, which will give a clear view of the real world along with digital images. Kalinowski explains that “customers are going to be able to see both the original photons of the real world in addition to what overlay you effectively want to have.”

The team has been galvanized by the rapid advances in AI. She calls out two specific ways that AI is helping solve the difficult hardware problems. From the Android Central interview:

“The excitement in Kalinowski’s voice is unmistakable. The company has clearly made a breakthrough in several important areas and is, undoubtedly, closer to fruition than possibly any company before it has been.

“Some of the major breakthroughs have been because of the recent advancements in AI, but it’s not just the generative AI like ChatGPT that you might think of when you hear the term.

“The company has been able to shrink its camera sensors because of AI, which can denoise imagery in real-time. That frees up more space for things like processors and batteries.

“Additionally, Kalinowski said that AI can now be used to further enhance SLAM — short for Simultaneous Localization and Mapping. AI can be used to compress important data so that it can be processed more quickly, lowering the power requirements of the glasses.” (emphasis added)
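To make the camera point concrete, here is a minimal sketch of the idea behind learned denoising: a small neural network is trained to predict the sensor noise in a frame and subtract it, so a smaller, noisier sensor can still deliver a clean image. This is an illustration in PyTorch, not Meta’s actual pipeline; the TinyDenoiser model and its dimensions are hypothetical.

```python
# Hypothetical sketch: a learned denoiser cleaning up frames from a small,
# noisy camera sensor. Architecture and sizes are illustrative only.
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """A few stacked convolutions that learn to predict the noise in an
    image; subtracting that prediction yields a cleaner frame (the
    residual-learning trick used by denoisers such as DnCNN)."""
    def __init__(self, channels: int = 3, width: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, channels, 3, padding=1),
        )

    def forward(self, noisy: torch.Tensor) -> torch.Tensor:
        return noisy - self.net(noisy)  # subtract the predicted noise

# Demo: a clean frame plus simulated sensor noise (batch=1, RGB, 64x64).
frame = torch.rand(1, 3, 64, 64)
noisy = frame + 0.1 * torch.randn_like(frame)
clean = TinyDenoiser()(noisy)  # untrained here; in practice, trained on noisy/clean pairs
```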
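And for the SLAM point, a minimal sketch of learned compression, assuming the common SLAM design in which a front end extracts feature descriptors from each camera frame: a small autoencoder squeezes those descriptors into compact codes, so less data has to be moved and processed downstream. DescriptorCodec and every dimension here are made-up illustrations, not Meta’s design.

```python
# Hypothetical sketch of AI-assisted compression for SLAM: a learned
# encoder shrinks high-dimensional feature descriptors before the mapping
# backend touches them, reducing the data (and power) the pipeline needs.
import torch
import torch.nn as nn

class DescriptorCodec(nn.Module):
    """Autoencoder that compresses 256-d visual feature descriptors
    (the kind a SLAM front end extracts per frame) into 32-d codes
    and reconstructs approximations on the other side."""
    def __init__(self, dim: int = 256, code: int = 32):
        super().__init__()
        self.encode = nn.Linear(dim, code)
        self.decode = nn.Linear(code, dim)

    def forward(self, descriptors: torch.Tensor) -> torch.Tensor:
        return self.decode(self.encode(descriptors))

codec = DescriptorCodec()
descriptors = torch.randn(500, 256)   # e.g., 500 keypoints in one frame
codes = codec.encode(descriptors)     # 8x less data to store and move
recovered = codec.decode(codes)       # approximate descriptors for matching
```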

And there’s more to come.

“Maybe more excitingly, she admitted that ‘we don’t even know what they will all be yet’ when referring to all the ways AI can be used to virtually increase the processing power and capabilities of low-power devices like AR glasses.

“‘We have looked at what’s happening in AI and intentionally looked at our roadmap and made changes to take advantage of this AI revolution.’ In other words, how we thought AR glasses would work is completely different now that modern AI systems exist. ‘The exciting thing is AI is changing every week,’ she adds.”

I’m focused here on how AI is unexpectedly affecting the hardware development of AR glasses because that is such a huge stumbling block.

But that’s only the beginning of AI’s effect on AR. AI will also be the backbone of AR applications, enabling real-time analysis, interpretation, and interaction with the environment. AI will power object recognition, identifying and tracking objects in the real world; tracking of your movements and gestures; and personalization, analyzing data about you to tailor your experiences.
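As one illustration of that application side, here is a sketch that uses an off-the-shelf detector as a stand-in for the object-recognition layer such glasses would run on each camera frame. The model choice (torchvision’s Faster R-CNN) and the random frame are placeholders, not anything Meta has announced.

```python
# Hypothetical sketch: off-the-shelf object detection standing in for the
# recognition layer of AR glasses. Not the model any real glasses use.
import torch
from torchvision.models.detection import (
    FasterRCNN_ResNet50_FPN_Weights,
    fasterrcnn_resnet50_fpn,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

frame = torch.rand(3, 480, 640)  # stand-in for one RGB camera frame
with torch.no_grad():
    detections = model([frame])[0]  # dict of boxes, labels, scores

# Each confident (label, box) pair is an anchor an AR overlay could attach to,
# e.g., a flag over a business or an arrow pointing at a friend.
categories = weights.meta["categories"]
for label, box, score in zip(
    detections["labels"], detections["boxes"], detections["scores"]
):
    if score > 0.8:
        print(categories[int(label)], [round(v) for v in box.tolist()])
```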

The AI breakthroughs may be part of the reason that Mark Zuckerberg permitted himself to be photographed surrounded by AR glasses prototypes. Meta may be confident enough to preview its glasses this year and deliver them within 2-3 years.

Like the song says, the future is so bright that we may need to wear shades.