These Sleek, AI-Powered Smart Glasses Look Ready To Blow Away The Competition
The sleek glasses above are called Halo, and they're AI smart glasses with a built-in projector. That combination is the general idea behind a wide variety of products that are already available in stores (like Meta's new Oakleys) or coming soon (Google and Samsung's Android XR glasses). Not all of these products offer both AR and AI capabilities, however.
But what would you say if you heard that the Halo AI glasses come with AI capabilities unseen in existing and upcoming models? The glasses run a full open-source AI stack that supports a private, multimodal voice-based assistant called Noa; a "Narrative" memory system that lets the AI remember everything about you via a mathematical representation that remains private at all times; and a voice-based coding feature called "Vibe Mode" that lets you create the custom apps you need on the fly.
Add a 14-hour battery, on-device processing, and a $299 price tag, and the Halo AI glasses sound almost too good to be true. They promise technology we expect to see from all the big tech players in the coming years. Yes, advanced AI smart glasses look like the future, but Apple can't offer anything similar right now, Google's AI smart glasses worked at I/O 2025 but not without performance hiccups, and Meta's AI glasses can't match Halo's privacy claims. What Halo proposes is AI tech that might need a few more years to mature.
Specifications for the Halo AI glasses
Halo comes from Brilliant Labs, a tech startup founded by a former Apple employee. The company launched the Frame smart glasses in February 2024. Back then, I thought they were ahead of their time as well, proposing AR and personal AI experiences from a product that looked like a regular pair of glasses. Halo is an evolution of that, featuring a "Halo display" module (a 0.2-inch microOLED screen) that beams color content into the user's retina. The glasses themselves weigh just 40 grams and can be customized with prescription lenses. The frame features the B1 chip from Alif Semiconductor, which Brilliant Labs describes as a "cutting-edge, ultra-efficient AI/ML microcontroller for on-device AI with a dedicated Neural Processing Unit (NPU)."
An AI-optimized optical sensor captures content at low power. The frames also include battery modules that let you use Halo for up to 14 hours, and bone-conduction speakers let you hear anything the Noa AI assistant says without headphones.
Running the show is the open-source Zephyr real-time operating system, with a Lua scripting layer sitting directly on top of it. That's the setup that will let you create Halo apps via voice with AI. It's unclear whether the glasses need a connection to your phone; the company's GitHub shows a Noa for iPhone app that pairs with other Brilliant Labs glasses to access ChatGPT.
Major AI software innovations
Brilliant Labs says the Halo glasses come with strong privacy protections. "In the interest of its open-source values and safeguarding user privacy, all rich media required for Noa, including visual and auditory inputs captured by Halo, are immediately converted into an irreversible mathematical representation," the company said in a press release. "No rich media is stored. Furthermore, no third party can view customer data, ensuring that your personal experiences remain yours, and yours alone."
That's important for the Narrative "patent-pending agentic memory system," which the company describes as a "breakthrough innovation in multimodal reasoning and long-term recall." Narrative uses the camera, display, and AI microphone to "remember and reason over your daily first-person POV experiences." That means Halo will capture audio and video of everything you do, so Noa can analyze the context and then build that personalized knowledge base. It's unclear where the data is held, and the company's privacy policy makes no mention of Narrative. The document hasn't been updated since February 2024.
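Brilliant Labs doesn't spell out what that "irreversible mathematical representation" actually is, but the description reads like an embedding: a fixed-length vector of numbers that Noa can search and reason over without keeping the original audio or video around. The sketch below is purely illustrative and is not the company's implementation; it assumes an off-the-shelf text-embedding model as a stand-in for whatever Halo runs on-device, and the function names are hypothetical.

```python
# Illustrative only: Brilliant Labs hasn't published how Halo's "irreversible
# mathematical representation" works. This sketch assumes it behaves like an
# embedding: a vector you can search over, but can't turn back into raw media.
# The model, storage, and function names here are all hypothetical stand-ins.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in for an on-device encoder
memory_store = []  # a "Narrative"-style memory: vectors plus timestamps, no raw media

def remember(moment_description: str, timestamp: str) -> None:
    """Convert a captured moment into a vector; the description itself is not kept."""
    vector = model.encode(moment_description, normalize_embeddings=True)
    memory_store.append({"timestamp": timestamp, "vector": vector})

def recall(query: str, top_k: int = 3) -> list:
    """Return the stored moments whose vectors best match a spoken question."""
    q = model.encode(query, normalize_embeddings=True)
    ranked = sorted(memory_store, key=lambda m: float(np.dot(q, m["vector"])), reverse=True)
    return [{"timestamp": m["timestamp"]} for m in ranked[:top_k]]

remember("Parked the rental car on level 3 of the airport garage", "2025-08-04 09:12")
remember("The waiter recommended the mushroom risotto", "2025-08-04 13:40")
print(recall("Where did I leave the car?", top_k=1))
```

In this toy version, only vectors and timestamps survive, which is the gist of the company's claim; the real system would presumably pair those vectors with whatever metadata Noa is allowed to keep.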
The vibe-coding abilities of Noa are equally impressive, at least on paper. The glasses should use the AI to create any app on the fly: "With Vibe Mode, Noa democratizes app creation by enabling any user, even with no coding experience, to not only create but share new AI applications with users around the world as fast as they can imagine them. Users can even remix existing generated apps and build on the functionality developed collectively across the community, creating a multiplier of creativity never before seen in the smart glasses industry." Vibe Mode will be available via the Brilliant Labs website and the Noa app, which suggests the Halo glasses will need to connect to a phone for at least some features.
Halo release date and price
Whether or not you'll use the AI experiences above, there is one Halo AI feature you'd likely use every day: the Noa assistant. Brilliant Labs describes it as a true conversational agent. "Noa can understand what it hears and sees within its environment and responds with contextually relevant information in real-time." Also important is another privacy claim — that Noa works like a VPN between the user and the AI model, doing all of the heavy lifting to keep your data private by default.
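Brilliant Labs hasn't explained how that VPN-like layer works either. One plausible reading, and it is only a guess, is a relay that strips identifying details from a request on-device and forwards only a scrubbed, pseudonymous prompt to whatever cloud model does the answering. The sketch below is an assumption for illustration, not Noa's actual code, and every name in it is hypothetical.

```python
# Hypothetical illustration of a "VPN-like" privacy layer between the user and a
# cloud AI model. Not Brilliant Labs' implementation: it only shows the idea of
# scrubbing identifiers on-device and sending a pseudonymous request upstream.
import re
import uuid

SESSION_ID = str(uuid.uuid4())  # rotating pseudonym instead of a real account ID

def scrub(prompt: str) -> str:
    """Strip obvious identifiers before anything leaves the device."""
    prompt = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[email]", prompt)  # email addresses
    prompt = re.sub(r"\+?\d[\d\s().-]{7,}\d", "[phone]", prompt)    # phone numbers
    return prompt

def forward_to_model(prompt: str) -> dict:
    """Build the payload a privacy relay might send to a hosted AI model."""
    # A real relay would make an encrypted network call here; this just shows
    # that only the session pseudonym and the scrubbed prompt go out.
    return {"session": SESSION_ID, "prompt": scrub(prompt)}

print(forward_to_model("Text jane.doe@example.com and call +1 555 010 9999 about dinner"))
```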
As a user of AI chatbots, I believe the future will bring us personalized experiences similar to what Brilliant Labs teases for the Halo glasses. I'm not convinced the glasses are as good as advertised, but I haven't actually used them. I also want more details about the AI models Halo uses and what happens to my data, whether we're talking about the Narrative capability, vibe-coded apps, or conversations. I'm not ready to trust the AI with seeing and recording everything I do. And again, the privacy policy hasn't been updated in over a year.
The good news is that the Halo AI smart glasses are open-source. Others will be able to see whether Brilliant Labs' claims are accurate. To get started, you'll have to preorder the Halo AI glasses for $299 from the company's website. The glasses will start shipping to buyers in November.