The release of Meta’s latest smart glasses at Meta Connect 2025 happened as these things always do: lots of hype, some cool demonstrations (and a few major glitches), and the unmistakable sense that our future just shifted a few degrees closer to immersive experiences. CEO Mark Zuckerberg introduced three new smart eyewear products, and collectively they represent the most significant advancement in wearable technology since the smartphone.
Those of us who started working in AR/VR at the beginning of the consumer VR revolution over a decade ago have been waiting for this. And Meta delivered much more than incremental upgrades. We still haven’t reached the Promised Land of wearable technology, and none of the new devices are true AR glasses, as they only offer projection overlays. But Meta Connect 2025 gave us a glimpse of how we will interact with the digital world in the future.
So let’s unpack the latest developments and see if Meta’s eye-watering $100 billion in AR and VR investments over the past decade are ready to pay off.
The Ray-Ban Evolution: Gen 2
The big news is the release of Meta’s second-generation glasses in partnership with Ray-Ban. Meta is positioning these as AI glasses, since they don’t incorporate augmented reality.
But they deliver on what the first generation promised and couldn’t fully execute. Eight hours of battery life during what Meta calls “typical use” transforms the glasses from novelty to utility. The accompanying charging case provides an additional 48 hours, effectively eliminating the battery anxiety that contributed to the demise of Google Glass and more recent efforts to create smart eyewear.
The 12-megapixel camera now captures 3K Ultra HD video at 60fps with HDR support, while 32GB of storage ensures you won’t run out of space during extended use. The quality of the video is good, though no one is going to mistake it for a video shot on a smartphone. The real advancement lies in what Meta is calling “contextual awareness.” This fall’s software updates will enable hyperlapse and slow-motion capture, turning everyday moments into cinema-quality memories. As we know from the adoption of smartphones, camera quality can be a make-or-break feature.
At $379, Meta’s Gen 2 glasses strike a balance between affordability and professional-grade functionality. This pricing strategy suggests Meta understands that widespread consumer adoption is critical.
Athletics Meet Augmented Reality: Oakley Meta Vanguard
A highlight of Meta Connect 2025 is the release of the Oakley Meta Vanguard. Rather than creating a single device for everyone, Meta has designed a new model of smart glasses specifically for athletes, adventure seekers, and professionals who work in demanding environments.
Key to the new wrap-around design is the positioning of the 12-megapixel camera at the frame’s center, accommodating helmets and headwear that would otherwise obstruct traditional placement. The 122-degree wide-angle lens captures more of the action, while adjustable video stabilization ensures usability during high-movement activities.
The engineering details matter here: the battery operates effectively across wider temperature ranges, the speakers deliver higher volume for noisy environments, and integrations with Strava and Garmin create seamless workflow connections for serious athletes.
At $499 (available for preorder with October delivery), the Vanguard commands a premium, but one that we suspect cyclists, runners, skiers, and others will be willing to pay for. This isn’t mass-market technology; it’s professional equipment disguised as stylish eyewear.
The AR Breakthrough: Ray-Ban Display
The Meta Ray-Ban Display represents the company’s first close-to-augmented-reality offering, and the implications extend far beyond the product itself.
The translucent heads-up display projects texts, AI responses, navigation, and video calls directly onto your field of vision. The accompanying EMG wristband enables interface interaction and text input without touching any visible controls. Despite a spectacular demo failure (the video calling failed during the live event), Zuckerberg successfully demonstrated music playback, real-time subtitles, and image capture/review.
Here is an eleven-minute video walking through the features and reviewing the Display’s capabilities.
The real-time subtitle feature deserves particular attention. For individuals with hearing impairments, this technology could provide unprecedented accessibility in daily interactions. The translation feature has tremendous potential (we’re coming for you, Star Trek universal translator), though the slight delay makes it much more awkward than using Google Translate on your phone.
The distribution strategy reveals Meta’s understanding of the complexity in selling glasses that are even close to AR glasses. The Display will only be available through brick-and-mortar stores (Best Buy, LensCrafters, Ray-Ban, Verizon), where customers can be properly fitted for the wristband component. AR demands hardware integration and customized fitting – you’re not going to be purchasing the Display online and having it shipped to your home or office.
The glasses will launch on September 30 in the US, with Canada, France, Italy, and the UK following in early 2026. At $799 a pair, the Display commands smartphone-style pricing. You won’t see these replacing our ubiquitous phones anytime soon, but the Display offered Meta’s most striking glimpse of what the future holds for wearables.
Meta Connect 2025: The Horizon TV Pivot
Meta’s announcement of Horizon TV for Quest headsets initially seems like a minor addition. Easy access to Disney+, Prime Video, and other streaming services in virtual reality appears straightforward. But the move signals that Meta is positioning VR as an entertainment hub rather than primarily a gaming platform.
This strategy acknowledges that mass adoption requires compelling use cases beyond gaming and professional applications. Virtual cinema experiences could provide immersive entertainment that justifies headset ownership for mainstream consumers.
Implications for the Extended Reality Landscape
These product launches at Meta Connect 2025 collectively signal several essential shifts in the XR industry:
- Integration Over Isolation: Rather than creating separate digital worlds, Meta is embedding digital functionality into physical experiences. The AR display doesn’t transport you elsewhere; it enhances where you already are.
- Specialization Strategy: The Vanguard demonstrates that successful AR/VR products may require market segmentation rather than one-size-fits-all approaches. We all use essentially the same phones, but with smart eyewear, different users will have fundamentally different needs. Serious runners don’t run in their work or leisure clothes, and they’ll likely gravitate toward specialized AR devices.
- Accessibility as Innovation: The real-time subtitle functionality illustrates how advanced technology can address genuine human needs rather than creating solutions searching for problems.
- Retail Investments: The requirement for physical store fittings suggests that sophisticated AR devices will require sophisticated support systems. Meta’s not about to match Apple’s 500-plus retail stores around the world, and sales are limited enough that they can work through partners like LensCrafters and Ray-Ban. But we wouldn’t be surprised to see Meta stores in the future.
Profound Ethical Implications
Smart eyewear raises fundamental questions about privacy, consent, and social interaction that Meta Connect 2025 hardly addressed. When someone wearing Ray-Ban Meta glasses can record video without an obvious indication, how do we maintain reasonable expectations of privacy in public or semi-public spaces? Remember the sharp reactions to Google Glass with signs outside of bars and venues banning them? And with Glass, you at least knew if someone was recording.
The new EMG wristband for the Display – in development for years – introduces even more complications. Generally, the digital devices we carry with us already know our location. But they don’t have access to our physiological data (beyond basics such as heart rate, blood oxygen levels, and respiratory rate that an Apple Watch can track). If a wrist device can interpret nerve signals for text input, what other biological data might it access? Will advertisers get access to this as we walk down a street with our AR glasses? How do we ensure user agency and privacy when technology becomes this intimate?
These aren’t merely technical challenges. They are questions about the kind of society we’re building and whether we’re doing it consciously. Of course, given that many of us are already sharing every detail of our lives with AI platforms like GPT-5, Perplexity, Claude, and Gemini, this may not be a major issue for the general public. We won’t see everyone wearing Meta’s AR glasses this year, but after watching Meta Connect 2025, you get the sense that the future is almost at our doorstep. And yet there is barely any discussion of the ethical issues we need to address.
It remains to be seen if we will develop the social, legal, and ethical frameworks necessary to manage this transformation responsibly.
More Challenges for Education
How will education integrate Meta’s new AR devices into its programs and institutions? We’re already deeply challenged by the arrival and widespread use of Generative AI, and now we see another tech revolution at our doorstep.
Consider, for just a moment, the use of smartphones in K-12 schools. Many school systems are now moving to ban phones, and a 2025 National Center for Education Statistics (NCES) report found that 77% of all public schools had policies prohibiting cell phone use in class, with the rest allowing it or leaving the decision to individual teachers. What will we do when students start showing up with glasses that include photo and video recording and AR capabilities? You won’t be able to simply ban them if they are also prescription devices.
Looking Forward from Meta Connect 2025
Meta Connect 2025 marks an inflection point. The technology demonstrated this week moves us from speculative fiction toward practical implementation – much more so than last year’s Meta Connect. The question isn’t whether AR/VR will reshape daily life, but how quickly the new devices will be adopted.
The real test won’t be seen in Meta’s demos. It will be in coffee shops, classrooms, and conference rooms where these devices either enhance human interaction or create new barriers between us.
The future, as always, remains unwritten. But the tools for writing it have just become significantly more sophisticated.
Emory Craig is a writer, speaker, and consultant specializing in virtual reality (VR) and generative AI. With a rich background in art, new media, and higher education, he is a sought-after speaker at international conferences. Emory shares unique insights on innovation and collaborates with universities, nonprofits, businesses, and international organizations to develop transformative initiatives in XR, GenAI, and digital ethics. Passionate about harnessing the potential of cutting-edge technologies, he explores the ethical ramifications of blending the real with the virtual, sparking meaningful conversations about the future of human experience in an increasingly interconnected world.