Today, we will hear the latest news from the Meta Connect 2024 developer conference. Previous years may have focused on the metaverse (remember that concept?), but this year will go far beyond virtual reality headsets and dreams of an all-encompassing metaverse. The two-day event will certainly have news for those working in virtual reality, but expect to hear more about Meta’s growing investments in both AI and augmented reality (AR). We’ve long argued that immersive tech and AI would need to converge to realize the full promise of XR technologies. And with Meta investing heavily in both, the company is well positioned to bring that convergence about.
Meta’s CEO, Mark Zuckerberg, will also highlight the latest hardware in the Quest lineup and, most likely, new developments in the company’s AR glasses.
How to Watch Meta Connect 2024
As in prior years, the Meta Connect keynote will stream live on the official Meta Connect website. You can also watch it in Meta’s Horizon Worlds if you are a VR headset user, though, in our experience, a flat screen is best. If you want to dig into the details, the developer workshops will be on the Facebook developer site.
Here are the start times around the world for the opening keynote by Meta CEO Mark Zuckerberg, which will begin shortly after the online stream goes live at 10:00 am PT.
- US West Coast: 10:00 PDT
- US East Coast: 13:00 EDT
- UK: 18:00 BST
- Central Europe: 19:00 CEST
- India (New Delhi): 22:30 IST
- China (Beijing): 01:00 CST (26 September)
- Japan (Tokyo): 02:00 JST (26 September)
- South Korea (Seoul): 02:00 KST (26 September)
- Australia (Sydney): 03:00 AEST (26 September)
A Quick Update on Meta’s Announcements
As expected, we saw a new, more affordable headset, smarter Ray-Ban glasses, lots of news on AI, and a prototype of Meta’s forthcoming holographic AR glasses. Here’s a quick recap; we’ll explore the developments in depth later on.
The Meta Quest 3S – A More Affordable VR Headset
CEO Mark Zuckerberg began with a reveal of the Quest 3S, a new VR headset starting at $299.99, with preorders available immediately and an official release on October 15. A higher-storage version will also be offered for $399.99, positioning it significantly below the $499.99 Quest 3 and far below Apple’s $3,500 Vision Pro headset. Despite the lower price, the Quest 3S offers the same core capabilities and app compatibility as the Quest 3; the key differences lie in storage capacity and display optics. It will be a compelling option for VR users and organizations seeking a more affordable entry point.
Meta Ray-Bans – Smarter Smart Glasses
Meta also introduced new features for its Ray-Ban smart glasses, including a remarkable demo of live translation (travel will never be the same), the ability to control apps like Spotify directly through the glasses, and a very useful option to remember details like phone numbers or parking spots. One gets the sense that our GPS-based apps will recede into the background once this feature catches on. There will also be a limited-edition transparent design, with Zuckerberg pointing out the irony of first trying to conceal the tech and now offering a version of the device that shows it off.
Orion – Advanced AR Glasses
A key highlight of Meta Connect 2024 was the unveiling of Orion, prototype AR glasses with a holographic interface. Described by Zuckerberg as “the most advanced glasses the world has ever seen,” Orion overlays holograms directly onto the real world without relying on pass-through video, which reduces latency. The glasses use voice, eye tracking, and hand tracking as the main controls. Strikingly, Meta will also ship its wrist-based neural interface (the result of a nearly ten-year-long project) with the device, letting the glasses respond to subtle finger movements to display messages, take video calls, and more. Orion is still in development and won’t be available to consumers, though Zuckerberg hinted at limited developer access.
Meta AI Will Be Everywhere
As we predicted, the convergence of AI and XR tech is accelerating, especially with AI’s integration into the Ray-Ban smart glasses. Zuckerberg showcased new AI features for the smart glasses, including live translation and Meta’s AI assistant with celebrity-voiced options like John Cena, Judi Dench, Awkwafina, Keegan-Michael Key, and Kristen Bell. Enhanced AI photo editing and automatic language translation in Reels (with lip-syncing) were also highlighted. Additionally, Meta unveiled its latest AI model, Llama 3.2, enabling developers to create new AI tools.
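For developers curious about what building with Llama 3.2 might look like, here is a minimal sketch that runs one of the lightweight models locally with the Hugging Face transformers library. The model ID, prompt, and generation settings are illustrative assumptions rather than anything Meta showed on stage, and you will need to accept Meta’s license on Hugging Face before the weights can be downloaded.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed model ID for illustration; check the meta-llama organization on
# Hugging Face for the exact Llama 3.2 checkpoints and license terms.
model_id = "meta-llama/Llama-3.2-1B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # keeps memory use modest on consumer hardware
    device_map="auto",           # requires the accelerate package
)

# Build a chat-style prompt using the tokenizer's built-in chat template.
messages = [
    {"role": "system", "content": "You are a concise assistant for a travel app."},
    {"role": "user", "content": "Translate 'Where is the nearest train station?' into Spanish."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a short response and print only the newly generated tokens.
output = model.generate(input_ids, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Swapping in a larger instruction-tuned checkpoint follows the same pattern; only the model ID and the hardware requirements change.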
The head-on competition with OpenAI was evident as Zuckerberg highlighted the popularity of Meta’s AI assistant, now at 500 million active users. He expects it to become the most-used AI assistant by the end of 2024. Given that it is free and that OpenAI is limiting its advanced features to Plus subscribers, we’re not about to challenge his prediction.
AI Developments and Their Role in Immersive Technologies
Along with Google, Microsoft, and others, Meta has significantly ramped up its AI capabilities over the past year, integrating generative AI tools across its platforms, including Instagram and Facebook. These advancements, including chatbot assistants and the Llama family of large language models, are poised to influence not just digital advertising but the way AI interacts with human creativity and knowledge-sharing. However, the real interest at Meta Connect 2024 will be in how AI is being integrated with Meta’s AR and VR hardware.
Augmented Reality: A New Layer for Everyday Life
Though much of the early hype around Meta focused on fully immersive VR, we expect this year’s Connect conference to place greater emphasis on augmented reality, specifically Meta’s Ray-Ban smart glasses. AR devices are finally approaching a form factor that users will be ready to embrace, though it may be another year before Meta fully resolves the hardware challenges. Generative AI will play a significant role in creating a device that is immediately responsive to its environment.
The potential for AR glasses in business, education, and the arts is profound. With the ability to overlay virtual and interactive content onto real-world environments, they will transform the ways we work, learn, and entertain ourselves. For artists and creatives, AR offers a new dimension of expression. The technology enables interactive exhibitions, public art installations, and live performances that react to the environment or even the audience. It invites the public to engage with art in new ways, making it more accessible and participatory.
A Turning Point for the Metaverse Vision?
Meta’s challenge lies in balancing these technological advancements with practical adoption rates. While groundbreaking as a concept, the metaverse has yet to see mainstream adoption. With AI becoming embedded in AR and VR applications, Meta’s future doesn’t just rest on VR headsets or digital worlds. The next steps in Meta’s journey, particularly around AI and AR, could define how society uses technology not only to communicate but to think, learn, and create.
Join us for Meta Connect 2024 as we follow the latest developments in AR and VR and their growing convergence with generative AI.
Emory Craig is a writer, speaker, and consultant specializing in virtual reality (VR) and generative AI. With a rich background in art, new media, and higher education, he is a sought-after speaker at international conferences. Emory shares unique insights on innovation and collaborates with universities, nonprofits, businesses, and international organizations to develop transformative initiatives in XR, GenAI, and digital ethics. Passionate about harnessing the potential of cutting-edge technologies, he explores the ethical ramifications of blending the real with the virtual, sparking meaningful conversations about the future of human experience in an increasingly interconnected world.