Meta Connect 2025 Unveils Next-Gen AI Glasses, Metaverse Tools and New Entertainment Hub

Meta opened the first day of its 2025 Connect conference with a blitz of product launches and platform upgrades designed to position the company at the forefront of wearable AI, immersive computing and next-generation entertainment. From new Ray-Ban Meta glasses and Oakley performance frames to a powerful creator toolkit and a “Meta Horizon TV” content hub, the announcements underscored Meta’s push to blend everyday life with mixed reality and artificial intelligence.
Ray-Ban Meta Glasses Go Next-Gen
Headlining the keynote was the introduction of second-generation Ray-Ban Meta AI glasses. These updated smart frames feature double the battery life of the first generation, 3K ultra-wide video capture, improved low-light performance and new color options. Meta said the glasses are designed to be worn “from morning until night” without compromise, blurring the line between a stylish accessory and a high-powered wearable computer.
Alongside the classic Ray-Ban styles, Meta unveiled Oakley Meta Vanguard performance frames, a new collaboration between the two brands aimed at athletes and active users. The frames combine Oakley’s sport-centric design with Meta’s AI features and voice assistant, making them suitable for everything from training sessions to live streaming on the go.
For early adopters, the company also previewed its first Ray-Ban Meta Display glasses paired with a wrist-worn Neural Band. This experimental setup offers a heads-up digital overlay and allows users to control interfaces via subtle finger movements captured by the band’s sensors — a glimpse of Meta’s long-term vision for natural, hands-free interaction.
AI Everywhere: Smarter, More Personal
Meta is expanding the built-in AI assistant across its glasses and apps, giving users the ability to query information, translate conversations, or receive contextual help without reaching for their phones. The system also supports “co-pilot” functions such as recording and summarising meetings or generating captions for photos and videos in real time.
CEO Mark Zuckerberg said the goal is to make AI “an everyday companion” — not a separate app, but something ambient, embedded into the devices people already wear. In demos, the glasses identified landmarks, translated street signs, and even suggested recipe ideas when a user looked at ingredients on a kitchen counter.
Metaverse Infrastructure Gets an Upgrade
Beyond hardware, Meta announced significant improvements to its virtual world backbone. The Meta Horizon Engine, the technology powering Horizon Worlds and other immersive experiences, now offers developers better graphics, lower latency, and seamless cross-device support.
A new Meta Horizon Studio provides creators with enhanced tools to design, publish and monetise their own experiences. The company highlighted features such as one-click import of 3D assets, integrated analytics, and AI-powered scene generation to speed up world-building. Meta said this was part of its commitment to “democratise the metaverse” by lowering barriers to creation.
Meta Horizon TV: Dolby Atmos Meets Blockbusters
One of the day’s biggest surprises was the announcement of Meta Horizon TV, a virtual entertainment hub inside the Horizon environment. It will feature live sports, concerts and blockbuster movies, many in Dolby Atmos and other premium formats optimised for VR headsets. Meta is partnering with major studios and streaming services to populate the hub, positioning it as a one-stop destination for immersive entertainment.
Early content partners include global sports leagues, music festivals and film studios. Viewers will be able to watch alone or in social “watch parties” with friends’ avatars, bringing a communal feel back to at-home viewing.
Developers and Creators at the Center
Throughout the keynote, Meta emphasised its support for developers and creators as key to building a vibrant ecosystem. The company announced expanded funding for creator programs, revenue-sharing initiatives, and new monetisation tools such as in-world subscriptions and virtual goods storefronts.
Meta also rolled out updates to its AI-powered Creator Studio, allowing influencers to automatically generate highlights, clip reels and personalised content from their live broadcasts. This, the company said, will help small creators compete with larger production teams.
A Glimpse of the Future
The keynote painted a picture of Meta’s ambition: to integrate AI, wearables and immersive experiences so seamlessly that the technology “disappears” into everyday life. Zuckerberg described the Neural Band and display glasses as “the next major computing platform” — a step toward devices you don’t have to hold or even look at directly.
Industry analysts noted that while Meta’s metaverse vision has faced scepticism, the company is steadily iterating on the hardware and software pieces needed to make it practical and appealing. By combining stylish wearables, embedded AI, and a pipeline of content and tools, Meta hopes to broaden its appeal beyond early adopters and gamers to mainstream consumers and professionals.
Looking Ahead
Meta Connect 2025’s Day 1 keynote signalled a shift from concept to execution. The products on display — from the upgraded Ray-Ban Meta glasses to the Horizon TV hub — are tangible, shipping soon or already in pilot programs. They also show Meta doubling down on partnerships, with Ray-Ban, Oakley, Dolby and global content providers all on board.
As the conference continues, developers and analysts will be watching for more details on pricing, release dates, and how Meta plans to address privacy, data security and developer revenue share — all crucial to the success of its ambitious roadmap.
For now, the message from Connect 2025 is clear: Meta isn’t just betting on the metaverse; it’s weaving it into glasses, apps and everyday experiences, moving one step closer to making immersive computing and AI an invisible part of daily life.