Meta's new Ray-Ban Display combines a heads-up display with a wrist-mounted Neural Band controller, offering users AI-powered notifications, translations, and hands-free control for $799.
Menlo Park, United States – At its annual Connect conference, Meta introduced the Ray-Ban Display, a next-generation pair of smart glasses featuring a built-in heads-up display and a Neural Band wrist controller. The device will launch on September 30 at a price of $799, offering consumers a wearable AI-powered screen directly integrated into the right lens.
Heads-Up Display (HUD): Shows app notifications, translations, and navigation directions directly in the user’s field of vision.
Neural Band: A wrist-mounted device using electromyography (EMG) to detect subtle hand movements for seamless app navigation; a conceptual sketch of how such gesture mapping could work follows this list.
Battery Life: Up to 18 hours of power, enabling all-day usage for active consumers.
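To make the gesture-control concept concrete, here is a minimal, purely hypothetical Python sketch of how EMG readings from a wristband could be classified into discrete gestures and mapped to heads-up-display actions. Meta has not published the Neural Band's developer interface, so every name here (EMGSample, classify_gesture, ACTIONS) is an illustrative assumption, not Meta's API.

```python
# Hypothetical sketch only: illustrates the general idea of EMG-based
# gesture control -- a classifier turns wrist-muscle signals into
# discrete gestures, which are then mapped to display actions.
from dataclasses import dataclass
from typing import Callable, Dict, Sequence


@dataclass
class EMGSample:
    """One window of raw electromyography readings (assumed format)."""
    channels: Sequence[float]   # per-electrode signal values
    timestamp_ms: int


def classify_gesture(sample: EMGSample) -> str:
    """Placeholder classifier; a real system would run a trained model here."""
    energy = sum(abs(v) for v in sample.channels)
    if energy > 5.0:
        return "pinch"        # e.g. select the highlighted item
    if energy > 2.0:
        return "swipe_right"  # e.g. move to the next notification
    return "none"


# Map recognized gestures to display actions (names are illustrative).
ACTIONS: Dict[str, Callable[[], None]] = {
    "pinch": lambda: print("HUD: open selected notification"),
    "swipe_right": lambda: print("HUD: next card"),
}


def handle(sample: EMGSample) -> None:
    gesture = classify_gesture(sample)
    action = ACTIONS.get(gesture)
    if action:
        action()


# Example: a strong muscle signal is interpreted as a 'pinch' selection.
handle(EMGSample(channels=[1.5, 2.0, 2.2], timestamp_ms=0))
```

The point of the sketch is the pipeline shape, signal window to gesture label to UI action, rather than any specific threshold or model.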
CEO Mark Zuckerberg emphasized that the device reflects Meta’s vision of practical wearable AI and its long-term ambition to reduce reliance on Apple and Google smartphones.
The Ray-Ban Display builds on the success of Meta's first-generation Ray-Ban smart glasses, which integrated cameras, speakers, and Meta's AI assistant, and which sold millions of units globally.
New capabilities include:
Real-time AI translations on the lens.
Hands-free content creation with photo and video capture.
Direct navigation through Meta apps like Instagram, WhatsApp, and Facebook.
This positions the device as part of Meta’s broader push toward ambient computing—where digital tools blend into everyday life.
Industry analysts see the launch as:
A hardware expansion that could increase Meta’s control over user data beyond smartphones and PCs.
A move to differentiate from Apple’s Vision Pro and Google’s AR ambitions by focusing on lightweight, socially acceptable wearables instead of bulky headsets.
A potential new platform for AI experiences, especially in gesture-based computing.
Meta hopes that combining Ray-Ban brand appeal with AI capabilities will make wearables mainstream.
Despite the excitement, several hurdles remain:
Privacy Concerns: Always-on cameras and data collection could spark regulatory scrutiny.
Battery Performance: The claimed 18 hours is promising, but real-world testing will show whether it holds up in everyday use.
Consumer Demand: The $799 price point may limit adoption unless compelling use cases emerge quickly.
Meta faces the task of proving that gesture-controlled wearable AI solves real-world problems, rather than being a niche novelty.
If successful, the Ray-Ban Display and Neural Band could:
Expand Meta’s ecosystem beyond social media into consumer hardware.
Accelerate adoption of AI-assisted interfaces for everyday tasks.
Intensify competition with Apple, Google, and Microsoft in the emerging spatial computing and AR markets.
Launch Date: September 30, 2025.
Retail Strategy: Direct-to-consumer via Meta’s website and select tech retailers.
Developer Tools: Meta hinted at future SDK releases for third-party AR and AI applications.