Meta Ray-Ban AI Glasses (marketed as Ray-Ban Meta) are a line of smart glasses developed through a partnership between Meta Platforms and EssilorLuxottica, the parent company of Ray-Ban. First launched in September 2023 as the second generation of Meta's smart eyewear collaboration, the Ray-Ban Meta collection integrates an ultra-wide camera, open-ear speakers, and the Meta AI assistant into frames that look and feel like ordinary Ray-Ban sunglasses. The product line has been a breakout commercial success, driving a 210% year-over-year surge in global smart glasses shipments in 2024 and selling roughly 7 million units in 2025 alone [1][2].
Meta's collaboration with EssilorLuxottica dates back to 2019, when Facebook (Meta's predecessor) began exploring consumer smart glasses with the Italian-French eyewear giant. The first product of this partnership, Ray-Ban Stories, launched in 2021. Those glasses allowed users to capture photos, play music, and make phone calls, but received mixed reviews due to limited functionality and privacy concerns surrounding the always-available camera [3].
In September 2023, Meta and EssilorLuxottica unveiled the second-generation product under the rebranded name Ray-Ban Meta. The new glasses represented a significant leap in hardware and software capabilities. In 2024, the two companies extended their partnership for an additional ten years, with periodic reviews covering pricing, production volumes, and feature development [4].
The partnership has not been without tension. Meta, eager to build scale and dominate the smart glasses category, has pushed for lower retail prices to drive adoption. EssilorLuxottica, rooted in the luxury eyewear segment, has been more cautious about price cuts that could undermine margins. Despite these internal debates, both sides have described the relationship as constructive and not at risk [5].
The Ray-Ban Meta glasses are built on four classic Ray-Ban silhouettes (Wayfarer, Headliner, Skyler, and Round) and are available in over 150 frame and lens combinations. From the outside, they are nearly indistinguishable from standard Ray-Ban frames, which has been a key factor in consumer adoption.
| Feature | Specification |
|---|---|
| Processor | Qualcomm Snapdragon AR1 Gen 1 |
| Camera | 12 MP ultra-wide (single camera in the corner of the left lens) |
| Video recording | Up to 3K resolution, 3-minute clips |
| Microphones | Five-microphone array |
| Speakers | Open-ear directional speakers (five-speaker setup in Gen 2) |
| Connectivity | Bluetooth 5.3, Wi-Fi |
| Battery life | Up to 5 hours music playback, 5.4 hours voice calls (Gen 2); up to 8 hours mixed use with optimizations |
| Charging | Custom charging case (provides additional 48 hours of standby) |
| Weight | Approximately 49 grams |
| Water resistance | IPX4 (splash-proof) |
| Starting price | $299 USD |
The Gen 2 model (released in September 2023) brought a 42% increase in battery capacity, double the bass output, higher maximum speaker volume, and improved directional audio that reduces sound leakage to nearby people. The Qualcomm Snapdragon AR1 Gen 1 chipset represents a significant upgrade from the Snapdragon Wear chip used in the first-generation Ray-Ban Stories, providing more computational headroom for on-device AI processing, improved power efficiency, and better camera performance [6][16].
On June 20, 2025, Meta announced an expansion of the smart glasses lineup through a partnership with Oakley, another EssilorLuxottica subsidiary. The Oakley Meta collection targets sports and performance lifestyles. The first model, the Oakley Meta HSTN, launched on August 26, 2025, followed by the Oakley Meta Vanguard, announced at Meta Connect on September 18, 2025. Both models share the same core technology as the Ray-Ban Meta glasses (cameras, speakers, Meta AI integration) but feature Oakley's sport-oriented frame designs [17][18].
The defining feature of the Ray-Ban Meta glasses is their deep integration with Meta AI, the company's conversational AI assistant powered by the Llama family of large language models.
Users activate the assistant by saying "Hey Meta," after which they can issue voice commands to make calls, send messages, get directions, set reminders, ask general knowledge questions, and control music playback. The interaction is entirely hands-free, relying on the glasses' microphone array and open-ear speakers.
In April 2024, Meta rolled out a major update that enabled multimodal input via the glasses' cameras. With this capability, users can ask Meta AI to describe what they are looking at, identify landmarks, read and translate signs, suggest recipes based on ingredients on a counter, and more. The system processes visual information from the cameras alongside voice input to provide contextual responses [7].
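Meta has not documented the internal pipeline, but conceptually a multimodal request pairs the camera frame captured at the moment of the question with the transcribed voice prompt before either reaches the model. A toy sketch of that pairing — all field and function names here are illustrative, not Meta's API:

```python
import base64
from dataclasses import dataclass


@dataclass
class MultimodalQuery:
    """Pairs a voice prompt with a camera frame — the general shape of
    input a vision-language model consumes. Names are hypothetical."""
    prompt: str
    image_b64: str  # frame encoded for transport in a JSON body


def build_query(transcript: str, frame: bytes) -> MultimodalQuery:
    # Base64-encode the raw frame so binary data can travel as text.
    return MultimodalQuery(transcript, base64.b64encode(frame).decode())


q = build_query("What landmark is this?", b"\x89PNG...")
```

The key point is simply that vision and voice arrive as one bundled request, so the model can ground its answer in what the wearer is looking at.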
In December 2024, Meta released the v11 software update, which introduced two additional AI features: Live AI, which streams a continuous camera feed so users can hold a real-time conversation with the assistant about what they see, and Live Translation, which translates speech in real time between supported languages.
Also introduced in the v11 update, Shazam integration allows users to identify songs playing in their environment by saying "Hey Meta, what is this song?" The feature launched initially in the United States and Canada [8].
Meta has continued to expand the glasses' AI capabilities through regular software updates since launch:

| Feature | Description | Release |
|---|---|---|
| Object identification | Point-and-ask about items in view | April 2024 |
| Landmark recognition | Identify buildings, monuments, and points of interest | April 2024 |
| Sign translation | Read and translate text in the environment | April 2024 |
| Live AI | Continuous camera feed with real-time AI conversation | December 2024 |
| Live Translation | Real-time speech translation (6 languages) | December 2024 |
| Shazam integration | Song identification | December 2024 |
| Call captioning | Real-time captions for phone calls displayed on paired phone | 2025 |
| Spotify shortcuts | Personalized music controls below media player | 2025 |
| Hearing enhancement | AI-assisted conversation clarity in noisy environments | December 2025 |
In December 2025, TechCrunch reported that Meta added a hearing enhancement feature to the glasses, allowing users to hear conversations more clearly in noisy environments by leveraging the five-microphone array to isolate and amplify speech [19].
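Meta has not published how the feature works. A classic technique for isolating a talker with a small microphone array is delay-and-sum beamforming: each channel is delayed so that sound arriving from the target direction adds coherently, while off-axis noise partially cancels. A minimal pure-Python sketch under that assumption (array geometry, sample rate, and function names are illustrative):

```python
import math


def delay_and_sum(signals, mic_positions, angle_deg, fs, c=343.0):
    """Steer a linear mic array toward `angle_deg` by delaying each
    channel so a plane wave from that direction lines up across
    microphones, then averaging the aligned channels.

    signals:       list of equal-length channels (floats, one per mic)
    mic_positions: mic offsets along the array axis, in meters
    fs:            sample rate in Hz; c: speed of sound in m/s
    """
    angle = math.radians(angle_deg)
    # Integer-sample delay of each mic relative to the array origin.
    delays = [round(fs * (p * math.sin(angle)) / c) for p in mic_positions]
    shift = [d - min(delays) for d in delays]  # make all shifts >= 0
    n = len(signals[0])
    out = [0.0] * n
    for sig, s in zip(signals, shift):
        # Trailing samples past the shifted channel are simply dropped
        # in this sketch; a real implementation would pad or window.
        for i in range(n - s):
            out[i] += sig[i + s]
    return [v / len(signals) for v in out]
```

With the steering angle set to the wearer's conversation partner, speech from that direction is reinforced and diffuse background noise is attenuated; production systems add adaptive filtering and AI-driven noise suppression on top of this basic idea.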
The glasses pair with a companion smartphone app. Originally called Meta View, the app was renamed the Meta AI app on both iOS and Android in 2025 and serves as the central hub for managing the glasses [20].

Key app features include:

- Configuring device settings
- Reviewing captured photos and videos
- Managing AI features
- Controlling privacy preferences
In 2025, Meta took its first steps toward opening the smart glasses to third-party developers with the launch of the Meta Wearables Device Access Toolkit. This SDK allows developers to build applications that can access sensor data from the glasses, including the microphone and camera, and pipe that data back to companion apps on Android or iOS devices. After processing the sensor data, apps can send information back to the glasses for audio output [21].
The developer toolkit includes:
| Component | Description |
|---|---|
| Pre-built libraries | Streamlined development with ready-made sensor access code |
| Sample applications | Reference implementations for common use cases |
| API documentation | Architecture, endpoints, data structures, and best practices |
| Camera access | Read camera feed for computer vision applications |
| Microphone access | Access audio input for voice processing |
| Audio output | Play back processed results through the glasses' speakers |
The developer preview is available at developer.meta.com/wearables, with broader community publishing of third-party integrations expected to roll out throughout 2026 [21].
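The Toolkit's documented data flow — sensor data streams from the glasses to a companion phone app, is processed there, and the result is sent back to the glasses for audio output — can be modeled as a simple round trip. This is a sketch only; the class and method names below are invented for illustration and are not the Toolkit's actual API:

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class SensorFrame:
    """One chunk of sensor data from the glasses (names hypothetical)."""
    kind: str        # "camera" or "microphone"
    payload: bytes   # raw image frame or audio buffer


class CompanionApp:
    """Models the Toolkit round trip: glasses -> phone -> glasses.

    `process` stands in for whatever phone-side work a third-party
    app does (computer vision, speech recognition, etc.)."""

    def __init__(self, process: Callable[[SensorFrame], str]):
        self.process = process
        self.spoken: List[str] = []

    def on_frame(self, frame: SensorFrame) -> None:
        result = self.process(frame)   # heavy lifting happens on the phone
        self.play_on_glasses(result)   # only audio goes back to the glasses

    def play_on_glasses(self, text: str) -> None:
        # Stand-in for routing TTS audio to the glasses' speakers.
        self.spoken.append(text)


app = CompanionApp(lambda f: f"{f.kind}: {len(f.payload)} bytes")
app.on_frame(SensorFrame("camera", b"\x00" * 1024))
```

The design constraint this models is worth noting: the glasses themselves expose sensors and speakers, while all app logic runs on the paired phone, which keeps the eyewear's power and thermal budget small.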
The Ray-Ban Meta glasses have been a rare success story in the consumer smart glasses market, a category that has historically struggled to gain mainstream traction.
| Metric | Value |
|---|---|
| Units sold (2023-2024 combined) | ~2 million |
| Units sold (2025) | ~7 million |
| Revenue growth (2025 vs. 2024) | Tripled year-over-year |
| Best-seller status | #1 product in 60% of European Ray-Ban stores |
| Global smart glasses market growth (2024) | 210% YoY, driven primarily by Ray-Ban Meta |
| Global market share (Q2 2025) | ~73% |
| EssilorLuxottica wearables contribution to Q3 2025 revenue | Over 4 percentage points of 11.7% quarterly growth |
EssilorLuxottica reported in February 2026 that it had more than tripled Meta AI glasses sales in 2025 compared to the prior year. The company's shares surged 14% on the news, pushing its market capitalization to a record high [2][9]. According to Counterpoint Research, global smart glasses shipments surged 210% year-over-year in 2024, with Ray-Ban Meta accounting for the vast majority of that growth [1].
In Q3 2025, EssilorLuxottica reported its best quarterly revenue performance since the company's formation in 2018, reaching 6.87 billion euros with an 11.7% increase at constant exchange rates. More than 4 percentage points of that growth came from wearables, which includes the Meta smart glasses products. Meta smart glasses drove more than a third of EssilorLuxottica's overall growth during this period [22][23].
Meta CEO Mark Zuckerberg has stated ambitions to produce over 10 million pairs per year by 2026, signaling that the company views smart glasses as a core hardware platform alongside its Quest VR headsets [10].
The smart glasses market has attracted several major technology companies, but as of early 2026, Ray-Ban Meta holds a dominant position.
| Product | Manufacturer | Display | AI Assistant | Camera | Price | Status |
|---|---|---|---|---|---|---|
| Ray-Ban Meta (Gen 2) | Meta / EssilorLuxottica | No (audio only) | Meta AI (multimodal) | 12 MP | $299+ | Shipping |
| Oakley Meta HSTN / Vanguard | Meta / EssilorLuxottica | No (audio only) | Meta AI | 12 MP | $399+ | Shipping |
| Meta Ray-Ban Display | Meta / EssilorLuxottica | Yes (in-lens) | Meta AI | Yes | $799 | Shipping (Dec 2025) |
| Snap Spectacles (next gen) | Snap Inc. | AR display in lens | AI-enhanced | Yes | TBD | Expected 2026 |
| Google Android XR Glasses | Google / Warby Parker | In-lens (prototype) | Gemini | TBD | TBD | Prototype shown Dec 2025 |
| RayNeo X3 | TCL / RayNeo | Micro-LED display | AI assistant | Yes | ~$799 | Shipping |
| Even Realities G1 | Even Realities | Micro-LED display | ChatGPT integration | No | $499 | Shipping |
Snap CEO Evan Spiegel announced in mid-2025 that a new version of Spectacles with an AI-enhanced interactive display would launch in 2026, ahead of Meta's planned AR-capable "Orion" glasses (expected 2027). Google showed off unnamed Android XR prototype glasses in December 2025 through a partnership with Warby Parker, but has not set a release date [11][12].
The overall smart glasses market is projected to grow from 3.3 million units shipped in 2024 to nearly 13 million by 2026 [11].
Several factors explain Ray-Ban Meta's dominant market position: frames that are nearly indistinguishable from ordinary Ray-Bans, an accessible $299 starting price, practical hands-free AI features, and distribution through EssilorLuxottica's established retail network.
At Meta Connect in September 2025, Zuckerberg announced the Meta Ray-Ban Display, the company's first AI glasses with an integrated visual display. A small, full-color, high-resolution screen is built into the right lens, enabling users to view notifications, navigation directions, live captions, translated text, and other information without pulling out a phone [13].
| Feature | Specification |
|---|---|
| Resolution | 600 x 600 pixels |
| Pixel density | 42 pixels per degree |
| Field of view | 20 degrees |
| Brightness | Up to 5,000 nits |
| Battery life | Up to 6 hours mixed use |
| Input methods | Voice, touch, Meta Neural Band (EMG) |
| Price | $799 (includes Meta Neural Band) |
The display glasses ship with the Meta Neural Band, a wristband that uses surface electromyography (sEMG) technology to detect electrical signals from muscle activity. This allows users to control the glasses through subtle hand gestures rather than relying solely on voice or touch inputs. At CES 2026, Meta announced upcoming features including a teleprompter mode and EMG-based handwriting input, allowing users to compose messages and take notes through natural hand movements detected by the Neural Band. Handwriting is initially available in English for WhatsApp and Messenger in the United States [13][24].
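Meta has not disclosed the Neural Band's actual signal chain. A typical first stage in sEMG gesture detection, however, is to compute a short-window RMS envelope of the muscle signal and compare it against a calibrated threshold. An illustrative sketch of that idea — the window size, threshold, and function names are made-up values, not Meta's implementation:

```python
import math


def rms_envelope(samples, window=8):
    """Short-window RMS of an sEMG signal — a standard feature used
    before classifying muscle activity into gestures."""
    env = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        env.append(math.sqrt(sum(x * x for x in chunk) / window))
    return env


def detect_activation(samples, threshold=0.5, window=8):
    """Flag windows where muscle activity exceeds a calibrated
    threshold — the crudest possible 'pinch detector'."""
    return [e > threshold for e in rms_envelope(samples, window)]
```

A production system would replace the fixed threshold with a per-user calibration and a learned classifier over multiple electrode channels, which is what lets the band distinguish subtle gestures like pinches and, eventually, handwriting strokes.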
Starting in March 2026, Meta Ray-Ban Display users gained the ability to watch, like, save, and share Instagram Reels directly on the glasses, with Instagram Direct messaging available through handwriting for those in the Early Access Program [24].
Smart glasses with cameras have raised ongoing privacy concerns. The Ray-Ban Meta frames include a small LED indicator light that illuminates when the camera is active, intended to signal to nearby people that recording may be taking place. However, critics have noted that the indicator is small and easily missed, and that the glasses' resemblance to ordinary eyewear makes it difficult for bystanders to know whether a camera is present at all [14].
Meta has implemented several safeguards: video recordings are limited to three minutes, the camera cannot be activated silently, and a visible indicator light is mandatory. The company has also published guidelines encouraging responsible use and established partnerships with privacy advocacy groups. Despite these measures, the combination of AI vision capabilities and an always-worn camera continues to generate debate about surveillance and consent in public spaces.
Meta provides privacy controls through the companion app's device settings.
Several privacy-related controversies have emerged over the product's lifetime, centering on bystander consent, the visibility of the recording indicator, and the potential for covert recording in public spaces [14].
As of early 2026, reports indicate that a third generation of the Ray-Ban Meta glasses is in development. Leaks and rumors suggest that the Gen 3 model will feature an upgraded Qualcomm Snapdragon AR1+ chipset, improved multimodal Meta AI capabilities, and potentially new sensor technologies. Some reports have mentioned "super sensing" capabilities, including the possibility of on-device facial recognition through Live AI, though Meta has not confirmed these features [29].
As of early 2026, the Ray-Ban Meta collection remains the best-selling smart glasses product worldwide, commanding approximately 73% of the global smart glasses market. The Gen 2 hardware with its improved battery, camera, and audio continues to ship, while the Meta Ray-Ban Display with its in-lens screen represents the next step toward full augmented reality glasses. The Oakley Meta collection has expanded the addressable market into sports and active lifestyles.
Meta's long-term roadmap includes Project Orion, a true AR glasses prototype that the company demonstrated internally and to select press in 2024. Orion features a wider field-of-view holographic display and is intended to eventually replace smartphones as a primary computing device. However, Orion is not expected to reach consumers before 2027 at the earliest [15].
The success of the Ray-Ban Meta line has validated Meta's strategy of prioritizing fashionable design and practical AI features over cutting-edge display technology. By partnering with the world's largest eyewear company and building on one of the most recognized frame brands, Meta found a formula that previous smart glasses attempts, from Google Glass to early Snap Spectacles, failed to achieve: a product that people actually want to wear every day.
The developer platform launch in 2025, the Oakley partnership expansion, and the Display model's integrated screen all point toward an increasingly mature ecosystem. With EssilorLuxottica's CEO describing smart glasses as a potential "smartphone successor" and Meta targeting 10 million units per year in production, the Ray-Ban Meta line appears positioned to define the smart glasses category for the foreseeable future [30].