
The future arrives unevenly.
Meta has sold over 2 million Ray-Ban Meta glasses in less than two years. It has about 70% of the smart glasses market.
For the first time since Google Glass, AI glasses are really selling.
Yet Apple — the company that turned earbuds into a platform — continues to focus heavily on AI headphones. It acquired Israeli startup Q.ai for approximately $2 billion in January 2026. It is developing AirPods with cameras for 2027. It sells 76 million AirPods per year.
This is not a contradiction. It is the result of a much simpler and much tougher constraint:
Physics.

Some limitations are strategic. Others are physical.
The three problems that every AI wearable must solve
To become a mass-market product, an AI wearable must solve three problems.
1️⃣ Be socially acceptable
Glasses are not just technology: they are an accessory, an identity object. Ninety percent of sunglasses are not prescription. They are status symbols, fashion statements.
Google Glass failed partly because of this. It looked like an alien tech gadget. The term “Glasshole” entered the vocabulary. It wasn't something people wanted to wear.
2️⃣ Have truly useful AI
In 2013, AI wasn't ready. “OK Glass, take a picture” gave you a stupid photo. No language model understood what you were looking at.
Today it is ready: multimodal models see, hear, translate, and explain the world around you in real time.
3️⃣ Be technically viable
Battery that lasts all day. Weight under 70 grams. Manageable heat on the skin. Enough power to run AI.
This third point is not a design choice. It's a physical constraint.
And that's the point: we're not looking at a technology competition, but a competition over when a technology will become viable.
AI headphones: 3 out of 3 problems solved
AI headphones have solved all three.
AirPods Pro 3 ($249, September 2025) are the benchmark.
Design: Normalized for eight years. White stems are a status symbol. FDA authorization as a hearing aid has eliminated any stigma around continuous use.
AI: Real-time translation in nine languages (English, French, German, Portuguese, Spanish, plus Japanese, Chinese, Korean, Italian coming in 2026). Natural conversations with Apple Intelligence. Studio-quality audio recording. Sleep tracking. Custom PPG sensor that pulses infrared light 256 times per second for continuous heart monitoring.
Hardware: Eight hours of battery life with active noise cancellation. Plus 30 hours from the charging case. H2 chip handles audio, ANC, and sensor fusion locally. All heavy AI processing is done by the iPhone's Neural Engine, not the headphones themselves.
AirPods are audio transducers, not AI devices.
Google Pixel Buds Pro 2 ($229) follow the same architecture with the custom Tensor A1 chip—the first Tensor in Buds—enabling Gemini Live for natural two-way conversations. Direct integration with Gmail, Calendar, Keep, Maps via voice. Eight hours with ANC, 30 from the case.
Samsung Galaxy Buds3 Pro ($249) bring Galaxy AI Interpreter for real-time translation during in-person conversations and calls. The budget Galaxy Buds3 FE ($149) offer the same translation features at half the price.
This architecture works because audio is lightweight.
The headphones:
Capture audio
Transcribe it locally
Send only text to the phone via Bluetooth
The text weighs kilobytes. The phone does all the heavy AI lifting. The headphones remain efficient audio devices.
Result: zero mandatory cloud, minimal latency, reduced compromises on privacy and battery.
It's an elegant architecture. And it works today, at scale.
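The payload asymmetry behind this offload is easy to quantify. A minimal sketch, where the sample rate and example phrase are illustrative assumptions rather than device specs:

```python
# Illustrative sketch of why the earbud offload is cheap: the link carries
# transcribed text, not raw audio. Numbers are assumptions, not device specs.

def audio_payload_bytes(seconds: float, sample_rate: int = 16_000,
                        bytes_per_sample: int = 2) -> int:
    """Raw mono PCM size if the earbuds streamed audio instead of text."""
    return int(seconds * sample_rate * bytes_per_sample)

def transcript_payload_bytes(text: str) -> int:
    """What actually crosses the Bluetooth link: UTF-8 text."""
    return len(text.encode("utf-8"))

raw = audio_payload_bytes(10)  # 10 seconds of speech -> 320,000 bytes
txt = transcript_payload_bytes("Where is the nearest pharmacy?")  # 30 bytes
print(f"audio: {raw} B, text: {txt} B, ratio: ~{raw // txt}x")
```

Even a short transcript is roughly ten thousand times smaller than the audio it replaces, which is why kilobytes over Bluetooth suffice.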
Smart glasses: 2 out of 3 problems solved
Smart glasses have made huge strides.
Ray-Ban Meta Gen 2 ($299-$379) has brilliantly solved design and AI.
Design: Classic Wayfarers. 50 grams. Eight hours of typical battery life, plus 48 from the case. Indistinguishable from normal glasses. Meta understood that you had to be Ray-Ban first, smart second. The EssilorLuxottica logo is worth more than Meta's technology.
AI: 12MP ultra-wide camera, 3K video, 5-microphone array, open-ear speaker, 32GB storage. When you say, “Hey Meta, look at this,” the glasses capture an image, send it via smartphone to Meta's servers, where multimodal Llama 3 processes the request. Perceived latency: three seconds. Use cases: identify objects, translate text into six languages, answer contextual questions, hands-free photos/videos, Instagram and Facebook livestreaming.
Ray-Ban Meta Display ($799, September 2025) adds a 600×600 pixel full-color display in the right lens using Lumus waveguide technology. Invisible from the outside, up to 5000 nits brightness at 90Hz. Includes Meta Neural Band — EMG bracelet that reads muscle signals for gesture control (scroll, tap, swipe via micro-finger movements). The display shows messages, turn-by-turn navigation, live captions, Spotify, visual AI responses. 69 grams. Sold out in 48 hours.
Meta has also expanded into sports eyewear with Oakley Meta Vanguard ($499)—IP67, 122° wide-angle camera, Garmin/Strava integration, more powerful speakers, 9-hour battery life. Target: athletes and outdoor enthusiasts.
Other competitors have taken different approaches.
Even Realities G2 ($599) does the opposite on privacy — no camera, no speakers, just dual micro-LED waveguide displays at an impossible 36 grams with 48 hours of battery life. It pairs with the R1 smart ring for control. Focus: notifications, navigation, teleprompter, translation into 31 languages.
Rokid Glasses ($599) tries to do everything — display, 12MP camera, ChatGPT/GPT-5 integration, 89-language translation — at 49 grams using Qualcomm/NXP dual-chip architecture. Budget Rokid AI Glasses Style ($299) copies the Meta screenless formula but with an open AI ecosystem (ChatGPT, DeepSeek, Qwen).
Amazon Echo Frames ($270) are the simplest — no camera, no display, just audio with Alexa+ for voice commands, music, calls, and smart home control. 37-46 grams, 6-14 hours of battery life. Comfortable but functionally limited.
But they all share the same fundamental problem.
The hardware.
Meta has not ignored this limitation.
It has deliberately chosen to build the market before it is resolved.
Why glasses can't ‘do what headphones do’
The lightweight glasses have 154 milliampere-hour batteries divided between two arms.
For comparison: Samsung Galaxy Watch has 425 mAh. Every milliwatt counts.
Camera, wireless radio, AI processing, speakers, microphones — they all compete for the same tiny amount of energy. Video recording is limited to three-minute clips, partly for thermal management. The glasses sit on temperature-sensitive skin and must remain below 40 degrees Celsius.
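A back-of-envelope power budget shows how tight this is. The 3.85 V nominal cell voltage below is an assumed typical Li-ion value; the article gives only the 154 mAh capacity:

```python
# Back-of-envelope runtime for a 154 mAh glasses battery.
# The 3.85 V nominal cell voltage is an assumed typical Li-ion value.

CAPACITY_MAH = 154
NOMINAL_V = 3.85
ENERGY_MWH = CAPACITY_MAH * NOMINAL_V  # ~593 mWh total energy budget

def runtime_hours(average_draw_mw: float) -> float:
    """Hours of runtime at a given average power draw in milliwatts."""
    return ENERGY_MWH / average_draw_mw

print(round(runtime_hours(74), 1))   # ~8 h: the whole device must average ~74 mW
print(round(runtime_hours(600), 1))  # ~1 h if sustained on-device AI pulled ~600 mW
```

At an 8-hour target, every subsystem combined gets well under a tenth of a watt, which is why continuous video or local AI inference is out of reach.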
The Qualcomm Snapdragon AR1 Gen 1 chip is too weak for local AI vision. The latest model, Snapdragon AR1+ Gen 1 (in Display), includes an NPU capable of running Llama 3.2-1B (1 billion parameters) completely on-device for offline tasks, but this is orders of magnitude less capable than cloud models.
So the glasses send everything to the Meta cloud.
Why can't they offload to the phone like headphones?
Bluetooth physics.
The headphones send transcribed text — kilobytes.
The glasses would need to send 12MP real-time video streams.
The numbers are brutal:
Bluetooth 5.3: approximately 2 megabits per second effective
Compressed 1080p video: 5-8 megabits per second
It won't work. Physically.
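The gap is easy to check using the article's own figures; a minimal sketch:

```python
# Does a compressed 1080p video stream fit through an effective
# Bluetooth 5.3 link? Figures taken from the comparison above.

BT_EFFECTIVE_MBPS = 2.0                     # effective Bluetooth 5.3 throughput
VIDEO_MIN_MBPS, VIDEO_MAX_MBPS = 5.0, 8.0   # compressed 1080p range

def link_fits(link_mbps: float, stream_mbps: float) -> bool:
    """True if the stream's bitrate fits within the link's throughput."""
    return stream_mbps <= link_mbps

print(link_fits(BT_EFFECTIVE_MBPS, VIDEO_MIN_MBPS))  # False, even at the low end
print(VIDEO_MIN_MBPS / BT_EFFECTIVE_MBPS)            # 2.5x more link needed
```

Even the most optimistic 1080p stream needs 2.5 times the bandwidth the link can deliver; the high end needs 4 times.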
Even using WiFi Direct for more bandwidth:
Huge battery consumption vs Bluetooth
Variable latency
iPhone Neural Engine not optimized for real-time multimodal video
Result: lightweight glasses are stuck between two impossible choices:
Cloud AI → latency + privacy issues
Powerful on-device AI → battery dies in an hour
Snap Spectacles Gen 5 with full AR display: 45 minutes of battery life.
It's not a strategic choice. It's a physical limitation.
Compression can reduce traffic, but it does not eliminate the trade-off: less bandwidth means more latency, more consumption, or less useful information for AI. With current hardware, you cannot optimize all three. You always have to trade something off.
Three categories of wearable AI (today)

Same ambition. Very different constraints.
This constraint clearly divides the market.
AI headphones
Practical, mature, mass market. 420 million TWS units shipped annually. They win because audio is lightweight and the offload works perfectly.
Lightweight glasses
Fascinating, useful, but technically limited. Projected market: 20 million units and $5.6 billion in 2026, quadrupling from 2025, an 89% CAGR. Meta dominates with about 70% market share because it has solved design + AI better than anyone else. But it remains an early-adopter market due to physical constraints.
Heavy glasses (Vision Pro tier)
Apple Vision Pro: 450 grams, integrated M2 + R1 chips, external wired battery, 2-2.5 hours of battery life. Powerful AI completely on-device without compromise. Perfect for museums with AR tours, industrial training, assisted surgery, and immersive gaming. But they do not replace smartphones. Not wearable all day.
This means one thing: Meta is winning the market that is possible today, not the definitive one.
Competition intensifies in 2026
Meta is winning today on glasses. But the race is changing speed.
Google has announced two categories of Android XR smart glasses for 2026 — screen-free AI glasses with Gemini and display variants with in-lens HUD — with frame partnerships from Warby Parker (a $150 million commitment), Gentle Monster, and Samsung.
Samsung confirmed smart glasses for 2026 during its Q4 2025 earnings call, with a full AR variant coming in 2027.
Snap formed Specs Inc as a standalone subsidiary in January 2026, signaling a consumer AR launch after spending $3 billion on AR development over 11 years and building an ecosystem of 400,000 developers.
Apple glasses (codenamed N401) are expected to be unveiled in late 2026, with a launch in early 2027. First version reported: zero display — just AI, cameras, audio. Weight ~50g, battery ~8 hours. Ray-Ban Meta base type, no display. True AR with micro-OLED is Phase 2 around 2028.
China has launched the “War of 100 Smart Glasses.” Xiaomi sold 10,000 AI glasses in 12 hours in June 2025. Baidu launched glasses powered by ERNIE LLM. Alibaba, ByteDance, Tencent, Honor, even car maker Li Auto have entered the market. IDC projects shipments in China to exceed 4.9 million units in 2026.
In total, 36 manufacturers released over 50 AI-powered glasses worldwide in 2025.
But this does not change the central point:
As long as physics does not change, glasses remain an early-adopter market.
Meta has a two-year head start in building its brand and ecosystem. It has a partnership with EssilorLuxottica for global retail distribution. It has full integration with Instagram and WhatsApp.
The real mass market competition will only begin when physics unlocks.
Entering now is not about beating Meta on numbers.
It's about not arriving late when the physical constraints finally fall.
Privacy: two different issues
Both headphones and glasses raise privacy concerns. But not to the same extent.
Glasses have cameras that record those around you.
Meta has hardwired LEDs that light up when the camera is active (cannot be disabled via software), a physical power switch, and verified opt-in sessions for hands-free use.
But Meta's April 2025 policy update has raised criticism: voice recordings are now stored for up to one year by default for AI training, with the option to prevent storage removed. Photos processed by Meta AI are saved and used to train models with the help of “trained reviewers.” The Irish Data Protection Commission has requested stricter GDPR measures.
A 2024 PLOS One study of 1,037 Australians found strong concern among non-owners about privacy, anti-social behavior, and potential harm from smart glasses. Cultural factors matter: Korean respondents showed significantly greater privacy concerns than Americans.
The “Glasshole” stigma from Google Glass in 2013 persists. Google faced Congressional inquiries, Consumer Watchdog called Glass “one of the most privacy-invasive devices ever created,” and social rejection killed the product.
Headphones primarily record yourself.
No camera = zero hidden visual recording, drastically reduced bystander anxiety. Apple's privacy model keeps conversations local by default with on-device processing. FDA-authorized hearing-aid functionality normalizes continuous use, removing the “why are they wearing earbuds?” suspicion.
The tradeoff: without visual context, the headphones cannot identify objects, translate signs, or understand what you are looking at.
This difference weighs heavily on social acceptance. And it's another reason why headphones are scaling to the mass market more easily today.
Two scenarios to unlock glasses
Lightweight glasses are stuck until one of these two things happens.
Scenario A: Chips 10 times more efficient
A neuromorphic (or comparable) chip breakthrough that enables powerful multimodal AI vision processing in 50 grams with 8+ hours of battery life. Precedents suggest 5-10 years. Glasses become standalone, the cloud problem disappears, and the mass market unlocks.
Scenario B: Breakthrough bandwidth offload
Advanced WiFi 7, improved ultra-wideband, or completely new technology makes it possible to offload video from glasses to phone with <100ms latency and acceptable battery consumption. Apple could do processing with iPhone Neural Engine while maintaining on-device privacy. Meta cannot do this because it does not control phone hardware.
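A hypothetical latency budget illustrates what Scenario B has to achieve. Every per-stage number below is an assumption for illustration, not a measurement:

```python
# Hypothetical <100 ms round-trip budget for a glasses -> phone -> glasses
# offload. All stage timings are illustrative assumptions.

BUDGET_MS = 100
stages_ms = {
    "camera capture + encode": 20,
    "radio transfer to phone": 30,
    "on-phone multimodal inference": 40,
    "result back to glasses": 5,
}
total_ms = sum(stages_ms.values())
print(total_ms, total_ms <= BUDGET_MS)  # feasible only if every stage hits target
```

The budget closes only if capture, radio, inference, and display all hit aggressive targets simultaneously; any one stage slipping breaks it.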
Until then: glasses remain an early-adopter category despite explosive growth.
Who wins (for now)
AI glasses seem like the future. And they probably are.
But today, the future has to contend with physics.
Headphones win the mass market because they have solved all three fundamental problems: design, AI, and hardware. Audio architecture + offload is elegant and works at scale.
Meta wins the glasses market because it has solved design and AI better than anyone else, building a brand and ecosystem as the market matures. 70% market share, 2 million units, EssilorLuxottica partnership.
Two strategies, both valid (at different points on the technology curve):
Meta is betting that in 5-10 years, when Scenario A or B comes to fruition, whoever has already built the glasses ecosystem will win the true mass market race. It is investing today to dominate tomorrow.
Apple is betting that until then, headphones will win — so it's worth dominating the present with 76 million units per year while studying which form factor will win when the physics are resolved.
Who is right does not depend on strategy. It depends on how quickly physical constraints will become acceptable to hundreds of millions of users.
The real asymmetry is this:
Meta risks being right too soon—and paying the cost of being ahead of its time.
Apple risks choosing too late—but without paying the cost of being wrong.
For now, physics votes for headphones.
Fabio Lauria
CEO & Founder, ELECTE

Welcome to the Electe Newsletter
This newsletter explores the fascinating world of artificial intelligence, explaining how it is transforming the way we live and work. We share engaging stories and surprising discoveries about AI: from the most creative applications to new emerging tools, and the impact these changes have on our daily lives.
You don't need to be a tech expert: through clear language and concrete examples, we transform complex concepts into compelling stories. Whether you're interested in the latest AI discoveries, the most surprising innovations, or simply want to stay up to date on technology trends, this newsletter will guide you through the wonders of artificial intelligence.
It's like having a curious and passionate guide who takes you on a weekly journey to discover the most interesting and unexpected developments in the world of AI, told in an engaging and accessible way.
Sign up now to access the complete newsletter archive. Join a community of curious minds and explorers of the future.
