

Meta’s display-toting AI smart glasses could spoil Apple’s party in 2025

Phil Nickinson wearing the Apple AirPods Pro and Ray-Ban Meta smart glasses.
There are times when only the AirPods Pro will do. But the Ray-Ban Meta are often the only "earbuds" in my bag. Phil Nickinson / Digital Trends

Meta has found unprecedented success with its camera-equipped smart glasses, made in collaboration with Ray-Ban. They started off with social media capture as their key trick, but have since evolved into a vehicle for AI features.

Now, Meta is reportedly eyeing next-gen smart glasses that add a display into the mix. Interestingly, they could arrive in the fall window usually reserved for the launch of new iPhones and other Apple gear. Apple’s own smart glasses project, on the other hand, is reportedly a few years behind the competition.


“Meta Platforms Inc. has a major product priority for 2025: an October launch of high-end, $1,000-plus smart glasses that include a small display,” says a Bloomberg report. The launch could run into a slight delay, but Meta employees have reportedly been told to race against the clock, even if that means working weekends.

The purported October launch target roughly aligns with the time frame for Meta’s annual Connect event. The company’s flagship hardware showcase is set for September 17-18, 2025. Apple also unveils its next-gen iPhones in the same month, and this year, we could be in for a headset surprise as well.

What’s next for Meta’s smart glasses?

So far, Meta hasn’t tried smart glasses with a display, but the idea has been executed for years by brands such as Viture, TCL, RayNeo, and Xreal, among others. The biggest advantage is that instead of relying solely on voice and audio input, users can visually interact with content on a small display fitted behind the front lens.

Notably, the Meta smart glasses will come with a single display unit, instead of separate OLED panels for each eye. “Unlike Meta’s current spectacles, this model will let wearers use simple on-screen apps and review photos that they’ve taken with the device,” adds the Bloomberg report.

The monocular screen is said to be fitted in the lower right corner of the lens, limiting visual interactions to the right eye. Hand gestures on the side arms will enable navigation, while the display will handle functions such as map navigation, app notifications, viewing pictures captured by the onboard camera, and more.

On the software side, Meta’s upcoming smart glasses are said to run a heavily customized version of Android, while silicon firepower will be provided by a Qualcomm wearable processor. 

Currently in development under the codename Hypernova, the next-gen Meta AI smart glasses could start at $1,000, but the asking price might climb all the way up to $1,400. For comparison, the typical ask for display-equipped XR glasses currently hovers around the $500 mark. 

Meta is also working on the Orion smart glasses with a built-in holographic display system. The company has already showcased them in detail, but they are still a few years away. On the research side, the company continues to develop true AR glasses under its Aria platform, targeted at scientific and industrial applications.

Nadeem Sarwar