Mark Zuckerberg announced the Meta Ray-Ban Display glasses at the company’s Connect event on Wednesday. The glasses have a colour display built into the lenses and pair with the Meta Neural Band, a wristband that reads muscle signals from the hand. Together they let people check messages, translate speech or control music without pulling out a phone.
The display sits off to the side of the lens so it does not block vision. It is designed for short moments, such as checking walking directions or looking at a WhatsApp message. Zuckerberg said the idea is to keep people tuned into the world around them while still having quick access to digital tools.
Every pair comes with the Neural Band. It picks up electrical activity from the wrist and translates that into digital commands. A pinch of the fingers, a swipe of the thumb or even barely noticeable movements can control the glasses.
How Does The Neural Band Work?
Meta has spent years working on electromyography. Nearly 200,000 people took part in tests, which helped the company design the band so it works straight away for most users. It can detect movements before they are visible, which opens the door for people who cannot perform large gestures due to conditions such as spinal cord injuries or tremors.
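For a rough sense of the idea, the pipeline can be thought of as measuring how strongly each muscle channel is firing in a short window and mapping the dominant channel to a gesture. The toy Python sketch below is only an illustration of that concept, not Meta’s actual decoder; the channel-to-gesture mapping and noise threshold are invented for the example, and real systems use machine-learned models over many electrodes.

```python
import math

# Toy illustration: map simulated EMG windows to gestures by comparing
# per-channel root-mean-square (RMS) amplitude. Purely hypothetical --
# production decoders are learned from data, not hand-thresholded.

GESTURES = {0: "pinch", 1: "thumb_swipe"}  # invented channel -> gesture map
NOISE_FLOOR = 0.05  # below this RMS, treat the hand as at rest

def rms(window):
    """RMS amplitude of one channel's sample window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def decode_gesture(channels):
    """channels: list of per-channel sample windows -> gesture name or None."""
    amplitudes = [rms(w) for w in channels]
    best = max(range(len(amplitudes)), key=lambda i: amplitudes[i])
    if amplitudes[best] < NOISE_FLOOR:
        return None  # no channel clears the noise floor: hand at rest
    return GESTURES.get(best, "unknown")

# A strong burst on channel 0 decodes as a pinch; near-silence as rest.
print(decode_gesture([[0.4, -0.5, 0.45], [0.01, -0.02, 0.01]]))  # pinch
print(decode_gesture([[0.01, 0.0, -0.01], [0.0, 0.01, 0.0]]))    # None
```

Because the decision rests on signal amplitude rather than visible motion, even very small contractions can register, which is what makes the approach promising for users who cannot perform large gestures.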
The Neural Band is made from Vectran, the fibre NASA has used in its Mars rover landing systems. It is flexible but strong, and Meta says it can last through all-day use. Battery life runs up to 18 hours, while the glasses can go as long as 30 hours when topped up with their portable charging case.
The glasses have Transitions lenses, so they can be worn both indoors and outdoors. They come in black or sand and the price is $799 for the set. Sales begin in the United States on 30 September at Best Buy, LensCrafters, Sunglass Hut and Ray-Ban stores. Canada, France, Italy and the UK are expected to get them in early 2026.
What Can The Glasses Do?
The Meta Ray-Ban Display packs cameras, microphones, speakers and AI into one frame. Users can read messages from WhatsApp, Messenger, Instagram and their phone’s text apps directly in the lenses. They can also take video calls, showing contacts what they are seeing through the built-in camera.
Other tools include a real-time viewfinder with zoom for photos, walking directions in certain cities and gesture-controlled music playback. The glasses also offer live captions and real-time translations in select languages. Meta says this feature makes it easier to stay in conversation while following along in another language.
The glasses add to Meta’s range of wearables. The company now has three categories: camera glasses, display glasses such as the Ray-Ban Display and AR prototypes such as Orion.
How Does Garmin Fit Into This?
On the same day, Garmin said it is working with Meta on the Oakley Meta Vanguard glasses, which are aimed at athletes. These glasses link with Garmin smartwatches and cycling computers and pull in biometric data.
Users can activate the feature with “Hey Meta” and hear stats such as pace, speed and heart rate. The glasses also have a coloured LED that signals if an athlete is inside their set training zone. Another function captures short video clips during milestones, which can later be edited into highlight reels on the Meta AI mobile app.
Susan Lyman, Garmin’s Vice President of Consumer Sales and Marketing, said in a press release that the aim is to help athletes improve performance while staying in the moment. Garmin is also letting users overlay stats such as distance and speed onto videos and photos, which can then be shared on Instagram, Facebook or WhatsApp. The service is launching in the United States first.