Meta’s AI glasses will now help you read, write, and not look at your phone

Meta used CES 2026 this week to signal a major evolution for its wearable ecosystem, moving the Meta Ray-Ban Display and Meta Neural Band from high-tech novelties to versatile tools for work, travel, and accessibility. The update marks a shift toward “head-up” computing, where your glasses act as your screen and your wrist acts as your controller.

The most practical update is the new Teleprompter feature, which is starting its phased rollout this week. It allows you to view notes or full scripts directly inside the glasses’ lens. Using the Neural Band, you can discreetly scroll through the text at your own pace, making it a powerful tool for public speakers, content creators, or anyone who wants to deliver a presentation while maintaining perfect eye contact.

Meta is also opening early access for EMG-based handwriting input in the U.S.

By wearing the Neural Band, you can “write” messages by simply tracing letters with your finger on any surface—or even in the air. The band’s sensors detect the subtle muscle signals in your wrist and translate them into text for WhatsApp or Messenger. It’s a glimpse into a future where you can send a text without ever touching a screen or speaking out loud.


Navigation is also getting a boost. Meta’s pedestrian navigation is now live in 32 cities, with Denver, Las Vegas, Portland, and Salt Lake City joining the list this week. The turn-by-turn directions appear right in your field of vision, so you never have to look down at a phone while walking through a new city.

Despite the feature excitement, Meta confirmed some frustrating news for international fans.

The planned rollout in the UK, France, Italy, and Canada has been paused. Due to “unprecedented demand” and inventory shortages in the U.S., waitlists already stretch well into 2026, forcing Meta to prioritize existing orders before expanding globally.


On the innovation side, Meta showcased a “proof-of-concept” partnership with Garmin. In the demo, passengers used the Meta Neural Band to control Garmin’s Unified Cabin infotainment system. By using simple finger gestures like pinching or scrolling, passengers could manage music and climate controls without reaching for a physical screen—a concept designed for the “lean-back” luxury of future autonomous or high-end vehicles.

Perhaps the most impactful announcement was a new research collaboration with the University of Utah. The project explores how the Neural Band can act as a life-changing interface for people with limited mobility, such as those living with ALS or recovering from a stroke.

Because the band reads neural signals rather than physical movement, it can detect intent even when muscles are weak. Researchers are testing how these subtle signals can control smart home devices (like lights and blinds) or even steer the TetraSki, an advanced sit-ski designed for people with complex disabilities. This research highlights Meta’s long-term goal: creating a computing platform that is truly accessible to everyone, regardless of physical ability.
