Thursday, January 8, 2026

Meta is expanding the practical use cases of smart glasses

Meta is broadening the practical uses of its smart glasses with a teleprompter-style notes feature for the Meta Ray-Ban Display glasses. The new capability lets users privately view scrolling notes within their line of sight—ideal for presentations, content creation, or on-camera recordings—without needing a phone or external screen.

What sets this feature apart is its hands-free control system, powered by the Meta Neural Band, a wrist-worn device that interprets muscle signals to detect subtle finger movements. This enables users to scroll and navigate text naturally, without breaking eye contact or drawing attention.

How the Teleprompter Works

The teleprompter is designed to be simple and creator-friendly to use:

Import your script: Users can copy text from any mobile app—such as Google Docs or a notes application—and paste it directly into the Meta AI app.

View inside the glasses: Once uploaded, the content appears as customizable, floating text cards within the glasses’ display, scrolling smoothly in front of the wearer.

Gesture-based navigation: The Meta Neural Band reads electromyography (EMG) signals from the wrist, allowing users to control scrolling speed and movement with minimal finger gestures.
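Meta has not published how the Neural Band translates EMG readings into scroll commands, but the general technique of turning a raw muscle signal into discrete control events is well established. As a purely illustrative sketch (all function names, thresholds, and the sample signal below are hypothetical, not Meta's implementation), one classic pipeline is: rectify the raw EMG stream, smooth it into an amplitude envelope, then emit a scroll event each time the envelope crosses a threshold.

```python
# Hypothetical illustration only: Meta's actual Neural Band signal
# processing is not public. This sketches a textbook EMG pipeline:
# rectify -> moving-average envelope -> threshold crossing detection.

def emg_envelope(samples, window=5):
    """Rectified moving-average envelope of a raw EMG sample stream."""
    rectified = [abs(s) for s in samples]
    return [
        sum(rectified[max(0, i - window + 1): i + 1]) / min(window, i + 1)
        for i in range(len(rectified))
    ]

def scroll_events(samples, threshold=0.6, window=5):
    """Emit one 'scroll' event per upward threshold crossing of the envelope."""
    events = []
    above = False
    for value in emg_envelope(samples, window):
        if value > threshold and not above:
            events.append("scroll")
        above = value > threshold
    return events

# A burst of muscle activity (e.g. a subtle finger tap) surrounded by rest:
signal = [0.0] * 10 + [1.0, -1.0, 1.0, -1.0, 1.0] + [0.0] * 10
print(scroll_events(signal))  # one crossing -> ['scroll']
```

The thresholding step is what makes the interaction feel deliberate: ambient muscle noise stays below the envelope threshold, while an intentional finger movement produces a clear burst that registers as exactly one command.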

Who It’s For

The feature is clearly aimed at creators, professionals, educators, and public speakers who want a discreet way to reference notes while maintaining a natural presence. Whether recording videos, delivering talks, or rehearsing scripts, the tool eliminates the need for visible prompts.

Availability and Rollout

The teleprompter feature is currently being rolled out gradually to Early Access users in the United States. Meta has temporarily halted international availability of the Meta Ray-Ban Display glasses, citing strong US demand and limited supply.

A Step Toward Seamless Wearable Control

By relying on the Neural Band for navigation, Meta is signaling a broader shift toward gesture-driven, screen-free interaction across its wearable ecosystem—reducing friction between users and digital content.

By – Aaradhay Sharma

