Just a few weeks ago, I found myself in a fascinating conversation with a colleague who happened to be wearing glasses. At first glance, nothing about them seemed unusual: ordinary eyewear, neat, professional, entirely inconspicuous. What I failed to realize at first was that embedded in those glasses was a nearly invisible display, positioned in front of one eye and practically hidden from external view. When I eventually saw a monitor mirroring what was streaming on those lenses, I was genuinely astonished. Before my eyes, my colleague Victoria Song moved fluidly through a wide array of tasks: she drafted and sent WhatsApp messages with natural ease, used the glasses as a camera viewfinder to line up photos, adjusted Spotify’s volume by rotating her hand as if twisting an imaginary knob, and pulled up map directions displayed directly in her field of vision. Yet from my side of the conversation, even while maintaining eye contact with her, the screen remained imperceptible; I could tell she was focusing on something within the glasses, but I could not see the display itself at all.

That brief hands-on session was my first encounter with the Meta Ray-Ban Display, a pair of smart glasses with a monocular screen. The demonstration left me deeply impressed, not only because of the technological capabilities but because the hardware was concealed within a design that, while a touch bulkier than traditional frames, could convincingly pass as ordinary Ray-Ban eyeglasses: fashionable, stylish, and familiar. The fact that Meta partnered with Ray-Ban, a brand long associated with mainstream eyewear, makes it all the more plausible that people would actually wear these in daily life. You could easily imagine them blending into urban settings without drawing a second glance.

As I walked away from the demonstration, an immediate and inevitable thought occurred to me: what if Apple produced its own version of these glasses? The concept feels almost obvious. Picture lightweight glasses that pair wirelessly with an iPhone, housing miniature speakers and a discreet front-facing camera at eye level, along with a private screen that projects directions, notifications, or music information exclusively to the wearer. Having personal alerts hover silently in your visual field, visible only to you, would be transformative for both communication and productivity.

It appears that Apple’s leadership shares a similar vision. According to a new report from Bloomberg, the company has decided to deprioritize its development of a slimmer iteration of the Vision Pro headset, choosing instead to accelerate its smart glasses initiative. Apple’s current strategy reportedly encompasses both glasses with integrated displays and simpler versions without one.

Even a minimal model without a screen could prove wildly successful for Apple. Consider how monumental AirPods became: essentially sleek wireless earbuds designed around seamless integration with iOS. Now imagine that same ecosystem advantage applied to sunglasses or prescription frames. Even in its most basic form, premium eyewear enhanced with speakers and Apple’s iconic design sensibilities could be revolutionary. The earliest iterations are also rumored to incorporate cameras, which raises complicated questions. Although I remain somewhat cautious about the social implications of putting cameras on people’s faces, the undeniable success of Meta’s Ray-Ban glasses demonstrates that consumer demand exists at significant scale: millions have already embraced such products, signaling a fertile market.

Meta’s glasses, despite being limited in functionality by Apple’s well-known constraints on third-party developers within iOS, have nonetheless proven popular. Imagine what Apple could accomplish operating fully within its own environment. Such glasses would almost certainly integrate flawlessly with iMessage, Apple Music, iCloud Photos, Apple Maps, and the broader suite of native applications users already rely on. The continuity across devices, whether unlocking an iPhone, checking an Apple Watch, or opening a MacBook, cultivates an expectation of instantaneous access, and Apple could extend that same convenience into glasses, making them a natural extension of one’s digital life. Apple also has an exceptional track record of designing miniaturized yet powerful hardware, as the Apple Watch and AirPods attest, which strongly suggests its glasses could immediately raise the bar for the industry.

Nevertheless, optimism must be tempered. Apple’s timeline for releasing these glasses remains uncertain and likely distant. Bloomberg indicates that a model without a display could be unveiled as early as next year, with an actual release not expected until 2027; glasses featuring a display may not arrive until 2028. That delay grants Meta a substantial head start, giving it several additional years to refine, iterate, and ship its own hardware to consumers before Apple enters the field.

Meta, meanwhile, has been actively pursuing augmented reality in ambitious ways. After showcasing its experimental Orion AR glasses—which project digital objects into the real-world environment—Meta has firmly positioned itself as a major player racing toward mainstream AR adoption. Apple will not only face Meta but also serious competition from Samsung and Google, both of which are rumored to be developing their own AR eyewear. Smaller hardware companies continue proposing creative interpretations of wearable technology, and even former Apple design chief Jony Ive is rumored to be collaborating with OpenAI on AI-powered glasses. In other words, the competitive landscape is already crowded and rapidly advancing.

Meta’s relentless push into this domain stems from its ambition to disrupt the smartphone’s longstanding reign as humanity’s primary computing device. CEO Mark Zuckerberg has openly admitted that one of his most formative experiences has been building Meta’s services within the restrictive boundaries of platforms controlled by Apple. For Meta, smart glasses represent liberation from that dependence. Apple, on the other side, is maneuvering defensively, striving to keep its dominance unchallenged and not to miss the next technological leap. In an era when Apple is seen as somewhat behind in artificial intelligence while Meta surges ahead in wearable interfaces, the stakes are extraordinarily high.

Yet history reminds us that Apple often arrives late only to redefine categories that others pioneered. The company did not introduce the first MP3 player, nor did it debut the first smartphone, and yet the iPod and iPhone reshaped entire industries. Thus, although Apple lags behind in the early stages of the smart glasses race, it remains entirely plausible that its eventual product could alter the trajectory of wearable technology. Personally, I cannot help but hope the company delivers on the whimsy of calling its future eyewear ‘iGlasses’—a name too perfect to ignore.

Source: https://www.theverge.com/news/790697/smart-glasses-race-apple-meta-ray-ban-display