Sabrina Ortiz/ZDNET
ZDNET's Key Takeaways:
- Amazon is developing smart delivery glasses for its delivery drivers.
- The glasses are designed to identify hazards and help drivers find the right packages.
- The goal is to keep drivers safer and get packages to customers faster and more reliably.
AI-powered eyewear is having a moment. In recent months, Meta unveiled new smart glasses at its September Meta Connect event, and Samsung has teased an upcoming release of its own. Amazon, however, is taking a different path: rather than targeting consumers, its smart glasses are built to streamline its delivery operations and improve the day-to-day experience of its drivers.
At its recent "Delivering the Future" event in San Francisco, Amazon unveiled smart delivery glasses for its delivery associates, pitching them as a tool to make drivers safer on the job and deliveries more accurate and timely.
According to Amazon, the glasses can scan packages, render real-time turn-by-turn walking directions, and capture proof-of-delivery photos, all hands-free, so drivers don't have to juggle a smartphone. These features are powered by a combination of artificial intelligence and machine learning.
Specifically, the glasses rely on AI-powered sensing, computer vision, and an embedded camera to generate a heads-up display that puts essential information, including navigation data, hazard alerts, and task instructions, directly in the driver's field of view. The in-lens display is similar to the one found in the Even Realities smart glasses.
For instance, once a driver parks at a delivery location, the display automatically surfaces key details such as the delivery address and the number of packages expected. From there, the glasses help the driver find the correct package in the vehicle, flagging when the right item has been identified.
The glasses also use Amazon's geospatial technology to provide step-by-step walking directions from the parked vehicle to the delivery destination, and they can warn the driver about hazards, such as construction, along the way.
Two front-facing cameras capture proof-of-delivery photos once a package is dropped off, a task that previously required pulling out a mobile device.
As for the physical design, the glasses support both prescription lenses and light-adjusting lenses that adapt to ambient brightness. A small controller tucked into the delivery vest lets drivers operate the interface, and the hardware includes a swappable battery rated for all-day use, plus a dedicated emergency button for summoning help.
ZDNET's Sabrina Ortiz tried the smart delivery glasses at the event. The in-lens display rendered text as clearly as in Amazon's demo images, and the pocket-sized controller made it easy to move between screens, from address details and customer notes to package verification and, finally, walking navigation.
Comfort-wise, Ortiz found the glasses sat naturally on her nose bridge during the five-minute demo, with the weight of both the glasses and the battery pack well distributed thanks to the vest integration. That matters for workers expected to wear the equipment for hours at a time.
According to Amazon, the design process was collaborative: hundreds of delivery associates tested early prototypes, and their feedback drove improvements in comfort, usability, and functionality.
Ortiz also spoke with a San Francisco-area delivery associate who uses the glasses regularly. Backing up Amazon's claims, he said the device feels lightweight and convenient on daily shifts and that the battery easily lasts a full workday.
Looking ahead, Amazon says future versions could detect in real time when a driver has delivered a package to the wrong address, or alert the driver if an animal or other hazard is present in the yard.
Amazon has not said when customers might start seeing drivers wearing the glasses, or where the initial rollout will take place. Still, the unveiling marks a notable step in the company's effort to modernize last-mile delivery with wearable AI.
Disclosure: Amazon covered the travel costs for Sabrina Ortiz's trip to San Francisco to attend the "Delivering the Future" event, a common industry practice for covering events held far away. ZDNET's judgments and conclusions remain entirely independent.
Source: https://www.zdnet.com/article/your-amazon-driver-may-soon-deliver-with-these-smart-glasses-on-why-thats-good-news/