Apple is accelerating work on a new generation of AI-powered wearables, according to Bloomberg's Mark Gurman. The devices span smart glasses, a camera-equipped pendant, and AirPods with visual sensors. Together, they signal a strategic shift as Apple looks to stay competitive in an AI landscape increasingly defined by context-aware assistants.
This is not a single product push. It is a platform move.
Smart Glasses: Apple’s Long Game
At the center of the effort are smart glasses designed to bring AI into everyday life without a screen-first experience. The glasses are expected to include cameras, microphones, and speakers, enabling Siri to understand what a user is seeing and hearing in real time.
Unlike the Apple Vision Pro, these glasses are positioned as lightweight, always-on companions rather than an immersive headset. The goal appears to be subtlety. No displays. No overt interface. Just passive awareness feeding Apple's AI systems.
Bloomberg reports that production could begin as early as late 2026, with a broader consumer launch likely in 2027.
The AI Pendant: Context Without the Phone
Apple is also exploring a pendant-style wearable roughly the size of a tag, designed to be worn around the neck or clipped to clothing. The device would feature a camera and microphone, capturing environmental data and sending it back to a paired iPhone for processing.
This approach lets Apple extend AI perception without displacing the smartphone as the primary computing hub. It also reflects a broader pattern: Apple is distributing sensors while keeping intelligence centralized.
The pendant remains in early development and has not been finalized for release.
AirPods With Cameras: Seeing Through Sound
Perhaps the most surprising development is Apple’s work on AirPods equipped with small outward-facing cameras. These are not intended for photography or video recording. Instead, the cameras would provide spatial and visual awareness, allowing Siri to better understand a user’s surroundings.
Of the three product categories, these AirPods are reportedly the closest to market. Bloomberg notes they could arrive sooner than the glasses or pendant, positioning them as Apple’s first meaningful step toward wearable, multimodal AI.
Why Apple Is Taking This Route
Apple’s AI strategy has faced growing scrutiny as competitors push ahead with assistants that can process text, images, and audio seamlessly. Siri, by contrast, has struggled to keep pace.
Rather than chasing chatbot dominance, Apple appears to be reframing the problem. If AI can understand the world visually and spatially, then intelligence becomes ambient rather than conversational.
This plays directly to Apple’s strengths. Custom silicon. Tight hardware-software integration. Control over privacy architecture. Wearables allow Apple to advance AI capabilities without fully conceding the narrative around Siri’s limitations.
Privacy and Perception Risks
The expansion of always-on cameras raises inevitable privacy concerns. Apple has long positioned itself as a privacy-first company, emphasizing on-device processing and data minimization.
Still, the optics are challenging. Devices that continuously observe the world will test user trust and social norms. Apple is betting that its brand credibility can carry that burden.
The Bigger Picture
What Mark Gurman’s reporting makes clear is that Apple is not experimenting at the edges. It is laying the groundwork for a future where AI is not something you open, but something that quietly accompanies you.
This is a calculated shift. Less chat window. More context engine.
Whether consumers embrace that vision will determine whether these products redefine personal computing or become cautionary footnotes in Apple’s AI journey.
Either way, Apple has made its intent clear. AI is no longer confined to the screen. It is moving into the physical world.