Apple is reportedly accelerating work on three new wearable devices: smart glasses, a clip-on pendant and camera-equipped AirPods, as part of a broader push into artificial intelligence-powered hardware. According to a Bloomberg report citing people familiar with the matter, the products are being built around Siri and designed to interpret visual context using integrated cameras. Apple has not confirmed the plans, but the development signals the company's attempt to reposition itself in the fast-evolving AI hardware race, where companies such as Meta Platforms Inc. and OpenAI are already investing in next-generation wearables.
Apple’s most advanced project is a pair of smart glasses, internally code-named N50. Unlike mixed reality headsets such as the Vision Pro, these glasses are not expected to include a display. Instead, they would rely on speakers, microphones and two camera systems: one for high-resolution capture and another dedicated to computer vision.
The glasses will connect to the iPhone and use Siri to process real-world input. Users could look at an object and ask what it is, scan printed text to add events to a calendar, or receive navigation cues based on visible landmarks.
Production could begin as early as December 2026, and a public release is possible in 2027.
Apple is said to be focusing on premium build quality and camera performance to differentiate the glasses from rival products such as Meta’s Ray-Ban smart glasses.
Apple is also said to be developing a smaller wearable device that can be clipped onto clothing or worn as a necklace. Internally described by some employees as the ‘eyes and ears’ of the iPhone, the device would include a camera and microphone to provide always-on contextual input to Siri.
The Apple pendant would differ from the now-defunct Humane AI Pin: it would not include a display or projector, relying instead on the iPhone for processing.
The product remains in early development and could still be cancelled. If approved, it may launch as early as next year.
Finally, the report adds that Apple is preparing upgraded AirPods with camera sensors to enhance AI functionality. These earbuds are expected to assist Siri with spatial awareness and contextual understanding, rather than serve as photography tools.
Apple has already introduced AI features such as live translation in recent AirPods models. The next iteration would deepen integration between audio input, environmental awareness and on-device intelligence. Its launch could happen as early as this year.
Like the broader industry, Apple is accelerating its AI hardware plans. While iPhone sales remain strong, AI is expected to move more interactions away from screens and into ambient devices. Meta has already found early traction with its AI glasses, and OpenAI is reportedly working on new AI-native hardware. Unlike those brands, however, Apple can leverage its ecosystem control to keep users locked into its services.
Apple may also proceed more cautiously: the Vision Pro headset, despite its technical strengths, struggled to achieve mass-market adoption due to pricing and use-case limitations. Apple is believed to have scrapped a lighter, cheaper headset variant to focus resources on smart glasses instead.
Let’s hope Siri and Apple Intelligence are up to the task, that the company has found meaningful everyday use cases, and that it has addressed the privacy concerns. We may officially learn more about the Siri revamp later this year, and get hints about Apple’s hardware plans at the next WWDC.
Keep reading Digit.in for more such updates.