
Apple is working on three new wearable devices, according to a new report from Bloomberg’s Mark Gurman. The company is allegedly developing smart glasses, AirPods with cameras and “expanded AI capabilities”, and “a pendant that can be pinned to a shirt or worn as a necklace”.
All three are being “built around the Siri digital assistant”, which will “rely on visual context to carry out actions”. Every one of them will pair with an iPhone and have cameras, but the AirPods and the pendant will get lower-resolution cameras meant only to feed the AI, not to actually take photos or videos.
The glasses, on the other hand, will be “more upscale and feature-rich”, with better cameras. Apple CEO Tim Cook revealed at an all-hands meeting with employees earlier this month that the company is working on new categories of products enabled by AI. Cook said “the world is changing fast” and that’s why Apple is investing in new technology.

Before it launches entirely new product categories like these, maybe Apple should finally ship the comically delayed, smarter AI-powered Siri? Even for that it needed Google’s help, so it’s unclear how good the glasses, pendant, and new AirPods will actually be.
And speaking of the pendant, a similar idea was tried a few years ago by Humane with disastrous results, and the obvious question remains: why wouldn’t you just use your phone? Or your smartwatch? Unlike the Humane AI Pin, Apple’s pendant won’t be a standalone device that aims to replace your iPhone; it will be an iPhone accessory, serving as an always-on camera for the phone, with a mic for talking to Siri.
Some Apple employees allegedly describe it as the “eyes and ears” of the iPhone, which definitely doesn’t sound creepy at all. The pendant won’t even have as much processing power as an Apple Watch, being “closer in computing power to AirPods than an Apple Watch”, which once again raises the question of why this is being developed at all. Anyway, it’s coming next year if it doesn’t get cancelled in the meantime.
Apple’s smart glasses are apparently coming next year too, and won’t have a screen. But they will have speakers, mics, and cameras. Speaking of which, Apple reportedly wants to differentiate them from Meta’s Ray-Bans by giving them better cameras and better build quality. The frames are being developed in-house by Apple “in a variety of sizes and colors”.
They will have two cameras, one for “high-resolution imagery” and one dedicated to computer vision – described as “a technology similar to what’s used in the Vision Pro”. The latter will “give the device environmental context, helping it more accurately interpret surroundings and measure distance between objects”.
That’s because the goal of these glasses is presumably to function as an all-day AI companion, seeing what you’re doing in real time (until the battery runs out). You’ll be able to look at an object and ask what it is and get assistance with everyday tasks. The glasses will also add event data from a poster directly to a calendar, and create context-aware reminders, like prompting you to grab an item when you’re looking at a specific shelf in a grocery store.
For navigation, Siri will reference real-world landmarks, telling you to walk past a described vehicle or building before making a turn. And all of this is only slightly more convenient than on your phone, which means Apple really has to nail the design of these glasses for them to be successful.
All of these products seem like solutions in search of a problem, or, to put it bluntly, like Apple desperately trying to find the next big thing, since the Vision Pro definitely wasn’t it.