
Apple is working on AirPods with built-in cameras to enhance its Apple Intelligence capabilities, according to Bloomberg's Mark Gurman in his Sunday newsletter.
The cameras would collect environmental data to support AI features. For example, the system could scan local signs to determine location or use storefront imagery to provide directions to nearby shops. This would extend Apple's Visual Intelligence feature beyond the iPhone 16's camera-based implementation.
Gurman reports that Apple is "actively developing" the product but offers few additional details.
Camera-equipped AirPods have surfaced in multiple rumors since October, with a potential release estimated at two to three years away. The cameras might use infrared sensors for depth mapping rather than full-color video, which could improve power efficiency while still supporting navigation features.
Adding cameras to AirPods offers several advantages over smart glasses. Earbud-mounted cameras could cover a wider field of view than the forward-facing cameras typical of smart glasses. Weight distribution is also less of a problem in earbuds than in glasses, which must stay lightweight to remain comfortable during extended wear.
Apple has already integrated additional sensors into audio products, as seen in the Beats Powerbeats Pro 2, which includes heart rate sensors in each earbud that transmit data to an iPhone.
Camera-equipped AirPods could deliver much of the functionality of smart glasses without requiring users to wear glasses or replace their existing prescription frames. This approach might appeal to people who prefer not to wear glasses or who recall the negative "glasshole" backlash against Google Glass and other early smart eyewear.
If Apple implements the technology in a way that resonates with consumers, camera-equipped AirPods could become a significant alternative to glasses-style wearables in the augmented reality market.