AirPods and Apple Watch Might Get Cameras by 2027 – Here’s What That Actually Means

Written By Eric Sandler

Okay… this is wild.


Apple is reportedly working on putting cameras into your AirPods and Apple Watch, and if the latest Bloomberg report holds up, we could actually see these things launch around 2027.

Yep, your earbuds and wristwatch might soon get eyes of their own.

Let’s break down what we know, what Apple’s planning, and why this could be a lot more about AI and spatial computing than selfies.

The Chips Behind the Magic: “Nevis” and “Glennie”

According to the report, Apple has already started working on two custom chips:

“Nevis” for the camera-equipped Apple Watch

“Glennie” for future versions of the AirPods

Both are apparently targeted to be ready by 2027, and if all goes smoothly, we might even see products launch in the same year.

So yeah, this isn’t just a sketch on a whiteboard in Cupertino; the silicon work has reportedly already started.

Why Would AirPods Need a Camera?

A mockup of what AirPods with cameras could look like.

Now before you start imagining FaceTime calls from your ears (no thanks), that’s not what this is about.

Apple’s thinking infrared cameras—and here’s why that’s cool:

Spatial audio enhancement: Pair them with Vision Pro, and now your AirPods might know where your head and hands are. Boom: instant depth-aware audio placement.

Gesture control: Imagine skipping a song or adjusting volume just by tapping the air. We’re talking full Tony Stark vibes here.

AI-powered context: Your AirPods could effectively “see” what’s around you and react accordingly, like lowering the volume when someone walks up or adjusting noise cancellation based on movement.

Basically, AirPods won’t just play sound. They’ll understand what you’re doing and where you are.
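
And here’s the thing: the head-tracking half of this already exists. CoreMotion’s CMHeadphoneMotionManager streams the listener’s head pose from today’s AirPods. Below is a minimal Swift sketch of that piece; the rumored infrared cameras (and any hand or scene tracking they’d enable) are unannounced, so they only show up as comments.

```swift
import CoreMotion

// Minimal sketch: read the listener's head pose from AirPods using the
// real CMHeadphoneMotionManager API (iOS 14+). The rumored infrared
// cameras would presumably layer hand and scene data on top of this.
let headTracker = CMHeadphoneMotionManager()   // keep a strong reference

func startHeadTracking() {
    guard headTracker.isDeviceMotionAvailable else { return }
    headTracker.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Yaw/pitch/roll in radians. A spatial mixer such as
        // AVAudioEnvironmentNode could use these to keep sounds
        // anchored in space as the head turns.
        print("yaw \(attitude.yaw), pitch \(attitude.pitch), roll \(attitude.roll)")
    }
}
```

Everything past head pose, the hands, the depth, the scene, is exactly the gap the rumored cameras would fill.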

And the Apple Watch? Visual Intelligence FTW

Apple’s also exploring cameras built into the Apple Watch display itself, or placed near the Digital Crown on a future Apple Watch Ultra.

But again—this isn’t about snapping wrist selfies.

These cameras are reportedly for:

Visual Intelligence: Think object recognition, real-world navigation, and instant context-aware notifications.

Better AI interactions: Your watch could see where you are and adapt on the fly, giving you way more useful information when you need it.

So yeah, it’s not about Instagram. It’s about giving your devices eyes to see the world like you do.
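
Curious what “visual intelligence” might look like in code? Apple’s existing Vision framework already does on-device image classification today. Here’s a hedged Swift sketch that assumes something (the Watch, or a paired iPhone) hands you a camera frame; the 2027 hardware and its actual APIs are unannounced.

```swift
import CoreGraphics
import Vision

// A sketch of on-device "visual intelligence": classify what a camera
// frame contains using Apple's existing Vision framework. Whatever
// pipeline a camera-equipped Watch ships with is unannounced; this just
// illustrates the kind of object recognition the report points at.
func describeScene(in frame: CGImage) {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    do {
        try handler.perform([request])
        // Keep only labels the model is reasonably confident about.
        let labels = (request.results ?? [])
            .filter { $0.confidence > 0.5 }
            .map(\.identifier)
        print("Scene contains: \(labels.joined(separator: ", "))")
    } catch {
        print("Vision request failed: \(error)")
    }
}
```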

The Real Star Here? AI + Spatial Computing

What’s really going on is this: Apple is setting up the AirPods and Apple Watch to act as sensors, feeding real-world data to Apple Intelligence (Apple’s AI platform) and to devices like the Vision Pro.

Think of it like this:

Your AirPods know your position and movement

Your Watch knows what you’re seeing or interacting with

Your AI stitches it all together into a smarter, more responsive digital world

Suddenly, your daily walk, gym session, or commute could become a fully interactive experience with real-time context, safety alerts, and intuitive feedback.
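
To make the stitching idea concrete, here’s a purely hypothetical Swift sketch. None of these types exist in any Apple SDK; Observation and ContextEngine are made-up names for how a pipeline like this could be shaped.

```swift
import Foundation

// Hypothetical sketch of the "stitching" idea: each device reports
// observations, and one context engine merges them. These are NOT
// Apple APIs; the names are invented for illustration only.
struct Observation {
    let source: String     // "AirPods", "Watch", ...
    let kind: String       // "headPose", "sceneLabel", ...
    let value: String
    let timestamp: Date
}

final class ContextEngine {
    private var window: [Observation] = []

    func ingest(_ observation: Observation) {
        window.append(observation)
        // Keep a ten-second sliding window of recent context.
        let cutoff = Date().addingTimeInterval(-10)
        window.removeAll { $0.timestamp < cutoff }
    }

    // Example fusion rule: duck the volume if the Watch recently
    // spotted a person approaching the listener.
    func shouldLowerVolume() -> Bool {
        window.contains { $0.kind == "sceneLabel" && $0.value == "person" }
    }
}
```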

Final Thoughts

It’s still a few years off, but this is one of the boldest shifts we’ve seen Apple hint at in a while. Cameras in your ears and on your wrist may sound crazy now, but remember when we thought Face ID or AirPods themselves were weird?

Apple isn’t trying to make the Watch or AirPods “better” in the traditional sense.
They’re trying to make them see, and that changes everything.
