How To Enable and Use Eye Tracking on iPhone and iPad

How I finally got eye tracking working on my iPhone and iPad after a long struggle

So if you’re like me, you might’ve looked everywhere in Settings and still not found the eye tracking options, or found them but couldn’t get them to activate. Honestly, I got stuck because Apple’s documentation is kind of sparse on the nitty-gritty details. After some trial and error, here’s the deal: the feature requires iOS/iPadOS 18 or later, and only fairly recent hardware. On the iPad side that means models like the 10th-gen iPad, the 6th-gen iPad mini, and recent iPad Air and iPad Pro models (the M1-based iPad Air 5, for instance). On the iPhone side it’s the iPhone 12 and up, including the iPhone 16 series, plus the 3rd-gen SE. Just make sure your software is current: check Settings > General > Software Update to be safe, because otherwise you won’t see the eye tracking options at all, trust me.
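A side note for developers: Eye Tracking is a system accessibility feature with no public API to query, so about the only thing an app can do is gate any eye-tracking-related hints on the OS version. A minimal Swift sketch (the function name is mine, not Apple’s):

```swift
import Foundation

// Minimal sketch: there's no public Eye Tracking API, so the most an app
// can do is confirm the OS is new enough before, say, mentioning the
// feature in its own accessibility tips.
func osSupportsEyeTracking() -> Bool {
    if #available(iOS 18.0, *) {
        return true   // iOS/iPadOS 18+: the feature can exist on this device
    }
    return false      // older OS: Eye Tracking won't appear in Settings
}
```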

Getting it set up—what I missed the first few times

This part is more of a “you gotta do it right” kind of thing. On my older ASUS tablet I’d assumed camera quality or lighting was the weak link, and it turns out those same factors are super important on iOS too. For starters, your device needs to sit on a stable surface: about a foot from your face for an iPhone, and about 1.5 feet for an iPad. The front camera needs a clear shot of your face, and the lighting needs to be decent. Kind of obvious, but poor lighting or glare from a bright window totally throws it off. I spent ages trying different setups, so don’t get discouraged if it doesn’t work immediately.
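If you want a programmatic sanity check that the front camera can track a face at all, ARKit exposes one. This is only a rough proxy, not how Apple gates Eye Tracking (that’s tied to the device model and iOS 18), but it’s a quick way to confirm the face-tracking hardware is present:

```swift
import ARKit

// Rough proxy check: ARKit face tracking uses the same front camera that
// Eye Tracking watches you through. "Supported" here does NOT guarantee the
// Eye Tracking feature exists (that also needs iOS 18 + a recent model),
// but "unsupported" is a strong hint the hardware can't do gaze at all.
if ARFaceTrackingConfiguration.isSupported {
    print("Front face-tracking camera available")
} else {
    print("This device can't do ARKit face tracking")
}
```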

The calibration process (which took me a few tries)

Dive into Settings > Accessibility > Eye Tracking; if you don’t see “Eye Tracking” right away, look under Physical and Motor. Make sure to toggle Allow Eye Tracking on. Sometimes it’s grayed out, usually because the device isn’t supported or the software isn’t current. When you start calibration, a dot moves around the screen, a bit like an old Apple accessibility tutorial. You (or a helper) tap to begin, then follow the dot with your eyes. The tricky part? Keeping your head still and moving only your eyes. It feels weird at first, and calibration can take a minute or two if the lighting isn’t great or you’re not centered in the frame. The calibration data itself is stored somewhere on the device, but iOS doesn’t expose it, so there’s nothing to snoop on or hand-edit on a stock iPhone. Once calibration’s done, a cursor should appear where your gaze is focused, and it *sort of* works after a couple of attempts.
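Apple doesn’t document what the calibration actually computes, but conceptually it’s fitting a map from raw eye measurements to screen coordinates using the dots you just followed. Here’s a toy Swift sketch of that idea (all the names are mine, and the real system is certainly more sophisticated than a per-axis linear fit):

```swift
import UIKit

// Toy model of gaze calibration: each calibration dot pairs the tracker's
// raw estimate with the dot's true screen position; a simple least-squares
// fit per axis then corrects future estimates.
struct CalibrationSample {
    let raw: CGPoint      // where the tracker thought you were looking
    let target: CGPoint   // where the calibration dot actually was
}

struct AxisMap {
    let scale: CGFloat
    let offset: CGFloat
    func apply(_ v: CGFloat) -> CGFloat { scale * v + offset }
}

// Ordinary least squares for y = scale * x + offset.
// (Assumes the dots span the screen, so the fit is well-posed.)
func fitAxis(_ pairs: [(x: CGFloat, y: CGFloat)]) -> AxisMap {
    let n = CGFloat(pairs.count)
    let sumX = pairs.reduce(0) { $0 + $1.x }
    let sumY = pairs.reduce(0) { $0 + $1.y }
    let sumXY = pairs.reduce(0) { $0 + $1.x * $1.y }
    let sumXX = pairs.reduce(0) { $0 + $1.x * $1.x }
    let scale = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX)
    return AxisMap(scale: scale, offset: (sumY - scale * sumX) / n)
}

func calibrate(_ samples: [CalibrationSample]) -> (x: AxisMap, y: AxisMap) {
    return (fitAxis(samples.map { (x: $0.raw.x, y: $0.target.x) }),
            fitAxis(samples.map { (x: $0.raw.y, y: $0.target.y) }))
}
```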

How to really make it usable—initial impressions

With calibration done, you get a cursor that follows your eyes. Honestly, the first time I saw it move around I was pretty impressed, but also a little spooked by how finicky it was. Environmental factors matter a lot: a lighting change or even a small head shift can make the cursor drift or get confused. The system outlines objects and buttons as you look at them, and there’s a dwell timer, so you can “click” by staring at something long enough. You can set the dwell duration in Settings > Accessibility > Eye Tracking > Dwell Time. I found around 1.2 seconds was the sweet spot for me, but some people prefer shorter or longer depending on how steady their gaze is.
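To make the dwell behavior concrete, here’s roughly the logic a dwell timer implements, as a hedged Swift sketch (my own toy class, not Apple’s code): if the gaze stays inside a small radius for the full dwell time, that counts as a tap.

```swift
import UIKit

// Toy dwell-to-click state machine (not Apple's implementation): a tap fires
// when the gaze point stays within `radius` of where it settled for at least
// `dwellTime` seconds; any larger movement restarts the clock.
final class DwellDetector {
    var dwellTime: TimeInterval = 1.2   // the duration I settled on
    var radius: CGFloat = 30            // hypothetical tolerance, in points

    private var anchor: CGPoint?
    private var anchorTime: TimeInterval = 0

    /// Feed gaze samples in; returns a tap location when a dwell completes.
    func process(gaze: CGPoint, at time: TimeInterval) -> CGPoint? {
        if let a = anchor, hypot(gaze.x - a.x, gaze.y - a.y) <= radius {
            if time - anchorTime >= dwellTime {
                anchor = nil            // reset so one dwell fires one tap
                return a
            }
        } else {
            anchor = gaze               // gaze moved: restart the clock here
            anchorTime = time
        }
        return nil
    }
}
```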

Performing actions & controlling your device

This is where it got more interesting. I kept trying to click things just by staring, and sometimes it worked, sometimes it didn’t. To select a button or item, look at it until it highlights, then hold your gaze through the dwell timer; it’s almost like a mouse hover followed by a click. To tap a button, for example, fix your gaze on it for a second or so and the system activates it. Typing and scrolling work the same way: gaze at the keys, then look at the send or scroll controls to activate them. Honestly, it takes some getting used to, and I’d recommend starting with simple tasks to get a feel for dwell timings and how precise your eye movements need to be. If you’re worried about accidental clicks, a longer dwell time or an alternate input method as a backup can help.
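Part of why precision is hard is raw gaze jitter: your eyes make tiny involuntary movements, so any tracker’s output wobbles. Systems like this almost certainly smooth the signal somehow; here’s the simplest possible version of that idea in Swift (my sketch, with a made-up alpha value):

```swift
import UIKit

// Minimal exponential smoothing for a jittery gaze point. Real trackers use
// fancier filters, but the trade-off is the same: a little cursor lag in
// exchange for steadiness, which makes dwell-clicking far less accidental.
struct GazeSmoother {
    var alpha: CGFloat = 0.3   // 0 = frozen cursor, 1 = raw jitter
    private var last: CGPoint?

    mutating func smooth(_ raw: CGPoint) -> CGPoint {
        guard let prev = last else {
            last = raw          // first sample: nothing to smooth against
            return raw
        }
        let next = CGPoint(x: prev.x + alpha * (raw.x - prev.x),
                           y: prev.y + alpha * (raw.y - prev.y))
        last = next
        return next
    }
}
```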

When things go sideways—tips for troubleshooting calibration drift

Calibration isn’t always perfect. If it gets out of sync or starts acting wonky, you can recalibrate pretty easily: hold your gaze at the top-left corner of the screen for a couple of seconds, or go back into Settings > Accessibility > Eye Tracking and tap “Recalibrate.” Often it’s just a lighting change or head movement that messed things up, and repositioning the device or improving the lighting makes a big difference. Recalibrating also replaces the stored calibration data (which, again, iOS keeps to itself; there’s no file you can go clear out by hand), so a fresh run is the fix when accuracy drifts off. It’s not always perfect out of the box, but with some patience it gets better.
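If you’re curious what “out of sync” actually means: drift is just the gap between where you’re looking and where the system thinks you’re looking. A hypothetical check (nothing like this is exposed by iOS; the names and threshold are mine) would show a few known targets and measure the average miss:

```swift
import UIKit

// Hypothetical drift check: show the user a few known targets, record where
// the tracker claims they looked, and suggest recalibrating once the average
// miss distance exceeds a threshold (60 points here, picked arbitrarily).
func needsRecalibration(targets: [CGPoint],
                        reported: [CGPoint],
                        threshold: CGFloat = 60) -> Bool {
    guard targets.count == reported.count, !targets.isEmpty else { return false }
    var total: CGFloat = 0
    for (target, seen) in zip(targets, reported) {
        total += hypot(target.x - seen.x, target.y - seen.y)
    }
    return total / CGFloat(targets.count) > threshold
}
```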

Final thoughts—worth the hassle?

Frankly, after finally working through all the quirks and calibration issues, I think eye tracking has a lot of potential, for accessibility especially, but also just for fun. It’s early days and the tech isn’t flawless (there are false triggers, drift, and environmental quirks), but the core idea is pretty cool. It definitely takes patience, and calibration can be finicky if the lighting isn’t ideal or your head moves a lot. The learning curve is real, but once you get it dialed in, it really can be useful. Just remember: support varies, so your device needs to be compatible and fully updated. And note that toggling the feature off and on, or redoing calibration, sometimes needs a restart to fully take effect.

Hope this helped — it took me way too long to figure it out myself. Double-check your device compatibility, keep everything updated, and spend a little time dialing in calibration. Those tweaks make all the difference. Anyway, good luck, and I hope your tracking experience is smoother than mine was!