Control iPhone with the movement of your eyes
(support.apple.com) | 97 points by 9woc | 11 November 2024 | 60 comments

Comments

It is amazing how well iOS supports these accessibility features, yet doesn't consider blocking video autoplay on websites, something that is incredibly distracting for people with ADHD.
There are some downright neat things in the iOS accessibility options. Example: you can set it so that a triple tap on the rear of the phone turns the flashlight on/off. People think it’s witchcraft how fast I can pull the phone out and switch it on without looking down.
Amazing how well eye tracking works on my phone (15 Pro).
Unfortunately, there seems to be no way to press buttons by blinking, only by "dwelling" on an item for a few seconds, which makes using my phone feel quite hectic and prone to inadvertent inputs.
The cursor tracks OK, but the implementation seems to just stand in for a low-level pointing device. I.e., it's very precise but jittery: all attribution and no salience.
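To be fair, gaze pointers are usually run through some low-pass filtering before being shown; a minimal sketch of the idea in Swift (plain exponential smoothing, not what Apple actually ships):

    import CoreGraphics

    // Minimal exponential smoothing for a gaze pointer.
    // alpha near 0 = heavy smoothing (laggy), alpha near 1 = raw (jittery).
    struct GazeSmoother {
        var alpha: CGFloat = 0.15
        private var last: CGPoint? = nil

        mutating func smooth(_ raw: CGPoint) -> CGPoint {
            guard let prev = last else { last = raw; return raw }
            let next = CGPoint(x: prev.x + alpha * (raw.x - prev.x),
                               y: prev.y + alpha * (raw.y - prev.y))
            last = next
            return next
        }
    }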
Also, maybe it should be modal, like Siri. E.g., dwell away to silence it, and then dwell on the leading corner to say "Hey, listen..."
Holding the phone seemed to cause problems ("you're not holding it right"). Probably best with fixed positioning, e.g., attached to a screen (like Continuity Camera), assuming you're lying down with a fixed head position.
Tracking needs a magnetic (gravity well?) behavior, where the pointer is drawn to UI features (and the user can retract by resisting). Salience weighting could make it quite useful.
It's possible that weighting could piggyback on existing accessibility metadata, or it might require a different application programming model.
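A rough sketch of what that gravity-well behavior could look like, with the salience weights standing in for whatever accessibility metadata would feed them (all names here are made up for illustration, not an Apple API):

    import Foundation
    import CoreGraphics

    // Hypothetical target: a frame plus a salience weight that could be
    // derived from accessibility metadata (traits, labels, recent use, ...).
    struct Target {
        let frame: CGRect
        let salience: CGFloat   // 0...1, invented weighting
    }

    // Pull the gaze pointer toward the most "attractive" nearby target:
    // attraction falls off with distance and scales with salience.
    func snapped(_ p: CGPoint, targets: [Target], radius: CGFloat = 80) -> CGPoint {
        var bestCenter = p
        var bestPull: CGFloat = 0
        for t in targets {
            let c = CGPoint(x: t.frame.midX, y: t.frame.midY)
            let d = hypot(p.x - c.x, p.y - c.y)
            let pull = max(0, 1 - d / radius) * t.salience
            if pull > bestPull { bestPull = pull; bestCenter = c }
        }
        // Blend toward the strongest target; the raw gaze keeps (1 - pull)
        // of its influence, so the user can still "resist" the pull.
        return CGPoint(x: p.x + (bestCenter.x - p.x) * bestPull,
                       y: p.y + (bestCenter.y - p.y) * bestPull)
    }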
Similarly, it would be interesting to combine it with voice input that prioritized things near where you are looking.
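For example, a toy resolver that prefers spoken matches closest to the gaze point (labels and types invented for illustration):

    import Foundation
    import CoreGraphics

    // "Voice + gaze": given a recognized word and the current gaze point,
    // prefer on-screen targets that both match the word and sit nearest
    // to where the user is looking.
    struct LabeledTarget {
        let label: String      // e.g. an accessibility label
        let center: CGPoint
    }

    func resolve(spoken: String, gaze: CGPoint, targets: [LabeledTarget]) -> LabeledTarget? {
        targets
            .filter { $0.label.localizedCaseInsensitiveContains(spoken) }
            .min { a, b in
                hypot(a.center.x - gaze.x, a.center.y - gaze.y) <
                    hypot(b.center.x - gaze.x, b.center.y - gaze.y)
            }
    }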
I'm willing to try, and eager to see how it gets integrated with other features.
I too tried this for a short while and was not impressed. However, I can’t help but wonder how ‘good’ I could get at using it if I invested more time in it. Would love to hear from someone who truly uses this tool regularly. Flying a plane is also quite cumbersome the first 15 minutes.
Eye tracking is very impressive technology, and foveated rendering is an excellent application, but eye control is poor UX. Yes, I'm saying the emperor has no clothes.
Imagine having a jittery cursor in the center of your vision at all times. If I had a mouse or trackpad that behaved like that, it would be replaced immediately, but that's Apple's eye control. Imagine scrolling a page where everywhere you glance there's a spotlighted/popup control or ad. That's Apple's eye control using dwell and snap-to-item.
It's telling that the standout apps/use cases for Vision Pro are video watching and Mac Virtual Display, both of which rely very little on eye control during use. Trying to browse the web with Apple's eye control is a clear regression compared to touch/keyboard/mouse/trackpad; it's only useful as an accessibility feature.
I played around with this a bit. It doesn't work amazingly well on my iPhone model (SE 3rd gen), but it's pretty cool. I don't think there's an API to use it in apps yet, but I would love to make an eye-controlled mobile game.
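There's no public hook into the system Eye Tracking feature as far as I know, but ARKit face tracking does expose a rough gaze estimate and per-expression blend shapes on supported devices, so an untested sketch of a prototype might look something like:

    import ARKit

    // Rough gaze-input prototype using ARKit face tracking (not the system
    // Eye Tracking accessibility feature, which isn't exposed to apps).
    final class GazeInput: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
            // lookAtPoint is a rough gaze target in face-anchor space;
            // projecting it onto screen coordinates is left to the game layer.
            let gaze = face.lookAtPoint
            // Blend shapes give per-expression coefficients, e.g. a blink:
            let blink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            print("gaze:", gaze, "blink:", blink)
        }
    }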
If we were to expand on the face-control scheme further, what face gestures would be used for clicking/tapping? For click-and-hold? What would be least exhausting for the face muscles, and what would look least ridiculous?
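One way to prototype that is thresholding ARKit's blend-shape coefficients with a bit of hysteresis so a twitch doesn't double-fire; a sketch (the thresholds and the jawOpen choice are arbitrary):

    import ARKit

    // Turn a facial expression into a discrete "click": pick a blend-shape
    // coefficient (blink, jaw open, brow raise, ...) and apply hysteresis
    // so noisy values don't cause repeated taps.
    struct GestureClicker {
        var pressThreshold: Float = 0.8
        var releaseThreshold: Float = 0.4
        private(set) var isPressed = false

        // Returns true on the frame the "click" fires.
        mutating func update(_ shapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) -> Bool {
            let value = shapes[.jawOpen]?.floatValue ?? 0   // or .eyeBlinkRight, .browInnerUp, ...
            if !isPressed, value > pressThreshold {
                isPressed = true
                return true
            }
            if isPressed, value < releaseThreshold {
                isPressed = false
            }
            return false
        }
    }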
I see lots of people walking and scrolling on their phone. Every once in a while they look up and then continue. What will happen when you control it with your eyes and you look up? Will it scroll?
How long until this is turned on silently across all devices and adtech folks, native mobile apps, and website operators are able to use your eye tracking data for A/B testing?
The selfie normalized surveillance. Social media normalized "transparency" (i.e., posting every little dumb detail about yourself). Advertisements have invaded all aspects of life (TV, radio, streaming services, ads in your taskbar).