How Well Does Eye Tracking on iPadOS 18 Work? Hands-On Demo!

The Eye Tracking Feature Still Has a Ways to Go

As I sat down with my iPad Pro, I couldn't help but feel a sense of excitement and curiosity about one of its most innovative features: eye tracking. This technology lets users control the device with just a glance, and it promises to change the way we interact with our devices. However, in this early beta of iPadOS 18, the eye tracking feature still has some major kinks.

At first glance, the feature seemed to work as advertised. I would look at an icon or application, and it would spring into action. But as I dug deeper, I realized the experience wasn't nearly as seamless as Apple suggests. The camera kept losing track of my eyes, and I was repeatedly prompted to recalibrate. It felt like the system was still learning how to work with me.

One of the most frustrating aspects of the eye tracking feature is its limited usability on the iPad Pro. The larger screen size seems to make it harder for the camera to accurately track my gaze, resulting in a more erratic experience. On the other hand, I found that the feature worked much better on the smaller screen of my iPhone. Even there, though, the eye tracking sometimes failed to register my gaze, forcing me to recalibrate yet again.

The lack of precision is likely down to the camera system's limited ability to detect subtle eye movements. Unlike the Vision Pro, which uses an array of cameras and sensors to track every movement, the iPad Pro relies on a single front-facing camera and its Face ID sensors. That results in a less accurate experience, especially for complex interactions.
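One way systems cope with a noisy single-camera gaze signal is smoothing, and iPadOS does expose a smoothing setting for the eye tracking cursor. As a purely illustrative sketch (this is not Apple's implementation, just the general idea behind such a slider), an exponential moving average shows the trade-off: heavier smoothing steadies the cursor but makes it lag behind your eyes.

```python
def smooth_gaze(samples, alpha=0.3):
    """Exponential moving average over (x, y) gaze points.
    Lower alpha = heavier smoothing = steadier but slower cursor."""
    if not samples:
        return []
    sx, sy = samples[0]
    out = [(sx, sy)]
    for x, y in samples[1:]:
        # Blend each new reading with the running estimate
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

# Jittery raw readings around a fixation near (100, 200)
raw = [(100, 200), (108, 195), (94, 206), (103, 199)]
smoothed = smooth_gaze(raw, alpha=0.3)
```

With a lower `alpha`, the smoothed points hug the fixation point more tightly, which is roughly the behavior you feel when you drag the smoothing setting up.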

One of the most interesting aspects of the eye tracking feature is its potential for accessibility. For users with disabilities, this technology promises to open up new possibilities for interaction and communication. However, as I experienced firsthand, there's still much work to be done to make this feature truly usable.

Despite the current limitations, I'm excited to see where this technology will go in future iterations of iOS 18. Apple has shown a commitment to accessibility, and it's clear that they're pushing the boundaries of what's possible with their devices. The eye tracking feature is just one example of this innovation, and I'm eager to see how it will continue to evolve.

In contrast to my experience elsewhere on the iPad Pro, I was pleasantly surprised by how well eye tracking worked on the home screen. As I looked at various applications and icons, the system tracked my gaze accurately and opened the corresponding app with ease, though that may simply be because I had calibrated the feature in that same context.
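When the cursor does settle on something clickable, the actual "click" comes from dwell control: look at a target long enough and it activates. Here is a minimal, hypothetical sketch of that logic; the function name, timings, and sampling model are mine for illustration, not Apple's API.

```python
def dwell_select(gaze_points, target, dwell_time=1.0, dt=0.1):
    """gaze_points: sequence of (x, y) samples taken every dt seconds.
    target: (x, y, w, h) rectangle. Returns the sample index at which
    a dwell 'click' fires, or None if the gaze never dwells long enough."""
    tx, ty, tw, th = target
    needed = round(dwell_time / dt)  # consecutive in-target samples required
    streak = 0
    for i, (x, y) in enumerate(gaze_points):
        inside = tx <= x <= tx + tw and ty <= y <= ty + th
        streak = streak + 1 if inside else 0  # leaving the target resets
        if streak >= needed:
            return i
    return None
```

The reset on leaving the target is what makes an erratic cursor so frustrating in practice: every tracking glitch that nudges the gaze estimate off the icon restarts the dwell timer from zero.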

Another frustrating aspect of using eye tracking is its reliance on calibration. To use the feature properly, you may need to recalibrate it multiple times, which gets tedious. Once calibrated, though, the system works reasonably smoothly. It's almost as if the feature has a "learning curve" that requires time and practice to master.
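Conceptually, what calibration does is collect raw gaze readings while you look at dots at known screen positions, then fit a mapping from raw readings to screen coordinates. Apple's actual pipeline is unknown to me and surely far more sophisticated, but a deliberately simplified per-axis least-squares line fit conveys the idea behind "follow the circles with your eyes":

```python
def fit_axis(raw, screen):
    """Least-squares fit of screen ≈ a * raw + b along one axis.
    raw: gaze readings recorded during calibration.
    screen: the known coordinates of the calibration dots."""
    n = len(raw)
    mr = sum(raw) / n
    ms = sum(screen) / n
    num = sum((r - mr) * (s - ms) for r, s in zip(raw, screen))
    den = sum((r - mr) ** 2 for r in raw)
    a = num / den
    return a, ms - a * mr

# Made-up raw horizontal readings recorded while the user looked at
# calibration dots placed at screen x = 0, 500, and 1000:
a, b = fit_axis([0.1, 0.5, 0.9], [0.0, 500.0, 1000.0])
# Map a fresh raw reading through the fitted line:
screen_x = a * 0.7 + b  # ≈ 750 for this toy fit
```

This also hints at why rotation forces a recalibration: rotating the device changes the geometry between the camera and your eyes, so the fitted mapping no longer holds.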

The potential for the eye tracking feature is vast, but it also raises some interesting questions about the future of device interaction. Will we see more devices incorporating this technology? How will it change the way we interact with our devices? These are just a few of the many questions that come to mind when considering the implications of the eye tracking feature.

In conclusion, while the eye tracking feature is still in its early stages, I'm excited to see where it will go from here. With continued refinement and improvement, this technology has the potential to revolutionize the way we interact with our devices. Until then, I'll be keeping a close eye on its development, eager to see how it will continue to evolve and improve over time.

iPadOS 18 Features That Went Under the Radar

If you're one of my regular viewers, you know that I'm always on the lookout for hidden features and gems in iOS. In this video, I'm going to share some of the lesser-known features that Apple didn't even mention at WWDC. These are the kinds of features that can make or break your experience with iOS 18.

From what I've seen so far, it's clear that Apple is pushing the boundaries of what's possible on their devices. With features like eye tracking and more, they're setting a new standard for innovation and accessibility. Whether you're a seasoned power user or just starting to explore the world of iOS, there's something here for everyone.

As always, I want to hear from you! Leave a comment below with your thoughts on the latest iOS 18 features, and be sure to subscribe for more videos like this one. Don't forget to hit that notification bell so you can stay up-to-date on all my latest content. Thanks for watching, and I'll catch you in the next video!

Full Transcript

So, iPadOS 18 brought a bunch of brand new features, especially some under the hood that we're going to touch on in a future video, but there's one feature specifically that I want to get into, and it lives in the accessibility settings: eye tracking. So without further ado, let's talk about eye tracking. Let's actually demo it live with you guys and show you exactly what it can do, what it means for the future of the iPad, and how you'll interact with it moving forward. Let's get into it.

If you live here in the US and you've had the opportunity to buy yourself a Vision Pro, or even just go to the Apple Store and demo one, the first thing that happens is an eye calibration. That's where the magic is on the Vision Pro: it's how you interact with it, how you're able to touch icons and do things with your eyes. That's part of the magic sauce that comes with visionOS and the Vision Pro, and part of the reason it's so expensive; it's bleeding edge on the technology side. What Apple is doing for both iOS and iPadOS 18 is bringing eye tracking over through the accessibility settings, and it's kind of the same thing, because it uses the Face ID sensors to get you into eye tracking mode and to calibrate the iPad (and, on iOS 18, your iPhone).

To get this mode going, all you have to do is go into Settings, then Accessibility, then the Eye Tracking section, and turn it on. The first time you turn it on, it walks you through a calibration process where you look at little circles on the iPad screen and follow them with your eyes; that's how it calibrates based on your eyes, your movement, and what you're looking at. It asks you to look at the top left corner, the middle, the top middle, the top right, the bottom right, and the bottom middle, and walks you through the setup to make sure it's calibrated as well as possible. Again, we're on iPadOS 18 beta 1, so it's not going to be perfect; if anything, it's still pretty wonky, but it will give you an idea of what it looks like.

Now that we're fully calibrated, let's go to the iPad, walk through the rest of the settings, and see exactly how this thing works, whether it's worth it, and what it means moving forward. I'm going to do my best to illustrate what's going on; right now it's going a little crazy because my eyes are going crazy. But once you're done with the setup process, this is what it looks like: almost a giant version of the mouse cursor running around, and when you land on something clickable, it tries to snap to it. You can change a couple of the settings; for instance, you can change the smoothing, which lets the cursor move a little more slowly. I like to keep it a little higher so you can get an idea of what's going on. But as of right now it's not working too well. Right now I'm looking at the Camera icon on the left-hand side; it's picking up that I'm sort of looking over there, but not really, and it's not letting me click on it. As of right now, I wouldn't consider this really usable. And if you're using this as an accessibility feature, where you're not able to physically interact with the iPad, you're going to have to operate the on-screen controls with your eyesight alone.

Here you can see it dwells, and then I can see I clicked there. If I go to the Camera icon, I can't even get over to it; maybe if I look up a little bit... see, it doesn't even pick it up, so I'd have to go back in here. The only dwell that's working right now is in my Notification Center; it lets me get to the device and lock rotation controls, and you can see the dwell kind of working there, but overall it's just not super usable, unfortunately. Still, you can see this being very cool. I have my home screen here, and I'm looking at Affinity Photo; as I look at Affinity Photo, it keeps trying to go down to YouTube Studio. So I do think you're going to have to calibrate this a few times. And if I rotate it, will it calibrate again? As you can see, every time you rotate the iPad, it wants you to recalibrate a little bit. I'm not going to walk through that whole process again, but it is recalibrating me right now.

So, as you can see, this eye tracking feature still has a ways to go; it's still not perfect. Again, we're in beta 1, and I'd guess it's difficult for the eye tracking to work well with just a single camera and whatever Face ID sensors are in there. On the Vision Pro, there's a bunch of different sensors and cameras constantly tracking your eyeballs and every single movement, which is why the Vision Pro feels something like 95% precise whenever you look at something you want to click. I also like how you physically interact with the Vision Pro: you use your fingers to click on whatever you're looking at. On the iPad and the iPhone, by contrast, you're using dwell, which basically means that to click on an option, a button, or an application, you have to look at it for a certain amount of time, dwell on it, and then it opens that application or feature or whatever the case may be. So there's still a ways to go. I do think Apple should bring over the finger-tap gesture for clicking, but that relies on sensors beyond what the iPad has, so it probably doesn't make sense on the iPad itself.

Long story short, I do think this is a great step forward for accessibility, for people who maybe aren't able to use the iPad the way it was intended, as a touch-first interface. It opens the door to possibilities, not only from a non-accessibility standpoint but especially from an accessibility one, and it really shows the computing prowess behind the iPad Pro. Love it or hate it, people call iPadOS 18 an overpowered device's underpowered software, but there's some awesome stuff going on underneath the hood, and this eye tracking feature really shows off what can be done with the iPad.

Leave a comment down below with what you think: is this something you would use if they do perfect it moving forward? Like I said, it's still a little wonky; it's not very good at tracking your eyes, especially on the iPad. I've noticed that on iOS it's a little more precise, maybe because the screen is smaller and there's less variability, whereas on the larger iPad Pro there's a lot more room for error because of how big the screen is. That's just my quick guess without knowing exactly what's going on. But let me know in the comments: is this something you're going to use, or want to use once it's perfected, or is it another accessibility feature that will stay buried in the settings, untouched?

That's going to do it for this video. If you made it to the end, leave a little dolphin in the comments so I know. Definitely stay subscribed, or get subscribed, because I have a great video coming up on some hidden iPadOS 18 features that went under the radar that Apple didn't even talk about, which I think you should know about. If you want to watch more videos like this, click on one of these right here. Until next time, I'm Fernando, and I'm out. Everybody, peace.