Apple Uses Augmented Reality To Fix Your Eyes During FaceTime in iOS 13!

How Apple is Fixing the FaceTime Eye Contact Issue with iOS 13 and ARKit

Welcome everyone, it's Andrew here from Apple Insider, and one of the big issues that has plagued FaceTime users is that you never seem to be really looking at the person you're talking to. With iOS 13, specifically beta 3, Apple is fixing this with a new feature called FaceTime Attention Correction.

This new feature lives as a toggle in the Settings app and does exactly what it sounds like: it fixes the attention issue by correcting your apparent gaze using the TrueDepth camera system found on newer iPhones. Currently, this works with the FaceTime camera on the iPhone XS and XS Max, which is the same camera system used for unlocking your phone and for the Animoji and Memoji characters found in the Messages app.
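For developers curious what that hardware gate looks like in code, ARKit exposes a simple check for TrueDepth-based face tracking. This is only a minimal sketch of that check, not anything from Apple's FaceTime code; the view controller and its names are illustrative.

```swift
import UIKit
import ARKit

final class FaceTrackingViewController: UIViewController {
    // ARSCNView renders the camera feed plus any AR content.
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // ARFaceTrackingConfiguration drives the same TrueDepth face tracking
        // that powers Face ID, Animoji, and Memoji. `isSupported` is true only
        // on devices with the TrueDepth camera system.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("No TrueDepth camera; face tracking is unavailable on this device.")
            return
        }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }
}
```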

Apple already uses this ARKit functionality and the TrueDepth camera during FaceTime calls, letting you apply 3D Animoji over your head along with other AR effects, so extending it to correct the eye contact issue is a natural step. I'll show you what I'm talking about. First, without the feature: if I look at the screen, it looks like I'm looking down, but if I look at the camera, it looks like I'm looking straight ahead. That's just how things work; the camera sits above the screen, so looking at the person on your display means looking away from the camera.

But now, with iOS 13 and this feature toggled on, when I make a FaceTime call and look at the screen at the person I'm talking to, it looks like I'm looking straight ahead; ARKit is adjusting my eyes in augmented reality. And when I look at the camera, it looks like I'm looking up. So you can see that I now appear to be looking naturally at the person I'm talking to, rather than up at where the camera is.

It looks like I'm looking away, but you can see how this ARKit functionality works when I pass something in front of my face, over my nose and across my eyes: that's where the augmented reality comes into play and adjusts my gaze. Even though this is only a beta, the whole thing feels very smooth and natural. If you didn't know what you were looking at, you wouldn't notice it at all; it literally looks like the person you're talking to is looking at you instead of at the middle of their phone screen.
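Apple hasn't said exactly how attention correction is implemented under the hood, but ARKit's face tracking already hands apps per-eye transforms and an estimated look-at point, which is the kind of signal a correction like this could build on. Here's a minimal, purely illustrative sketch of reading that data; the GazeLogger class is my own name, not an Apple API.

```swift
import ARKit

// Illustrative session delegate that logs the gaze data ARKit exposes.
// This is not how FaceTime's attention correction is known to work; it
// only shows the per-eye tracking available to third-party apps.
final class GazeLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Transforms of each eye, relative to the face anchor.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            // Estimated point (in face-anchor space) the eyes converge on.
            let lookAt = faceAnchor.lookAtPoint

            print("left eye:", leftEye.columns.3,
                  "right eye:", rightEye.columns.3,
                  "looking at:", lookAt)
        }
    }
}
```

In a real app you'd set an instance of this as the ARSession's delegate (and keep a strong reference to it) while running an ARFaceTrackingConfiguration.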

This feature is limited to the iPhone XS and iPhone XS Max, possibly due to API limitations. It could be relying on ARKit 3, which only works on the iPhone XS, XS Max, and iPhone XR, but we'll have to see how things shake out through the beta process by the time iOS 13 is released.
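If the gate really is ARKit 3, the corresponding runtime check on the developer side would look something like the sketch below. Again, this is an assumption: it shows how an app detects A12-class ARKit 3 face features, not how FaceTime itself decides to show the toggle.

```swift
import ARKit

// ARKit 3 features like tracking the user's face while the rear camera runs
// world tracking are gated on A12-class hardware (iPhone XS, XS Max, XR).
// This is a check an app could make; it is not Apple's FaceTime logic.
func supportsARKit3FaceFeatures() -> Bool {
    guard #available(iOS 13.0, *) else { return false }
    return ARWorldTrackingConfiguration.supportsUserFaceTracking
}

if supportsARKit3FaceFeatures() {
    print("ARKit 3 simultaneous face/world tracking is available on this device.")
} else {
    print("This device predates the A12 Bionic; ARKit 3 face features are unavailable.")
}
```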

So what do you guys think? Let us know down below in the comments, or shout at me on Twitter at @Andrew_OSU.

"WEBVTTKind: captionsLanguage: enhere's how Apple is fixing the FaceTime eye contact issue with iOS 13 and a our kit welcome everyone its Andrew here from Apple Insider and one of the big issue that's plagued FaceTime users of the fact that you never seem to be really looking at the person that you're talking to well with iOS 13 specifically beta 3 Apple is fixing that with a new feature called FaceTime attention correction it's a new toggle found on the Settings app and it does just what it sounds like it fixes the attention issue it corrects your eyesight using the true depth camera system found on newer iPhones currently this works with the FaceTime camera on the iPhone 10s and 10's max the same camera system used for unlocking your phone and those cue an emoji and MIMO G characters found on the messages app Apple is even using this AR kit functionality and the true death camera when making FaceTime calls where you can apply these 3d an emoji s over top of your head and other AR features it naturally makes it an extension to correct the eye contact issue I'll show you what I'm talking about first if I look at the screen no list looks cam looking down but I look at the camera it looks like I'm looking straight forward that's just how things work when you look in the camera you're naturally looking up but now with iOS 13 and that toggled on when I make a FaceTime call when I'm looking at the screen at the person that I'm talking to it now looks like I'm looking straight forward in augmented reality AR K is adjusting my eyes to look straight forward and when I look at the camera it looks like I'm looking up so you can see I'm looking straight forward now and it looks like I'm looking naturally at the person that I'm talking to versus looking up where the camera is it looks like I'm looking away you can see how this AR kit functionality works when I pass something in front of my eyes even over the nose and from my eyes you can see where it worked where that augmented reality is coming into play and adjusting my eyesight even though we're currently only in a beta this whole thing feels very smooth and natural if you didn't know what you were looking at you wouldn't realize it at all literally look just like the person you're talking to is looking at you instead of the middle of their phone screen right now this is limited to the iPhone 10s and iPhone 10s Mattox possibly due to API limitations this could be using a our kit 3 which only works on the iPhone 10s 10s max and iPhone 10 are but we'll have to see how it goes at the beta process by time iOS their team is released so what do you guys think let us know down below in the comments or shout at me on Twitter at Andrew underscore OSU hey everyone did you guys like that video be sure to click on that like button so we can create content that we know that you guys want to see and follow Apple Insider in all social media channels if you want the best prices on any Apple gear check out the Apple Insider price guide that is updated daily and until next time we'll see you laterhere's how Apple is fixing the FaceTime eye contact issue with iOS 13 and a our kit welcome everyone its Andrew here from Apple Insider and one of the big issue that's plagued FaceTime users of the fact that you never seem to be really looking at the person that you're talking to well with iOS 13 specifically beta 3 Apple is fixing that with a new feature called FaceTime attention correction it's a new toggle found on the Settings app and it does just what it sounds like it 
fixes the attention issue it corrects your eyesight using the true depth camera system found on newer iPhones currently this works with the FaceTime camera on the iPhone 10s and 10's max the same camera system used for unlocking your phone and those cue an emoji and MIMO G characters found on the messages app Apple is even using this AR kit functionality and the true death camera when making FaceTime calls where you can apply these 3d an emoji s over top of your head and other AR features it naturally makes it an extension to correct the eye contact issue I'll show you what I'm talking about first if I look at the screen no list looks cam looking down but I look at the camera it looks like I'm looking straight forward that's just how things work when you look in the camera you're naturally looking up but now with iOS 13 and that toggled on when I make a FaceTime call when I'm looking at the screen at the person that I'm talking to it now looks like I'm looking straight forward in augmented reality AR K is adjusting my eyes to look straight forward and when I look at the camera it looks like I'm looking up so you can see I'm looking straight forward now and it looks like I'm looking naturally at the person that I'm talking to versus looking up where the camera is it looks like I'm looking away you can see how this AR kit functionality works when I pass something in front of my eyes even over the nose and from my eyes you can see where it worked where that augmented reality is coming into play and adjusting my eyesight even though we're currently only in a beta this whole thing feels very smooth and natural if you didn't know what you were looking at you wouldn't realize it at all literally look just like the person you're talking to is looking at you instead of the middle of their phone screen right now this is limited to the iPhone 10s and iPhone 10s Mattox possibly due to API limitations this could be using a our kit 3 which only works on the iPhone 10s 10s max and iPhone 10 are but we'll have to see how it goes at the beta process by time iOS their team is released so what do you guys think let us know down below in the comments or shout at me on Twitter at Andrew underscore OSU hey everyone did you guys like that video be sure to click on that like button so we can create content that we know that you guys want to see and follow Apple Insider in all social media channels if you want the best prices on any Apple gear check out the Apple Insider price guide that is updated daily and until next time we'll see you later\n"