iPhone 7 Plus 'Portrait Mode' Hands-On Preview!

When the iPhone 7 Plus was announced with its dual camera, one of the features Apple focused on was a new portrait mode that would give you a depth of field effect similar to what you get in a higher-end camera like a DSLR. However, this feature did not ship with the phone.

Now the iOS 10.1 beta has been released, and it comes with an early look at this portrait feature. To access it, you open the Camera app and select Portrait from the bottom of the screen, where it sits alongside the existing shooting modes. Once in the mode, the camera switches to the 56mm-equivalent telephoto lens, which is the lens used to capture the image and produce the effect we want. In a traditional camera, this separation of foreground and background is achieved optically, with a large sensor and a wide aperture.
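Apple's Camera app handles that lens switch internally, but iOS 10 also exposes the telephoto module to third-party apps through AVFoundation. As a rough illustration (using modern Swift naming; `makeTelephotoSession` is a hypothetical helper, not anything from Apple's Camera app), selecting it for a capture session looks something like this:

```swift
import AVFoundation

// Sketch: build a capture session around the iPhone 7 Plus telephoto camera.
// `makeTelephotoSession` is a hypothetical helper for illustration only.
func makeTelephotoSession() -> AVCaptureSession? {
    // Ask for the rear 56mm-equivalent module specifically.
    guard let telephoto = AVCaptureDevice.default(.builtInTelephotoCamera,
                                                  for: .video,
                                                  position: .back),
          let input = try? AVCaptureDeviceInput(device: telephoto) else {
        return nil // Device has no telephoto camera (e.g. a single-lens iPhone).
    }
    let session = AVCaptureSession()
    session.sessionPreset = .photo
    if session.canAddInput(input) {
        session.addInput(input)
    }
    return session
}
```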

Apple is using the two cameras to gather data and build a depth map. There's a lot of work going into this: facial recognition helps identify the subject, and the slight differences between what the two cameras see provide distance information. The different levels in the depth map are then used to apply the blur, which allows for a gradual, varying blur rather than a single static blur applied uniformly across the entire background. At least, that's the hope.
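To make the idea concrete, here's a minimal Core Image sketch of a depth-mask-driven blur: blend a blurred copy of the photo over the sharp original, weighted by a grayscale depth mask. This is a simplified stand-in for illustration, not Apple's actual (unpublished) pipeline, and `simulatedDepthEffect` and its parameters are hypothetical:

```swift
import CoreImage

// Sketch only: approximate a depth effect by blending a blurred copy of the
// photo over the sharp original. `depthMask` is assumed to be grayscale,
// where white = far (fully blurred) and black = near (kept sharp).
// Apple's real pipeline is unpublished; this just shows the general idea.
func simulatedDepthEffect(image: CIImage,
                          depthMask: CIImage,
                          maxRadius: Double = 12) -> CIImage? {
    guard let blur = CIFilter(name: "CIGaussianBlur"),
          let blend = CIFilter(name: "CIBlendWithMask") else { return nil }

    // Blur the whole frame at the maximum radius.
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(maxRadius, forKey: kCIInputRadiusKey)
    guard let blurred = blur.outputImage?.cropped(to: image.extent) else {
        return nil
    }

    // Where the mask is bright (background), show the blurred frame;
    // where it is dark (the subject), keep the sharp frame.
    blend.setValue(blurred, forKey: kCIInputImageKey)
    blend.setValue(image, forKey: kCIInputBackgroundImageKey)
    blend.setValue(depthMask, forKey: kCIInputMaskImageKey)
    return blend.outputImage
}
```

A single mask blend like this produces one uniform background blur; the layered depth map is what would let the blur ramp up gradually with distance instead.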

To test this a little, I've set up a scene with a subject and a few items in the background, just to see how the blur varies with distance. Using portrait mode works much the same as any other shooting mode, but there are a few caveats. Your subject has to be at the right distance: too close or too far and the effect won't work. Once you're in the correct position, though, the depth effect is rendered on screen in real time, so you can preview what the photo will look like before you take it.

You just enter the mode, tap to focus as normal, and it locks in. You can then see the depth effect, take the photo, and the final image renders out. The phone also saves a normal version without the depth effect, so you can compare the two and see how they look.

I'm going to move this plant a little farther back so we can see how the effect changes. Now that the plant is farther away, it's noticeably more blurred than it was when it was closer. While the feature works best with people, since it can use facial recognition, it isn't limited to faces like this one here: you can also use it on objects, and it should still work, though perhaps not as well, at least not yet.

So if I take a picture of the plant itself, you still get the depth effect even though it's not a person, and you can definitely tell the difference between the normal photo and the one with the depth effect. With this first beta, though, whether you get a good result really depends on the shooting situation. When you compare against a photo taken with a bigger sensor and real depth of field, rather than a simulated one, you can see where the differences lie and where more work definitely needs to be done.

One place where you definitely see the difference is around the subject, where you get a kind of halo effect. That comes from the blur itself: if you've ever cut out a subject in Photoshop and blurred the background behind it, you'll have seen the same halo, and that's probably what's going on here as well. A photo from a dedicated camera doesn't show it and looks a lot more natural. A dedicated camera also isn't limited in the number of depth layers when applying the blur; it varies continuously throughout the range. So that's an early look at the portrait mode coming to the iPhone 7 Plus. It's still in beta and definitely needs some work, but it gives us an idea of what to expect.

I hope you enjoyed this video. If you did, be sure to subscribe so you can see when new videos are out, and visit MacRumors.com for more. Let us know in the comments what you think of this effect: do you think it looks realistic? Thanks so much for watching. I'm Matt Gonzales of MacRumors, and I'll see you next time.

"WEBVTTKind: captionsLanguage: enwhen the iPhone 7 plus was announced with its dual camera one of the features Apple focused on was a new portrait mode that would give you a depth of field effect similar to what you get in a higher-end camera like a DSLR problem is this didn't ship with the phone but now the iOS 10.1 beta has been released and it comes with an early look at this portrait feature so to access this you go into the camera app and select from the bottom the portrait mode and this is added to the other modes that are in the camera now once in the mode the camera switches to the 56 millimeter telephoto lens and this is what you will use to take the images in order to get the effect that we want where we have the separation of the foreground and background in a camera this small Apple is using two cameras to gather data and create a depth map there's a lot of work going into this like using facial recognition and using the slight differences in what the two cameras see the different levels in the depth map are then used to apply a blur and this will allow for a more gradual and changing blur rather than a single stagnate blur applied across the entire background at least that's the hope so to test this a little bit I've set up a scene here with a subject and a few items in the background just to see how that blur differentiates so when using the portrait mode it works much the same as any other picture mode but there are a few caveats one your subject has to be in the right place you can't be too close or too far otherwise it's not going to work but once you are in the correct position the depth effect is actually rendered on the screen in real-time so you can see a preview of what it's going to look like when you take the photo so you just go in you can tap to focus like normal it'll lock in and you can see the depth effect go ahead and take a photo and then it renders out it also has the normal version without the depth effects saved so you can compare them and see how they look now I'm going to go ahead and move this plant a little bit farther back so we can see how that works and just changes and now you can see that since the plant is farther back it's a little bit more blurry than it was before when it was closer up now while this works best with people because it can use facial recognition it doesn't only have to work with faces like this one here you can also use it with objects and it should still work although it might not be quite as good at least not yet so if I take a picture of the plant here you still get the depth effect even though it is not a person and you can definitely tell that there's a difference between the normal photo and the one with a depth effect now right now with this first beta I really depends what situation you're shooting in when to getting a good photo now when you compare to a photo taken with a bigger sensor and an actual depth of field instead of one that simulated you can see where the differences are and where more work definitely needs to be done one of the places where you definitely see the difference is around the subject you get this kind of halo effect and that is done because of the blur if you've ever tried to apply and blur to a background in Photoshop and you cut out your subject you'll notice you'll get the same blur and that is probably what's going on here as well but when you look at the photo coming from a dedicated camera you don't get that it looks a lot more natural and also of course in a dedicated camera there's not a 
limit to the amount of layers you can have when applying these dip bursts it's just all throughout the range you get that blur so that is an early look at the portrait mode coming for the iPhone 7 plus it's still in beta and definitely needs some work but it gives us an idea of what to expect I hope you enjoyed this video if you did be sure to subscribe so you can see when new videos are out and visit MacRumors comm for more be sure to let us know down below in the comments what you think of this effect do you think it looks realistic I want to thank you all so much for watching i'm matt gonzales of MacRumors and i'll see you next timewhen the iPhone 7 plus was announced with its dual camera one of the features Apple focused on was a new portrait mode that would give you a depth of field effect similar to what you get in a higher-end camera like a DSLR problem is this didn't ship with the phone but now the iOS 10.1 beta has been released and it comes with an early look at this portrait feature so to access this you go into the camera app and select from the bottom the portrait mode and this is added to the other modes that are in the camera now once in the mode the camera switches to the 56 millimeter telephoto lens and this is what you will use to take the images in order to get the effect that we want where we have the separation of the foreground and background in a camera this small Apple is using two cameras to gather data and create a depth map there's a lot of work going into this like using facial recognition and using the slight differences in what the two cameras see the different levels in the depth map are then used to apply a blur and this will allow for a more gradual and changing blur rather than a single stagnate blur applied across the entire background at least that's the hope so to test this a little bit I've set up a scene here with a subject and a few items in the background just to see how that blur differentiates so when using the portrait mode it works much the same as any other picture mode but there are a few caveats one your subject has to be in the right place you can't be too close or too far otherwise it's not going to work but once you are in the correct position the depth effect is actually rendered on the screen in real-time so you can see a preview of what it's going to look like when you take the photo so you just go in you can tap to focus like normal it'll lock in and you can see the depth effect go ahead and take a photo and then it renders out it also has the normal version without the depth effects saved so you can compare them and see how they look now I'm going to go ahead and move this plant a little bit farther back so we can see how that works and just changes and now you can see that since the plant is farther back it's a little bit more blurry than it was before when it was closer up now while this works best with people because it can use facial recognition it doesn't only have to work with faces like this one here you can also use it with objects and it should still work although it might not be quite as good at least not yet so if I take a picture of the plant here you still get the depth effect even though it is not a person and you can definitely tell that there's a difference between the normal photo and the one with a depth effect now right now with this first beta I really depends what situation you're shooting in when to getting a good photo now when you compare to a photo taken with a bigger sensor and an actual depth of field 
instead of one that simulated you can see where the differences are and where more work definitely needs to be done one of the places where you definitely see the difference is around the subject you get this kind of halo effect and that is done because of the blur if you've ever tried to apply and blur to a background in Photoshop and you cut out your subject you'll notice you'll get the same blur and that is probably what's going on here as well but when you look at the photo coming from a dedicated camera you don't get that it looks a lot more natural and also of course in a dedicated camera there's not a limit to the amount of layers you can have when applying these dip bursts it's just all throughout the range you get that blur so that is an early look at the portrait mode coming for the iPhone 7 plus it's still in beta and definitely needs some work but it gives us an idea of what to expect I hope you enjoyed this video if you did be sure to subscribe so you can see when new videos are out and visit MacRumors comm for more be sure to let us know down below in the comments what you think of this effect do you think it looks realistic I want to thank you all so much for watching i'm matt gonzales of MacRumors and i'll see you next time\n"