People buy new smartphones for many reasons: some for the apps they can run, others for the ability to watch videos and play games, but one feature that drives many to upgrade is the camera. All smartphone makers work hard to improve their cameras to entice users to opt for newer devices, and Apple has done this for years. With this year’s iPhone models – the iPhone XS, XS Max, and XR – Apple has brought new possibilities to the camera. (Read our review of the iPhone XS Max here.) But it’s not just the sensors or lenses that have changed; the real innovation these days is in the software that creates photos, a field called computational photography.
You’ve all seen what happens when you try to take a photo of a friend against a bright background with a standard camera. If the camera calculates the exposure based on the overall lighting of a scene, the background will likely be too light and your friend too dark. Or if it exposes for your friend’s face, they’ll be at the right level, but the background will be too bright and the photo will not look great.
To compensate, photographers use a technique called HDR, or high dynamic range. With a standard digital camera, this generally requires shooting several photos – at least two – exposing one for your friend’s face and another for the background, then merging them in special software.
The iPhone has long had an HDR feature that shoots multiple exposures and merges them automatically. You could turn on this Auto HDR feature on your device and also tell it to keep the original photo in case you want to edit that later.
This year’s iPhones use what Apple calls Smart HDR: the devices shoot four frames at different exposures, and the camera’s processor also creates “interframes” at still other exposures. Apple uses a lot of fancy words to describe what happens next, talking about a neural engine and machine learning, but what it comes down to is that the new iPhones have an astonishing ability to combine photos so they balance out the dark and light elements. Of course, you may want your photos to have more contrast between the foreground and background, and you can still keep the original photo if you wish.
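If you’re curious what “combining” exposures actually means under the hood, here’s a rough Python sketch of the basic idea: weight each pixel by how well exposed it is, so highlight detail comes from the darker frames and shadow detail from the brighter ones. (This is a generic exposure-fusion illustration, not Apple’s actual algorithm; the Gaussian weighting and the sample values are made up for the example.)

```python
import numpy as np

def fuse_exposures(frames):
    """Merge bracketed exposures by weighting well-exposed pixels.

    Each frame is an array of pixel values in [0, 1]. Pixels near
    mid-gray (0.5) are weighted most heavily, so blown-out or crushed
    pixels in one frame are filled in from the other frames.
    """
    frames = np.stack([np.asarray(f, dtype=float) for f in frames])
    # Gaussian weight centered on mid-gray: well-exposed pixels dominate.
    weights = np.exp(-((frames - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0)
    return (weights * frames).sum(axis=0)

# A bright sky pixel: nearly blown out in the long exposure (0.98),
# still usable in the short exposure (0.55).
sky = fuse_exposures([[0.98], [0.55]])
print(sky)  # much closer to 0.55 than to 0.98
```

The fused value leans heavily toward the well-exposed frame, which is why the sky in an HDR shot keeps its blue instead of washing out to white.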
Here are some examples of photos I shot with my iPhone 8 Plus and my iPhone XS Max. With both phones, I left the setting on to save the original photo as well. First is the iPhone 8 Plus, a standard, non-HDR photo shot with the telephoto lens. As you can see, the sky is washed out; there’s no blue visible at all:
But the 8 Plus’s HDR made a much better photo:
Not only is the sky blue, but the gradations between dark and light on the grass and the brick wall are much more subtle.
With the iPhone XS Max, the difference is stunning. Here’s the same scene shot with the wide-angle lens:
And here’s the HDR photo that the XS Max created:
The first thing to note is how much better even the non-HDR photo is. I shot this scene with both iPhones, using both the wide-angle and telephoto lenses. I chose to show you the XS Max’s photos taken with the wide-angle lens because the photos shot with the telephoto lens are so good that even the non-HDR photo looks fine. In other words, even without Smart HDR, the iPhone XS makes photos as good as the 8 Plus’s HDR mode.
The iPhone decides whether it needs to use HDR depending on the scene. The 8 Plus didn’t think the telephoto shot needed HDR, and this is what it made; note the blown-out highlights at the bottom of the sky:
In other words, you can’t force the camera to shoot HDR photos; it decides when it needs them. In every case, the iPhone XS Max outperformed the 8 Plus, often by a significant margin.
The other feature available in these new iPhones is portrait mode. This has been around since the iPhone 7 Plus and uses the two cameras – one wide angle and one telephoto – to create a photo with two layers. It detects the foreground subject, usually a person, and separates that from the background. This allows you to apply a background blur setting to make the subject stand out or to apply one of a number of lighting settings that alter the lighting on the subject.
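Conceptually, the blur step is simple compositing: once the subject has been separated from the background, the camera keeps the subject’s pixels sharp and replaces everything else with a blurred copy. Here’s a toy Python sketch of that idea (the uniform box blur and hard-edged mask are simplifications for illustration – the real depth map varies the blur with distance):

```python
import numpy as np

def box_blur(img, r):
    """Naive box blur: average each pixel over its (2r+1)^2 neighborhood."""
    padded = np.pad(img, r, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

def portrait_composite(img, subject_mask, r):
    """Keep the subject's pixels sharp; blur everything outside the mask."""
    return np.where(subject_mask, img, box_blur(img, r))
```

A larger blur radius plays the role of a wider aperture: the subject stays untouched while the background smears out more and more.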
With the iPhone XS and XS Max, you can adjust the amount of blur as you would on a standard camera by changing the aperture. Larger apertures – indicated by lower f-stop numbers – result in more background blur.
Here are some examples: two versions of the same photo shot in portrait mode. In the first, portrait mode is on; in the second, it has been turned off (in Edit mode, tap the word Portrait on the iPhone to toggle it).
You can see the depth slider below the photo. If you drag it to the left or right, you can increase or decrease the background blur as you wish. (Remember, higher f-stop numbers mean less blur.)
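The f-stop numbers on that slider come straight from traditional optics: an f-number is the lens’s focal length divided by its aperture diameter, which is why lower numbers mean a wider opening and more blur. A quick Python illustration (the 6mm focal length is roughly that of the iPhone XS telephoto lens, used here as an assumption):

```python
def aperture_diameter(focal_length_mm, f_number):
    """f-number = focal length / aperture diameter, so a lower
    f-number means a wider aperture - and more background blur."""
    return focal_length_mm / f_number

# Assuming a 6mm focal length, roughly that of the XS telephoto lens:
for f in (1.4, 4.5, 16):
    d = aperture_diameter(6, f)
    print(f"f/{f}: aperture {d:.2f} mm wide")
```

Dragging the slider from f/16 toward f/1.4 simulates opening the aperture more than tenfold, which is why the background melts away at the low end of the scale.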
Both of these improvements to the iPhone camera are stunning, but it’s Smart HDR that makes the biggest difference. Not only is the overall lighting of photos much better, but the colors are more vivid as well. It’s worth noting that the less expensive iPhone XR also has Smart HDR, and it even has a version of portrait mode which, while not as good as on the iPhone XS and XS Max because it has only one camera, is excellent. These new iPhones are expensive, but if you use the camera a lot, you’ll find the quality of your photos much better than with previous models.