
Deep Fusion differences are sometimes subtle, but really show up in pet photos

iOS 13.2 landed earlier this week, bringing with it Deep Fusion, the feature Apple’s marketing chief described in the iPhone 11 keynote as ‘computational photography mad science.’ (Less happily, it also seemed to introduce problems with background apps.)

Like Smart HDR, Deep Fusion blends multiple frames to create the best possible composite image. But while Smart HDR is limited to blending different exposures (some designed to capture shadow detail, others to preserve highlights), Deep Fusion also aims to bring out more fine detail…

Here’s how Phil Schiller described the feature, which is available only on the three iPhone 11 models.

It shoots nine images, before you press the shutter button it’s already shot four short images, four secondary images. When you press the shutter button it takes one long exposure, and then in just one second, the Neural Engine analyzes the fused combination of long and short images picking the best among them, selecting all the pixels, and pixel by pixel, going through 24 million pixels to optimize for detail and low noise, like you see in the sweater there. It’s amazing, this is the first time a Neural Processor is responsible for generating the output image. It is computational photography mad science.
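Apple hasn’t published how that per-pixel selection actually works, so the following is purely a conceptual sketch of the kind of multi-frame blending Schiller describes: several short exposures plus one long exposure, combined pixel by pixel. The Frame type, the fuse function, the variance-based weighting, and the 50x scale factor are all illustrative assumptions, not Apple’s implementation.

```swift
// Toy sketch of per-pixel multi-frame fusion. NOT Apple's Deep Fusion
// algorithm (which is unpublished); just an illustration of blending
// short exposures with one long exposure, pixel by pixel.

struct Frame {
    let pixels: [Double]   // luminance values in [0, 1], row-major
}

/// Blend several short frames with one long frame, pixel by pixel.
/// Heuristic (purely illustrative): where the short frames disagree,
/// assume there is fine detail or motion and lean on their average;
/// elsewhere, lean on the cleaner long exposure for low noise.
func fuse(shortFrames: [Frame], longFrame: Frame) -> [Double] {
    let count = longFrame.pixels.count
    var output = [Double](repeating: 0, count: count)

    for i in 0..<count {
        // Mean and spread of the short-exposure samples at this pixel.
        let samples = shortFrames.map { $0.pixels[i] }
        let mean = samples.reduce(0, +) / Double(samples.count)
        let variance = samples.map { ($0 - mean) * ($0 - mean) }
                              .reduce(0, +) / Double(samples.count)

        // More spread -> weight the short frames higher.
        // The 50x scale is arbitrary, chosen only for this toy example.
        let detailWeight = min(1.0, variance * 50.0)
        output[i] = detailWeight * mean + (1.0 - detailWeight) * longFrame.pixels[i]
    }
    return output
}

// Tiny usage example: four "short" frames plus one "long" frame, 4 pixels each.
let shorts = [
    Frame(pixels: [0.20, 0.45, 0.61, 0.80]),
    Frame(pixels: [0.22, 0.38, 0.60, 0.79]),
    Frame(pixels: [0.19, 0.43, 0.62, 0.81]),
    Frame(pixels: [0.21, 0.40, 0.59, 0.80]),
]
let long = Frame(pixels: [0.21, 0.41, 0.60, 0.80])
print(fuse(shortFrames: shorts, longFrame: long))
```

The real pipeline, per Schiller’s description, runs on the Neural Engine across 24 million pixels with learned rather than hand-tuned weighting; the sketch above ignores all of that and exists only to make the blending idea concrete.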

Gizmodo’s Adam Estes has been testing the beta version for several weeks. One of the problems he faced is that there’s no user indication of when Deep Fusion is active (though there are situations in which it definitely isn’t). You just have to take photos on two identical phones, one with iOS 13.2 and one without, and look for differences.

The bad news is that, in many shots, the differences are only visible when pixel-peeping. They are not likely to make a noticeable difference to most people’s perception of a photo, he suggests.

Let’s dig into what Deep Fusion’s computational photography mad science really feels like. If I’m being honest, it doesn’t feel like much. Right after the Deep Fusion feature appeared on the iOS 13.2 public beta, I installed the software on Gizmodo’s iPhone 11 Pro, while I kept the previous iOS version, the one without Deep Fusion, on my iPhone 11 Pro. Then I just took a crapload of pictures in all kinds of different environments. Frankly, I often couldn’t tell the difference between the Deep Fusion shot and the non-Deep Fusion shot.

Sometimes, though, the difference is immediately obvious. Although Estes describes the difference between two shots of the clock in the middle of New York’s Grand Central Terminal as ‘subtle,’ to me it’s very clear even before viewing them at full size.

Whether the Deep Fusion shot is better is a subjective judgment. It is certainly sharper, but perhaps borders on over-sharpened. Apps like Photoshop and Lightroom let you sharpen photos in post-production, and the result can look artificial if you go too far; I’d say this shot is dangerously close to that territory. To the average person, however, sharper is ‘better,’ so Apple probably made the right call here.
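For readers unfamiliar with why heavy sharpening looks artificial, here is a toy, hypothetical unsharp-mask sketch. It is a generic technique, not how Photoshop, Lightroom, or Deep Fusion actually implement sharpening: edges are isolated by subtracting a blurred copy of the image, then added back scaled by an amount factor, and a large amount produces the overshoot and haloing that reads as over-sharpened.

```swift
// Toy 1D unsharp mask: sharpen a row of luminance values in [0, 1].
// Purely illustrative; the box blur and clamping are simplifications.

func unsharpMask(_ signal: [Double], radius: Int = 1, amount: Double = 1.0) -> [Double] {
    let n = signal.count
    var blurred = [Double](repeating: 0, count: n)

    // Simple box blur as the low-pass step.
    for i in 0..<n {
        let lo = max(0, i - radius)
        let hi = min(n - 1, i + radius)
        let window = signal[lo...hi]
        blurred[i] = window.reduce(0, +) / Double(window.count)
    }

    // original + amount * (original - blurred), clamped to [0, 1].
    // Large `amount` exaggerates edges and causes halo-like overshoot.
    var output = [Double](repeating: 0, count: n)
    for i in 0..<n {
        let sharpened = signal[i] + amount * (signal[i] - blurred[i])
        output[i] = min(1.0, max(0.0, sharpened))
    }
    return output
}

// A soft edge gets crisper with amount = 1, and starts to overshoot with amount = 3.
let edge = [0.2, 0.2, 0.3, 0.5, 0.7, 0.8, 0.8]
print(unsharpMask(edge, amount: 1.0))
print(unsharpMask(edge, amount: 3.0))
```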

Estes says there’s one case where the difference is night and day: photos of his puppy, Peanut, seen here in a close-up view.

Things changed for me when I started taking photos of fur, however. In theory, this is the exact sort of scenario where Deep Fusion should shine, since tiny strands of hair tend to blur together, but a neural engine could identify these details and merge them together into a Deep Fusion photo. This might be why Apple chose to use a bearded man in a finely textured jumper to show off Deep Fusion in the recent keynote. My version of a bearded man in a finely textured jumper is a little puppy named Peanut […]

While she looks angelic in both of these photos, it’s fairly easy to see that, in the photo on the left, her gentle little highlights get blurry around the crown of her head and around her ear. In the Deep Fusion photo on the right, they’re crisp as can be […]

I never want to take another photo of Peanut without Deep Fusion again.

What we’re seeing today is, of course, only the first iteration of Deep Fusion, and Estes speculates that future updates will improve the zoom capabilities, much as Google has with the Pixel 4.

Are you noticing significant differences in photos before and after updating to iOS 13.2? Please let us know in the comments.

Check out the full piece here.


