Apple Continues Focus on Software Improvements for Cameras in iPhone 13

By Justin Kresse


Apple’s new iPhones have started using software to fake the blurry backgrounds of professional cameras for photos and now video. However, the software is not quite perfect yet. Though the differences are subtle, the professional camera (Canon 6D) image is more natural. Still, the iPhone portrait mode photo is much closer to the 6D photo than it is to the iPhone photo without portrait mode.

Apple recently released their new lineup of iPhones, specifically the iPhone 13 Mini, iPhone 13, iPhone 13 Pro, and iPhone 13 Pro Max. To be honest, though, I wasn’t all that impressed with the changes they made. Sure, the new phones have a new processor that’s faster than last year’s models. Sure, they added a 120 Hz display that should be pretty smooth. And sure, there are slightly new colors that look a bit prettier. These changes are cool and all, but I don’t think they warrant spending $1,000 on a new phone. However, one aspect of the new phones that I was actually interested in was the cameras.

One of the biggest features that will only be available on the new iPhones is a cinematic mode. In many ways, it’s like portrait mode but for video. The feature essentially blurs the background and leaves only the subject in focus, whether that subject is a person or a water bottle. The phone can even automatically switch between people as subjects when the subject in focus looks at someone else. And the phone saves the blur separately in the file, allowing you to go back later and change which subject is in focus.
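Apple hasn’t published exactly how that editable blur is stored, but conceptually it works like non-destructive editing: the clip keeps the original frames plus per-frame depth information and a “focus track” recording what is in focus when, and the blur is rendered from that data on playback. Here’s a hypothetical Python sketch of the idea (the names and structure are my own illustration, not Apple’s actual format):

```python
from dataclasses import dataclass, field

@dataclass
class CinematicClip:
    """Toy model of a non-destructively editable cinematic-style clip."""
    frames: list       # original, unblurred video frames
    depth_maps: list   # one per-pixel depth map per frame
    focus_track: dict = field(default_factory=dict)  # frame index -> subject depth

    def refocus(self, frame_index: int, new_subject_depth: float) -> None:
        # Changing which subject is in focus after the fact is just
        # rewriting metadata -- no pixels are touched until render time.
        self.focus_track[frame_index] = new_subject_depth
```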

The concept for this cinematic mode is definitely interesting, but just like portrait mode, I don’t think it’s quite there yet. The problem with portrait mode, and now with the cinematic mode, is that it isn’t actually using a camera that can really make blurry backgrounds. Instead, it uses software on the phones to detect what the subject is and then adds a blur filter to everything but the subject.
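To make that concrete, here’s a minimal Python/OpenCV sketch of the “detect the subject, blur everything else” approach. The function and the precomputed mask are my own illustration; Apple’s actual pipeline is proprietary and far more sophisticated:

```python
import cv2
import numpy as np

def fake_portrait(image: np.ndarray, subject_mask: np.ndarray) -> np.ndarray:
    """Composite a sharp subject over a uniformly blurred background.

    image: HxWx3 photo; subject_mask: HxW mask in [0, 1], 1.0 on the
    subject (on a real phone this would come from a learned segmentation
    model, not be handed in directly).
    """
    # One generic blur strength for the entire background -- the shortcut
    # that makes the fake look less natural than real optical blur.
    blurred = cv2.GaussianBlur(image, (51, 51), 0)

    # Feather the mask edge so fine detail like hair blends in a little
    # instead of showing a hard cutout line.
    mask = cv2.GaussianBlur(subject_mask.astype(np.float32), (21, 21), 0)
    mask = mask[..., None]  # broadcast the mask over the color channels

    return (mask * image + (1.0 - mask) * blurred).astype(np.uint8)
```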

Quick photography lesson: most professional full-frame cameras use sensors that are 864 mm² in size (36 mm × 24 mm). This large sensor allows the camera to capture a lot of light as well as produce those blurry backgrounds you see on professional portraits and other photos. The new iPhone’s wide camera, on the other hand, has a sensor size of about 44 mm² (and this is the biggest sensor ever in an iPhone). With such a small sensor, the iPhone camera isn’t able to let in as much light or get nearly as much background blur. Apple has focused mainly on software to fix the problems that come with using such a small sensor. For a while now, Apple has used noise reduction to clean up the noise that a small sensor produces in low-light scenarios. And now, they are trying to combat the lack of blurry backgrounds by using software to “fake” the look.
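A quick back-of-the-envelope calculation with those numbers shows how big the gap really is:

```python
full_frame_area = 36 * 24    # mm^2, standard full-frame sensor = 864
iphone_wide_area = 44        # mm^2, iPhone 13 Pro wide camera (approx.)

area_ratio = full_frame_area / iphone_wide_area
crop_factor = area_ratio ** 0.5   # the linear size difference is the square root

print(f"Area ratio:  {area_ratio:.1f}x")   # ~19.6x the light-gathering area
print(f"Crop factor: {crop_factor:.1f}x")  # ~4.4x wider/taller in each dimension
```

In other words, the full-frame sensor has almost 20 times the light-gathering area, which is why the iPhone has to lean so hard on software.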

I’m not trying to say that Apple’s approach is bad. It’s not feasible for Apple to put a sensor in their phones that is anywhere close to a DSLR’s sensor size. A sensor that big would require a much larger lens and more battery, and it would make the phone a great deal heavier. Because of this, they’ve chosen to fake the professional look with software.

Apple has been able to fix most of the problems from this small sensor relatively well with their software, but I don’t think portrait mode and cinematic mode are there yet. The problem is that the cameras aren’t able to accurately blur the background: they just add a generic blur to the whole thing. With a regular camera, the subject is in focus and objects get more out of focus the farther away they are from the subject. Portrait and cinematic mode also have a hard time with complicated subjects, such as people with messy hair.

Just because I think portrait mode and cinematic mode aren’t there yet doesn’t mean that I’m not excited to see what they will be like in the future. With the inclusion of LiDAR (Light Detection and Ranging – a system that allows the phone to see how far away objects are from the camera) on the new iPad Pro and now the iPhone 13 Pro and iPhone 13 Pro Max, the devices could theoretically use depth perception to determine how much blur certain objects should receive based on how far away they are from the subject. LiDAR could also potentially help distinguish the subject from the background more accurately.
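Here’s a rough sketch, in the same toy Python style as above, of how a per-pixel depth map (from LiDAR or any other source) could drive that kind of depth-proportional blur. Again, this is my own illustration of the concept, not Apple’s implementation:

```python
import cv2
import numpy as np

def depth_aware_blur(image, depth, subject_depth, max_kernel=51, levels=5):
    """Blur each pixel more the farther its depth is from the subject plane.

    image: HxWx3 photo; depth: HxW depth map (e.g. from LiDAR), in meters;
    subject_depth: distance to the plane that should stay sharp. Continuous
    defocus is approximated with a handful of discrete blur strengths.
    """
    # Normalized distance from the focal plane: 0 = sharp, 1 = maximum blur.
    dist = np.abs(depth - subject_depth)
    dist = np.clip(dist / max(dist.max(), 1e-6), 0.0, 1.0)

    out = image.astype(np.float32).copy()
    for level in range(1, levels + 1):
        # Kernel size grows with the blur level and must stay odd for OpenCV.
        ksize = 1 + 2 * int(level / levels * (max_kernel // 2))
        blurred = cv2.GaussianBlur(image, (ksize, ksize), 0).astype(np.float32)
        # Pixels whose distance falls in this band take this blur strength.
        band = (dist > (level - 1) / levels) & (dist <= level / levels)
        out[band] = blurred[band]

    return out.astype(np.uint8)
```

Unlike the single generic blur in the earlier sketch, objects here get blurrier the farther they sit from the subject, which is much closer to how a real lens behaves.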

In general, technology will continue to evolve, so we can expect portrait mode, cinematic mode, and whatever mode Apple comes up with next to eventually produce results indistinguishable from a full-frame camera. I’m excited for that day to come, but until it does, I’ll still be lugging my DSLR around with me when I want to take professional-looking photos.

