What’s the most ubiquitous camera in the world? The iPhone camera, period. While the iPhone may have struck a chord from its inception, it was with the iPhone 4 that the idea of a mobile phone replacing a stand-alone point & shoot became a real prospect. With the iPhone 7 Plus, that prospect became a reality. As it stands today, the iPhone 7 Plus is capable of producing imagery beyond that of almost all point & shoots out there, thousand-dollar price tag or not, and in some respects beyond even the high-end models.

Now, before you scoff at the thought and search for ways to disparage my credibility, understand that I’m not suggesting it’s a complete replacement for everyone’s RX100 – but can your RX100 image stack 100 raw photos? Can it edit photos and send them off right in-camera? Can it shoot a sunrise, sprinkle it in dew, cover it in chocolate and a miracle or two? No? Well, neither can the iPhone, but you get my drift. The fact is that for most people, and for most occasions, the iPhone 7 Plus’s image-capturing capability satisfies, with Portrait Mode as the nail in the coffin – and it’s about to get significantly better.

HOW APPLE VIEWS PHOTOGRAPHY | HERE’S WHAT’S COMING & IT’S MORE THAN JUST RAW CAPTURE

What’s Coming

We’ve all seen examples of the iPhone 7 Plus’s Portrait Mode, and whether or not we admit it in public, most of us are pretty impressed by the results. It isn’t only mere mortals who are impressed, either; we’ve now seen both a Billboard Magazine cover and a cover (and spread) of ELLE Australia shot with it. The limitation, of course, has been that it’s only accessible through the native iOS camera app with Portrait Mode selected, but that barrier no longer exists. Tucked away at WWDC recently, Apple’s imaging team let it be known that they’re opening the capability up to third-party apps, and that’s exciting.

shot on Portrait Mode – Fuji X-Pro2 Graphite

Not only is Apple allowing third-party apps to take advantage of the depth information gathered from the dual cameras, but the simulated depth of field is also getting a general overhaul, meaning better and more realistic background-defocus rendering, more control over that defocus, and cleaner edges.

To appreciate the device, and to grasp what’s possibly coming for iPhones and mobile photography in general, it helps to understand depth and how iPhones create that shallow depth-of-field look.

In the physical world, depth is simply the distance between you and an observed object, and the iPhone measures it using the disparity, or parallax, calculated from both lenses at once. You can get an idea of parallax by focusing on an object in front of you and alternately closing each eye: the object appears to shift. That same disparity, seen through the two lenses working in tandem, is used to gauge depth, and from it a depth map is created.
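The relationship described above can be sketched with the textbook stereo formula, where depth falls off as disparity grows. This is a minimal illustration of the geometry, not Apple’s actual pipeline, and the baseline and focal-length numbers below are made up:

```python
# Textbook stereo relation: depth is inversely proportional to disparity.
# Illustrative only -- the baseline (distance between the two lenses) and
# focal length here are invented numbers, not iPhone hardware specs.

def depth_from_disparity(disparity_px, baseline_mm=10.0, focal_px=2000.0):
    """Estimate depth (mm) from the pixel disparity between two lenses.

    Z = f * B / d: the larger the shift between the two views,
    the closer the object.
    """
    if disparity_px <= 0:
        return float("inf")  # no measurable shift => effectively at infinity
    return focal_px * baseline_mm / disparity_px

# A nearby object shifts a lot between the two views; a far one barely moves.
near = depth_from_disparity(40.0)  # large shift -> small depth (500 mm)
far = depth_from_disparity(2.0)    # small shift -> large depth (10,000 mm)
```

Run this per pixel across the whole frame and the grid of Z values you get is, in essence, a depth map.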

In Portrait Mode, the dual camera locks onto the longer lens’ narrower field of view (56mm vs. 28mm), but uses image data from both the wide and tele lenses to generate that simulated shallow depth of field via a depth map.

The depth map is a transformation of a 3-dimensional scene into a 2-dimensional representation, and it’s the backbone of Portrait Mode and of what’s coming next. By opening up the CIDepthBlurEffect filter to developers, Apple is making the captured depth information available for them to tinker with, and that has implications for camera apps – your favorites may soon be able to shoot with depth, and controllable depth at that. Looking at you, Lightroom Mobile, Filmborn, and Halide.
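As a rough sketch of how a depth map drives simulated defocus – a generic illustration of the idea, not the internals of CIDepthBlurEffect – blur strength can simply grow with each pixel’s distance from the chosen focus plane:

```python
# A minimal sketch of depth-map-driven shallow depth of field:
# pixels farther from the focus plane get a larger blur radius.
# Generic illustration only -- not Apple's CIDepthBlurEffect internals,
# and the strength/clamp values are arbitrary.

def blur_radius_map(depth_map, focus_depth, strength=2.0, max_radius=10.0):
    """Map each depth sample (metres) to a blur radius in pixels."""
    return [
        min(max_radius, strength * abs(d - focus_depth))
        for d in depth_map
    ]

# A 1D "scanline" of depths: subject at 2 m, background receding to 8 m.
depths = [2.0, 2.0, 2.1, 5.0, 8.0]
radii = blur_radius_map(depths, focus_depth=2.0)
# The in-focus subject gets ~0 blur; the far background clamps at max_radius.
```

A real renderer would then convolve each pixel with a kernel of that radius; the point is that the depth map, not the optics, decides what stays sharp.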

This also has implications for processing and sharing apps. A demo given at the conference suggested the new CIDepthBlurEffect can be worked with to allow various levels of after-the-fact depth control: simulating various apertures, adding filters based on depth, and even using facial recognition to select up to four faces in a scene to keep in focus, with the depth-map behavior adjusting accordingly.
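The “simulating various apertures” part can be pictured as a simple rescaling: blur diameter grows with aperture diameter, which means it grows inversely with the f-number, so f/1.4 blurs twice as much as f/2.8. A toy sketch with made-up values, not Apple’s implementation:

```python
# Once a depth map has assigned each pixel a blur radius at some
# reference f-stop, re-rendering at a different simulated f-stop is
# (to first order) just a rescale: blur scales with aperture diameter,
# i.e. inversely with the f-number. Values below are illustrative only.

def rescale_blur(base_radius, base_f_number, new_f_number):
    """Rescale a blur radius computed at one f-stop to another f-stop."""
    return base_radius * (base_f_number / new_f_number)

# A background pixel blurred 4 px at a simulated f/4.5:
wide_open = rescale_blur(4.0, 4.5, 2.25)   # half the f-number -> double blur
stopped_down = rescale_blur(4.0, 4.5, 9.0) # double the f-number -> half blur
```

Because the depth map survives capture, that slider can be dragged long after the shot is taken – which is exactly the after-the-fact control the demo showed.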

This is where the iPhone 7 Plus and the next generation of dual-lens Apple iPhones are headed, and it makes the platform compelling. You’ll have to reach for truly advanced point & shoots to find a comparison, and it isn’t a stretch to think Apple will soon start targeting other high-end abilities, like autofocus performance, next.

This, essentially, is the advantage of the iPhone as a platform: it approaches photography not with the limitations of mechanical optics, but with software, and software plays on a field as wide as the heavens. It’s adaptable, and it can emulate. Is it optically true? No. But do most people care? Also no.

We’ll be bringing you more as this develops, along with more photography app reviews and phone reviews in the near future. Next up are Halide, Affinity Photo for iPad, and a review of the Huawei Mate 9 with its dual Leica lenses, but check out the other posts below: