

A Primer On Phase Detection Autofocus VS. Contrast Detection

By Kishore Sawh on December 21st 2015


Not even a fortnight ago, you may have seen a review of the Sony A7 II on SLR Lounge. It’s a lovely camera by almost any measure, made significantly better and more capable recently via firmware updates. One of the crowning achievements of those updates is that the A7 II stole some of the spotlight from its rather more hyped-up sibling, the A7R II, which isn’t exactly an easy thing to do, given the A7R II has been nabbing column inches and media spots like Miley Cyrus. So how did the non-R version do it? Largely through autofocus.

As mentioned in the review, autofocus has been a sore point for the A7 family from the get-go; it was so poor it seemed a mismatch for an otherwise brilliant camera. This is all relative, of course, but the fact that the original A7R didn’t have phase detection AF at all was perplexing enough that in the next evolution of the A7 line, autofocus was clearly billed as a top priority: the A7R II not only has phase detection, it was the first to allow it with non-native lenses.

This was a big deal, a first for the line, and enough for many users to justify the cost. But now the A7 II has it too. It became clear after that review that there’s a lot of confusion about what phase detection AF actually is and why it’s touted as ‘better.’ In an effort to clarify, I’d like to give you a basic explanation and then direct you to the video below from Techquickie. It’s short and covers the major points, though quite how he managed to fit so many words into four minutes is beyond me.


Autofocus systems are rather complex, and truly grasping them from the ground up requires a technical expertise I neither have, nor have the time or inclination to acquire. I assume the same is true of most of you, so here’s the five-second breakdown:

Phase detection is the more modern approach, and relies on dedicated sensor tech that reads light entering the lens as either in phase or out of phase. ‘In phase’ refers to when light rays entering the camera equally illuminate opposite sides of the lens. Because the phase difference reveals both the direction and the amount of focus adjustment needed, it’s much faster than contrast detection, but it typically requires more tech inside the camera, which generally means a size and cost penalty.
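
If you’d like to see the idea in code, here’s a toy sketch, and only that: nothing below comes from any real camera’s firmware. A PDAF pair effectively sees the same scene detail shifted along two strips of photodiodes, and cross-correlating the two strips yields both the direction and the magnitude of the shift in a single measurement. The `phase_shift` function and the synthetic ‘scene’ are invented purely for illustration.

```python
import numpy as np

def phase_shift(left: np.ndarray, right: np.ndarray) -> int:
    """Estimate the shift (in samples) between two 1-D intensity
    profiles, as a PDAF sensor pair might see them. The sign gives
    the direction of defocus, the magnitude how far focus must move."""
    # Cross-correlate the zero-mean signals; the location of the
    # correlation peak relative to zero lag is the phase offset.
    left = left - left.mean()
    right = right - right.mean()
    corr = np.correlate(left, right, mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)

# A toy edge in the scene, seen shifted by 3 samples on one strip.
scene = np.zeros(32)
scene[10:16] = 1.0
shifted = np.roll(scene, 3)
print(phase_shift(shifted, scene))  # prints 3
```

Note that one calculation gives the answer outright; there’s no trial-and-error, which is exactly why phase detection is fast.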

Contrast detection, on the other hand, is arguably more accurate than phase detection, and the tech is less expensive and more diminutive, but as of right now the penalty is speed: it can only adjust focus until contrast peaks, so it’s quite a bit slower.
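
And a similarly hand-wavy sketch of contrast detection: with no phase information, the camera can only nudge focus, re-measure contrast, and reverse when contrast drops, which is exactly the hunting behavior that makes it slower. All the names here (`cdaf`, `simulated_frame`, `contrast`) and the focus-motor model are made up for illustration.

```python
import numpy as np

def contrast(image: np.ndarray) -> float:
    """Simple contrast metric: variance of pixel intensities.
    Sharper images have stronger edges, hence higher variance."""
    return float(image.var())

def simulated_frame(focus: float, best: float = 50.0) -> np.ndarray:
    """Stand-in for the sensor: a test edge, blurred more the
    further the pretend focus position is from the sharp point."""
    x = np.linspace(-1, 1, 200)
    blur = 0.05 + abs(focus - best) / 50.0
    return np.tanh(x / blur)  # a soft edge; crisper when blur is small

def cdaf(start: float, step: float = 8.0) -> float:
    """Hill-climb the contrast metric. Note the seeking: we only
    learn we passed the peak once contrast drops, then reverse and
    halve the step - the hunting CDAF is known for."""
    pos, direction = start, 1.0
    best_c = contrast(simulated_frame(pos))
    while step > 0.25:
        nxt = pos + direction * step
        c = contrast(simulated_frame(nxt))
        if c > best_c:
            pos, best_c = nxt, c  # still climbing
        else:
            direction, step = -direction, step / 2  # overshot: reverse
    return pos

print(round(cdaf(10.0)))  # prints 50, the simulated sharp point
```

Compare this with the phase approach: here the lens must physically hunt back and forth past the peak before settling, which is the speed penalty in a nutshell.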

[REWIND: Sony A7II | Proof Size Isn’t Everything, It’s How You Use It]

Does whether a camera has phase detection or only contrast detection affect your purchase? We’d love to hear. Two or three years ago, it was DSLRs that had phase detection and mirrorless that largely didn’t. This was a significant difference between the two systems, and on the whole DSLRs can still offer much better AF ability. But the gap between the AF speeds of DSLRs and mirrorless used to be like the gap between Mercedes Petronas and Force India; now it’s more like the one between Lewis Hamilton and Nico Rosberg.

Source: Techquickie


A photographer and writer based in Miami, Kishore can often be found at dog parks, and airports in London and Toronto. He is also a tremendous fan of flossing and the happiest guy around when the company’s good.

Q&A Discussions


  1. Dave Haynie

    Pretty good tutorial.

    The key with phase detection is that the phase calculation gives both the direction and amount of focus correction needed, which is why it’s fast. If the lens focus calibration is known to the camera, this can happen very quickly, since the camera knows exactly how much to change focus to get very, very close to the focus point. That’s why it’s been challenging to get adapted lenses focusing as fast as native lenses on many camera systems. There’s also a focus calibration on higher-end cameras: because the PDAF sensor isn’t exactly the image sensor, there can be minute differences between the detected phase and the actual focus point.

    Contrast detection, by contrast (sic), works on whatever image you actually have, so no calibration is needed. But it’s kind of like GPS… there’s no way to tell which direction you need to move until you start moving, and no way to tell you’re there until you’re right past the focus point… thus all the seeking behavior you get with CDAF.

    You’ll also find that still cameras and video cameras use different contrast detect algorithms, which is why many still cameras either don’t do active video autofocus, or do it poorly. The fast seeking you get for a still image becomes very visible in video… and while I should point out that AF in video is a bad idea anyway most of the time, people will still want it.

    Sony and perhaps a few others offer “Hybrid” autofocus. This uses PDAF for fast, very close focusing, and then contrast detection for the final focus tweaking. But I think that’s a fairly temporary thing. Once you have PDAF integrated on-sensor, like the Olympus OM-D E-M1 or the Sony A7R II, then you have the same advantage as CDAF: the image focus is being measured at the sensor plane, not via a separate sensor. So no calibration is ever needed.

    The other recent innovation is Panasonic’s Depth from Defocus technology. This is image-processing technology, pure software, but pretty clever. They have very detailed characterizations of their DFD-supported lenses, so a modern camera DSP can analyze the defocused image and predict the amount of focusing needed. Of course, this lens data has to be in-camera, so it’s only supported by lenses that can supply it. But it works much like a hybrid system: the image is sampled, the DFD equations applied, the lens moved very fast to a close focus, and then normal CDAF takes over from there. It’s basically a higher level of CDAF, but pretty cool nonetheless. And of course, it only works with recent Panasonic lenses on recent Panasonic bodies.

  2. Stephen Glass

    There was another thread about calibrating lenses recently. It shows that while fine-tuning lenses is a function of our cameras, there are SO many other issues involved in why a particular lens may or may not be focusing. It’s even more complicated than what little research I’ve done reveals.
    Now combine that with the ever-increasing megapixel count and the pixel-peeping ability it gives us. I agree with another post by Kishore that the whole MP race is a huge can of worms. Ultimately I think it’ll drive the rest of the technology to a good place. In the meanwhile I often find myself looking at my images at 50%, considering their final implementation, and saying, “That’s as good as it gets.”

  3. Colin Woods

    I have often wondered about this but never deeply enough to actually look it up. Now I know. Thanks, Kishore.

  4. Marco Introini

    Thanks a lot for this concise but useful information! Great work!

  5. Tommy Anderson

    great post, thank you

  6. Stephen Glass

    You da man, K! Thanks, this is great info.

  7. Joseph Ford

    Great tutorial, his explanation is very clear and concise.
