Seeing What's in Focus

An InBox question made me realize that no one is really helping you with regard to depth of field these days.

Of course, no one was really helping you before, either. ;~)

Let’s start with the film era. The lens had a focus scale on it, usually with DOF markings. The “drill” was to set your exposure so you knew your aperture, then look at the ring on the lens and see what “was in focus.”

Uh, no.

First off, that focus scale was produced via some form of Zeiss-style calculation. The Zeiss method is just one of several theories about depth of field, and is based upon the likely perception of differences by a human with certain eyesight viewing a particular-sized print at a certain distance. As if to rub salt in the wound, the Japanese lens makers used weird rounded numbers, too. Canon for a long time used a Circle of Confusion of 0.035mm. Others used 0.033mm or just 0.030mm. Some made clear math errors in their calculations at times. The Zeiss method technically should have you using 0.025mm for 20/40 vision when viewing a smallish print at a modest distance. 
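To see how much that Circle of Confusion choice actually matters, here's a minimal sketch of the classic hyperfocal-distance DOF math. The lens, aperture, and distance values are just illustrative assumptions, not anything from a specific lens maker's tables:

```python
# Sketch of the classic thin-lens DOF formulas; the values used
# below are illustrative assumptions, not recommendations.

def dof_limits(focal_mm, f_number, subject_mm, coc_mm):
    """Return (near, far) limits of acceptable focus, in mm.

    Uses the standard hyperfocal-distance approximation:
        H = f^2 / (N * c) + f
    """
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    # Once the subject is at or beyond hyperfocal, the far limit is infinite.
    if subject_mm >= h:
        return near, float("inf")
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return near, far

# Hypothetical example: 50mm lens at f/8, subject at 3m,
# run with the different CoC choices mentioned above.
for coc in (0.035, 0.030, 0.025):  # mm
    near, far = dof_limits(50, 8, 3000, coc)
    print(f"CoC {coc}mm: in focus from {near/1000:.2f}m to {far/1000:.2f}m")
```

Tightening the CoC from 0.035mm to 0.025mm visibly shrinks the computed zone of "acceptable" focus. The markings engraved on your lens baked one of these choices in, and never told you which.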

Now you can see why I wrote “no one was helping you before.”

But it gets worse. 

With the advent of low dispersion glass and phase detect autofocus, another thing happened: lenses started moving the focus plane slightly depending upon the temperature. If there was a scale on the lens, you might have noticed that it went past infinity. That’s because at certain temperatures, actual focused infinity was past the infinity mark on the lens. Phase detect didn’t care what the distance on the lens said, it just moved the lens so that focus was “right.”

Of course, the camera makers were generous: they gave you a depth of field preview button on the camera so you could see a really, really dim view of the scene on a coarse focus screen, which might or might not tell you what was in focus. That assumed your eyes could adjust to the dim view, which landscape photographers working in bright light almost never could. 

With helpers like the lens and camera makers, who needs enemies? ;~)

Then came DSLRs, and some new variables came into play. With the older, low pixel count cameras, you might not be resolving enough to use Zeiss-level precision, particularly since some of the early demosaic routines also did pixel assessments well beyond the Circle of Confusion Zeiss implied you should use. 
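One way to see the pixel-count issue is to compare a sensor's pixel pitch against the Zeiss-style CoC. The sensor widths and pixel counts below are my own assumed example values, not figures from any particular camera spec sheet:

```python
# Compare per-pixel detail size against a common full-frame
# Circle of Confusion. Sensor dimensions are assumed examples.

COC_MM = 0.030  # a commonly used full-frame CoC value

def pixel_pitch_mm(sensor_width_mm, horizontal_pixels):
    """Width of one pixel on the sensor, in mm."""
    return sensor_width_mm / horizontal_pixels

cameras = {
    "early 6MP DSLR (APS-C example)": (23.7, 3008),
    "45MP mirrorless (full-frame example)": (35.9, 8256),
}

for name, (width_mm, px) in cameras.items():
    pitch = pixel_pitch_mm(width_mm, px)
    print(f"{name}: pitch {pitch * 1000:.1f} microns, "
          f"CoC spans ~{COC_MM / pitch:.1f} pixels")
```

The point of the comparison: on an early low-pixel-count sensor, a 0.030mm blur circle covers only a handful of pixels, and demosaic routines that averaged across neighboring pixels ate into that margin further, so Zeiss-level precision in the DOF markings was somewhat moot.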

The question that came in via email was regarding using laser rulers to figure out distances, and how you’d apply what they said on the camera. 

Fuhgeddaboudit. You bought a mirrorless camera. You have the answer already.

It’s the reason why we all want higher resolution EVFs, actually. My cameras are programmed with a control that allows me to zoom in and look at focus directly at 100%. On the Nikon bodies, the lens is already stopped down if I’m using any aperture between f/1.2 and f/5.6. If I’m at f/6.3 or higher, I first flip the switch to video mode to see the active aperture results, which are honored for video. If that's not enough, I can program a button for DOF Preview. Either way, I get to look at the pixel level results in a nice bright viewfinder that can be zoomed, scrolled, and seen in any light. (Other camera makers treat apertures differently, but all the mirrorless cameras have a way of evaluating DOF in the viewfinder with magnification.)

Of course, evaluating the near and far focus points is something you'll have to practice, and you'll need to develop your own standards for what’s acceptable. Still, we're better off with current mirrorless cameras than we've ever been, at least if you can trust your eyes.


sansmirror: all text and original images © 2023 Thom Hogan
portions Copyright 1999-2022 Thom Hogan
All Rights Reserved — the contents of this site, including but not limited to its text, illustrations, and concepts, 
may not be utilized, directly or indirectly, to inform, train, or improve any artificial intelligence program or system.