Tag Archives: line spread function

DOF and Diffraction: Setup

The two-thin-lens model for precise Depth of Field estimates described in the last two articles is almost ready to be deployed.  In this article we describe the setup that will be used to develop the scenarios outlined in the next one.

The beauty of the hybrid geometrical-Fourier optics approach is that, with an estimate of the field produced at the exit pupil by an on-axis point source, we can generate the image of the resulting Point Spread Function and related Modulation Transfer Function.
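As a rough illustration of that pipeline, here is a minimal numpy sketch (not the code behind these articles): a circular exit pupil carrying an assumed defocus and spherical aberration phase is Fourier transformed into a PSF, whose transform in turn yields the MTF.  The grid size and aberration coefficients below are placeholders for illustration only.

# Minimal sketch: pupil function -> PSF -> MTF.  Coefficients are in waves
# and are assumptions for illustration, not the values used in the articles.
import numpy as np

N = 512                                   # samples across the pupil plane
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
rho2 = X**2 + Y**2                        # normalized pupil radius squared
pupil = (rho2 <= 1.0).astype(float)       # clear circular aperture

W020 = 0.5                                # defocus coefficient, waves (assumed)
W040 = 1.0                                # 3rd-order spherical aberration, waves (assumed)
phase = 2 * np.pi * (W020 * rho2 + W040 * rho2**2)
pupil_field = pupil * np.exp(1j * phase)

psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil_field)))**2
psf /= psf.sum()                          # normalized Point Spread Function

mtf = np.abs(np.fft.fft2(psf))            # OTF magnitude
mtf /= mtf[0, 0]                          # normalize so MTF(0) = 1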

Pretend that you are a photon from such a source in front of an f/2.8 lens focused at 10m with about 0.60 microns of third order spherical aberration – and you are about to smash yourself onto the ‘best focus’ observation plane of your camera.  Depending on whether you leave exactly from the in-focus distance of 10 meters or slightly before/after that, the impression you would leave on the sensing plane would look as follows:

Figure 1. PSF of a lens with about 0.6um of third order spherical aberration focused at 10m.

The width of the square above is 30 microns (um), which corresponds to the diameter of the Circle of Confusion used for old-fashioned geometrical DOF calculations with full frame cameras.  The first ring of the in-focus PSF at 10.0m has a diameter of about $2.44\,\lambda\,\frac{f}{D} = 3.65$ microns.   That’s about the size of the estimated effective square pixel aperture of the Nikon Z7 camera that we are using in these tests.
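For reference, plugging in an assumed mean wavelength of about 0.535 um (not restated in this excerpt):

$2.44\,\lambda\,\frac{f}{D} \approx 2.44 \times 0.535\,\mu m \times 2.8 \approx 3.65\,\mu m$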

DOF and Diffraction: Image Side

This investigation of the effect of diffraction on Depth of Field is based on a two-thin-lens model, as suggested by Alan Robinson[1].  We chose this model because it allows us to associate geometrical optics with one lens and Fourier optics with the other, thus simplifying the underlying math and our understanding.

In the last article we discussed how the front lens of the model presents at the rear lens the wavefront resulting from an on-axis source, as a function of its distance from the lens.  We accomplished this by using simple geometry in complex notation.  In this one we will take the relative wavefront present at the exit pupil and project it onto the sensing plane, taking diffraction into account numerically.  We already know how to do this, since we dealt with the subject in the recent past.
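For the curious, here is a hedged sketch of that numerical step (again, not the code used for these figures): sweep an assumed defocus coefficient at the exit pupil, compute the diffraction PSF on the sensing plane, and extract an MTF50 figure from it, mimicking the through-focus curves of Figure 1 below.  Mapping defocus in waves to object distance is the previous article's geometry and is not shown here.

# Hedged sketch: through-focus PSF and MTF50 from an aberrated circular pupil.
# Units and coefficients are illustrative assumptions.
import numpy as np

def psf_from_pupil(W020, W040=1.0, N=256):
    """PSF from a circular pupil with defocus W020 and spherical W040 (waves)."""
    x = np.linspace(-1, 1, N)
    X, Y = np.meshgrid(x, x)
    rho2 = X**2 + Y**2
    pupil = (rho2 <= 1.0) * np.exp(2j * np.pi * (W020 * rho2 + W040 * rho2**2))
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
    return psf / psf.sum()

def mtf50(psf):
    """Frequency (cycles/sample) where a one-axis cut of the MTF first drops below 0.5."""
    mtf = np.abs(np.fft.fft2(psf))
    mtf /= mtf[0, 0]
    profile = mtf[0, :psf.shape[1] // 2]          # 1-D cut along one frequency axis
    freqs = np.arange(profile.size) / psf.shape[1]
    below = np.nonzero(profile < 0.5)[0]
    return freqs[below[0]] if below.size else freqs[-1]

# Sweep an assumed defocus range to mimic a through-focus curve like Figure 1.
for W020 in np.linspace(-2, 2, 9):
    print(f"W020 = {W020:+.2f} waves  ->  MTF50 ~ {mtf50(psf_from_pupil(W020)):.3f} cy/sample")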

Figure 1. Where is the plane with the Circle of Least Confusion?  Through-focus Line Spread Function image of a lens at f/2.8 with the indicated third order spherical aberration coefficient, together with the corresponding ‘sharpness’ measures: MTF50 and Acutance curves.  Acutance is scaled to the same peak as MTF50 for ease of comparison and refers to my typical pixel-peeping conditions: 100% zoom, 16″ away from my 24″ monitor.


DOF and Diffraction: Object Side

In this and the following articles we shall explore the effects of diffraction on Depth of Field through a two-lens model that separates geometrical and Fourier optics in a way that keeps the math simple, albeit in complex notation.  In the process we will gain a better understanding of how lenses work.

The results of the model are consistent with what can be obtained via classic DOF calculators online, but should be more precise in critical situations like macro photography.  I am not a macro photographer, so I would be interested in having the method's results validated by someone who is.
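For comparison, this is roughly what those classic calculators compute: the purely geometric, diffraction-free thin-lens formulas.  The focal length, f-number, circle of confusion and subject distance in the sketch below are assumed example values, not the article's.

# Classic geometric DOF formulas used by online calculators (no diffraction).
def geometric_dof(f_mm, N, c_mm, s_mm):
    """Return (near_limit, far_limit) in mm; far_limit is inf at or beyond the hyperfocal."""
    H = f_mm**2 / (N * c_mm) + f_mm                          # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = float('inf') if s_mm >= H else s_mm * (H - f_mm) / (H - s_mm)
    return near, far

# Assumed example: 50mm lens at f/2.8, 30um CoC, subject at 10m.
near, far = geometric_dof(f_mm=50, N=2.8, c_mm=0.030, s_mm=10_000)
print(near, far)   # ~7494 mm and ~15022 mm, i.e. roughly 7.5 m to 15 m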

Figure 1. Simple two-thin-lens model for DOF calculations in complex notation.  Adapted under licence.


The Units of Discrete Fourier Transforms

This article is about specifying the units of the Discrete Fourier Transform of an image and the various ways that they can be expressed.  This apparently simple task can be fiendishly unintuitive.

The image we will use as an example is the familiar Airy Pattern from the last few posts, at f/16 with light of mean wavelength 530nm.  It is shown zoomed in on the left of Figure 1, and as it appears in its 1024×1024-sample image on the right:

Figure 1. Airy disc image I(x,y). Left, 1a, 3D representation, zoomed in. Right, 1b, as it would appear on the sensing plane (yes, the rings are there but you need to squint to see them).
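As a taste of the units question, here is a small numpy sketch showing how the frequency axes of such a DFT can be labeled.  The 1024×1024 grid matches the sample image above, while the 1 um sample pitch is purely an assumption to make the unit conversions concrete.

# Labeling the frequency axes of a DFT.  The sample pitch is an assumed value.
import numpy as np

N = 1024
pitch_um = 1.0                                  # assumed sample spacing, microns

f_per_sample = np.fft.fftfreq(N)                # cycles/sample (a.k.a. cycles/pixel)
f_per_um = np.fft.fftfreq(N, d=pitch_um)        # cycles/micron
f_per_mm = f_per_um * 1000.0                    # cycles/mm (= line pairs/mm)

print(f_per_sample.max())   # just under the 0.5 cy/px Nyquist frequency
print(f_per_mm.max())       # just under 500 lp/mm with the assumed 1 um pitch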


Combining Bayer CFA MTF Curves – II

In this and the previous article I discuss how Modulation Transfer Functions (MTF) obtained from the raw data of each color channel of a Bayer CFA can be combined to provide a meaningful composite MTF curve for the imaging system as a whole.

There are two ways that this can be accomplished: an input-referred approach (L) that reflects the performance of the hardware only; and an output-referred one (Y) that also takes into consideration how the image will be displayed.  Both are valid and the differences are typically minor, though the weights of the latter are scene-, camera/lens- and illuminant-dependent – while the former are not.  Therefore my recommendation in this context is to stick with input-referred weights when comparing cameras and lenses.
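To make the idea concrete, here is a hedged sketch of such a weighted combination.  The stand-in MTF curves and the Rec. 709 luminance weights below are placeholders for illustration only; they are not the input-referred or output-referred weights actually derived in the article.

# Hedged sketch: composite system MTF as a weighted sum of per-channel MTFs.
import numpy as np

freq = np.linspace(0, 0.5, 64)                        # cycles/pixel
mtf_r, mtf_g, mtf_b = (np.exp(-k * freq) for k in (4.0, 3.0, 5.0))   # stand-in curves

w = np.array([0.2126, 0.7152, 0.0722])                # assumed output-referred (Y) weights
mtf_system = w[0] * mtf_r + w[1] * mtf_g + w[2] * mtf_b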

MTF Mapper vs sfrmat3

Over the last couple of years I’ve been using Frans van den Bergh’s excellent open source MTF Mapper to measure the Modulation Transfer Function of imaging systems off a slanted edge target, as you may have seen in these pages.  As long as one understands how to get the most out of it, I find it a solid product that gives reliable results, with MTF50 typically well within 2% of actual values in less than ideal real-world situations (see below).  I had little to compare it to other than tests published by gear testing sites: they apparently mostly use a commercial package called Imatest for their slanted edge readings, and MTF Mapper seemed to correlate well with those.

Then recently Jim Kasson pointed out sfrmat3, the Matlab program written by Peter Burns, a slanted edge method expert who worked at Kodak and was a member of the committee responsible for ISO 12233, the resolution and spatial frequency response standard for photography.  sfrmat3 is considered a solid implementation of the standard and many, including Imatest, benchmark against it – so I was curious to see how MTF Mapper 0.4.1.6 would compare.  It did well.


The Units of Spatial Resolution

Several sites for photographers perform spatial resolution ‘sharpness’ testing of a specific lens and digital camera setup by capturing a target.  You can also measure your own equipment relatively easily to determine how sharp your hardware is.  However, comparing results from site to site, and to your own, can be difficult and/or misleading, starting with the multiplicity of units used: cycles/pixel, line pairs/mm, line widths/picture height, line pairs/image height, cycles/picture height, etc.

This post will address the units involved in spatial resolution measurement, using readings from the popular slanted edge method as an example, although their applicability is generic.
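As a preview of the conversions involved, here is a short sketch with assumed example values; the pixel pitch and picture height below are illustrative, not measurements.

# Converting a spatial resolution reading between common units.
pixel_pitch_mm = 0.00435        # assumed ~4.35 um pixel pitch
height_px = 5504                # assumed picture height in pixels

mtf50_cy_px = 0.30              # an example reading in cycles/pixel

lp_mm = mtf50_cy_px / pixel_pitch_mm          # line pairs (cycles) per mm
c_ph  = mtf50_cy_px * height_px               # cycles per picture height
lw_ph = 2 * c_ph                              # line widths per picture height

print(lp_mm, c_ph, lw_ph)       # ~69 lp/mm, ~1651 C/PH, ~3302 LW/PH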


How to Get MTF Performance Curves for Your Camera and Lens

You have obtained a raw file containing the image of a slanted edge captured with good technique.  How do you get the Modulation Transfer Function of the camera and lens combination that took it?  Download and feast your eyes on open source MTF Mapper version 0.4.16 by Frans van den Bergh.

[Edit, several years later: MTF Mapper has kept improving over time, making it in my opinion the most accurate slanted edge measuring tool available today, used in applications that range from photography to machine vision to the Mars Rover.   Did I mention that it is open source?

It now sports a Graphical User Interface which can load raw files and allows the arbitrary selection of individual edges by simply pointing and clicking, making this post largely redundant.  The procedure outlined will still work but there are easier ways to accomplish the same task today.  To obtain the same result with raw data and version 0.7.38 just install MTF Mapper, set the “Settings/Preferences” tab as follows and leave all else at default:

“Pixel size” is only needed to also show SFR in units of lp/mm, and the “Arguments” field is only needed if using an unspecified raw data CFA layout.  “Accept” and “File/Open with manual edge selection” your raw files.  Follow the instructions to select as many edges as desired.  Then in “Data set” open an “annotated” file and shift-click on the chosen edges to see the corresponding MTF plots.]

The first thing we are going to do is crop the edges and package them into TIFF format so that MTF Mapper has an easier time reading them.  Let’s use as an example a Nikon D810 + 85mm f/1.8G ISO 64 studio raw capture by DPReview so that you can follow along if you wish.
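If you prefer a scripted route for that first step, here is a hedged sketch using the rawpy and tifffile Python packages; it is not the exact procedure of this article, and the file name and crop coordinates are placeholders you would replace with your own edge locations.

# Sketch: extract an undemosaiced crop around a slanted edge and save it as TIFF.
import rawpy
import tifffile

raw = rawpy.imread("D810_studio_scene.NEF")        # hypothetical file name
mosaic = raw.raw_image_visible                     # undemosaiced sensor data

top, left, size = 2000, 3000, 400                  # placeholder edge location; keep even to preserve Bayer phase
crop = mosaic[top:top + size, left:left + size].copy()

tifffile.imwrite("edge_crop.tif", crop)            # 16-bit grayscale TIFF for MTF Mapper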

The Slanted Edge Method

My preferred method for measuring the spatial resolution performance of photographic equipment these days is the slanted edge method.  It requires a minimum of additional effort compared to capturing and simply eye-balling a pinch, Siemens star or other test chart, but it gives more useful, accurate, quantitative information in the language and units that have been used to characterize optical systems for over a century: it produces a good approximation to the Modulation Transfer Function of the two-dimensional camera/lens system impulse response – at the location of the edge, in the direction perpendicular to it.
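To give a flavor of what happens under the hood, here is a much simplified numpy sketch of the slanted edge idea (not MTF Mapper's implementation): pixels are projected onto the edge normal to build an oversampled Edge Spread Function, which is differentiated into a Line Spread Function and Fourier transformed into an MTF.

# Simplified slanted edge sketch: ESF -> LSF -> MTF.
import numpy as np

def slanted_edge_mtf(roi, edge_angle_deg, bins_per_pixel=4):
    """roi: 2-D array containing one slanted edge tilted edge_angle_deg from vertical.
    Returns (freq in cycles/pixel, mtf)."""
    h, w = roi.shape
    y, x = np.mgrid[0:h, 0:w]
    theta = np.deg2rad(edge_angle_deg)
    # Signed distance of every pixel from a line through the ROI center at that angle.
    dist = (x - w / 2) * np.cos(theta) - (y - h / 2) * np.sin(theta)

    # Bin pixel values by distance to build an oversampled ESF (empty bins left at zero
    # in this simplified sketch; real implementations handle them more carefully).
    idx = np.round((dist - dist.min()) * bins_per_pixel).astype(int)
    counts = np.maximum(np.bincount(idx.ravel()), 1)
    esf = np.bincount(idx.ravel(), weights=roi.ravel().astype(float)) / counts

    lsf = np.diff(esf)                        # LSF = derivative of the ESF
    lsf *= np.hanning(lsf.size)               # window to tame truncation noise
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                             # normalize so MTF(0) = 1
    freq = np.fft.rfftfreq(lsf.size, d=1.0 / bins_per_pixel)   # cycles/pixel
    return freq, mtf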

Much of what there is to know about an imaging system’s spatial resolution performance can be deduced by analyzing its MTF curve, which represents the system’s ability to capture increasingly fine detail from the scene – starting with perceptually relevant metrics like MTF50, discussed a while back.

In fact the area under the curve, weighted by some approximation of the Contrast Sensitivity Function of the Human Visual System, is the basis for many other, better-accepted single-figure ‘sharpness’ metrics with names like Subjective Quality Factor (SQF), Square Root Integral (SQRI), CMT Acutance, etc.   And all this simply from capturing the image of a slanted edge, which one can actually and somewhat easily do at home, as presented in the next article.
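As an illustration of that kind of single-number metric, here is a hedged sketch that weights a stand-in MTF curve by one common analytic approximation of the CSF (the Mannos-Sakrison form) and normalizes by the CSF's own area.  The MTF curve and the conversion of frequencies to cycles/degree are placeholders for illustration.

# CSF-weighted area under an MTF curve, an acutance-like single figure of merit.
import numpy as np

f_cpd = np.linspace(0.1, 60, 600)                      # spatial frequency, cycles/degree
csf = 2.6 * (0.0192 + 0.114 * f_cpd) * np.exp(-(0.114 * f_cpd) ** 1.1)   # Mannos-Sakrison CSF

mtf = np.exp(-f_cpd / 20.0)                            # stand-in MTF re-expressed in cy/deg
acutance_like = np.trapz(mtf * csf, f_cpd) / np.trapz(csf, f_cpd)
print(acutance_like)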
