A number of interesting insights come to light once one realizes that, as far as the slanted edge method (of measuring the Modulation Transfer Function of a Bayer CFA digital camera and lens from its raw data) is concerned, it is as if it were dealing with identical images behind three color filters, each in its own separate, full-resolution color plane:
Because each color filter lets through a different mix of wavelengths, each sensing plane will see an image with slightly different spatial characteristics. The differences show up in wavelength-dependent effects like diffraction and chromatic aberrations. We dealt with the effect of the spectral sensitivity of CFA filters on diffraction in an earlier post; in this one we will take a closer look at lateral and longitudinal Chromatic Aberrations (CA).
Chromatic Aberrations: Shifts in xy and z
Lateral (or Transverse) CA is the result of imperfect optics that are unable to place different wavelengths at exactly the same position on the sensing plane. The edge will be slightly shifted and magnified up or down, left or right (along the x and y axes) in the images of the three color planes, depending on its distance from the center, as hinted at by the different position of the gray edge in Figure 1. If the raw color channels are used as-is to demosaic an image in a standard RGB color space, they will produce the well known reddish-greenish-bluish fringes around areas of high contrast. Most lenses are designed to keep Lateral CA under control, but because the three color planes are captured separately in the raw data it is relatively easy to re-size and re-align them before assembling them into a rendered RGB image. Therefore Lateral CA can often be partly compensated for during raw conversion.
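To make the idea concrete, here is a minimal sketch of such a correction, assuming the red and blue planes have already been extracted from the raw file as separate 2D arrays and that their magnifications and offsets relative to the green plane have been estimated beforehand; the function name and the numerical values in the usage comments are purely illustrative, not part of any raw converter's actual pipeline.

```python
# Minimal sketch: re-scale and re-align the red and blue raw planes onto the
# green plane before demosaicing, to reduce lateral CA.  The scale factors and
# shifts are hypothetical; in practice they would be estimated by registering
# each color plane against the green plane (or read from lens correction data).
import numpy as np
from scipy import ndimage

def correct_lateral_ca(plane, scale, shift):
    """Magnify `plane` by `scale` about its center, then translate by `shift` (row, col) pixels."""
    center = (np.array(plane.shape) - 1) / 2.0
    # affine_transform maps each output coordinate o to input coordinate matrix*o + offset
    matrix = np.array([1.0 / scale, 1.0 / scale])
    offset = center * (1.0 - 1.0 / scale) - np.asarray(shift) / scale
    return ndimage.affine_transform(plane, matrix, offset=offset, order=1)

# r_plane, g_plane, b_plane: the three CFA color planes as 2D float arrays.
# Hypothetical correction parameters:
# r_corrected = correct_lateral_ca(r_plane, scale=1.0008, shift=(0.2, -0.1))
# b_corrected = correct_lateral_ca(b_plane, scale=0.9994, shift=(-0.3, 0.1))
```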
Longitudinal (or Axial) Chromatic Aberration will instead cause the edges in the three color planes to come to focus at slightly different distances (along the z axis) from the lens principal plane, as hinted at by the blue arrows in Figure 1. Assuming the sensor is positioned so that the green channel is perfectly in focus, both the red and the blue color planes will see very slightly out-of-focus images, reducing the overall sharpness of the final photograph and producing effects like purple fringing around tree branches against a bright sky.
How far out of focus they are will depend on how well the lens has been corrected for Longitudinal CA. Unfortunately there is no easy way to recover the spatial information lost to defocus without an accurate depth map of the scene, which we normally do not have, so there is little or nothing one can do to minimize the effects of Longitudinal CA during raw conversion. That makes having a well-corrected lens in the first place a very desirable feature.
Measuring Chromatic Aberrations via MTF50
Frans van den Bergh has provided, as part of his excellent open-source MTF Mapper, a test chart that allows one to determine best focus and Longitudinal CA from a single capture.
Jim Kasson has independently performed a marvelous series of tests documenting this effect for a number of mid-tele primes; I highly recommend the read. He took many raw captures of a slanted-edge chart in manual focus mode, moving the camera 4 mm toward the target between shots without touching focus, thereby effectively walking it through back focus, perfect focus and front focus. He then used MTF Mapper to extract the MTF performance of the individual raw color planes and graciously let me have some of his data. Here are the MTF50 values of 49 such captures from a Sony a7RII coupled with an FE 55mm lens at f/1.8, at the center of the field of view:
There’s Longitudinal CA in all its glory. It looks like this lens is well corrected for blue but less so for red; apparently this is typical. The question that immediately popped into my mind was: where should the camera focus for best overall sharpness? Hence the journey described in the last three articles. For instance, we would not want it to focus based on the peak of the red plane.
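As a rough illustration of how one might locate each color plane's best-focus position from a focus-bracketed series like this, the sketch below fits a parabola to the MTF50 readings around each plane's maximum and returns the position of the vertex. The variable names, the 4 mm step and the data arrays are assumptions standing in for MTF Mapper's per-plane output, not Jim's actual processing.

```python
# Sketch: estimate best-focus position per color plane from MTF50 vs. camera
# position, by fitting a parabola around the observed maximum.
import numpy as np

def peak_focus(positions, mtf50, window=7):
    """Fit a parabola to the samples around the maximum and return its vertex position."""
    i = int(np.argmax(mtf50))
    lo, hi = max(0, i - window // 2), min(len(mtf50), i + window // 2 + 1)
    a, b, _ = np.polyfit(positions[lo:hi], mtf50[lo:hi], 2)
    return -b / (2 * a)                      # position where the fitted parabola peaks

# positions = np.arange(49) * 4.0            # mm, hypothetical 4 mm steps
# print(peak_focus(positions, mtf50_g))      # green best focus
# print(peak_focus(positions, mtf50_r))      # red peaks elsewhere -> Longitudinal CA
```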
It turns out that most cameras use the green color plane for focusing these days. With a well-corrected lens, one of the other two color planes should not be too far off, while the third will probably be somewhat defocused, as above, dragging down system MTF.
This has interesting implications for the perceived sharpness of the final displayed image. For instance, the raw capture of a ColorChecker 24 target in DPReview’s Studio Scene, taken with the Sony a7RII and FE 55mm/1.8 lens under D65 lighting, suggests the following transformation of the white-balanced raw data to the luminance channel of the connection color space:
Y = cR·R + cG·G + cB·B        (1)

where R, G and B are the white-balanced raw color planes and cR, cG, cB the corresponding luminance coefficients.
This is the luminance/grayscale/luminosity/etc. channel that a good demosaicer should produce for final display to the viewer in the output color space of choice. So these are also the proportions in which the individual MTF curves from each color plane should be combined in order to obtain a perceptually relevant grayscale System MTF for that camera and lens, from which relevant metrics like MTF50 can be derived.
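By way of illustration, a minimal sketch of that combination might look like the following, assuming the three per-plane MTF curves have been measured on a common frequency axis (in cycles/pixel). Apart from the blue coefficient quoted in the next paragraph, the weights are placeholders standing in for the luminance row of transformation (1); the function names are mine, not MTF Mapper's.

```python
# Sketch: combine the per-plane MTF curves into a perceptually weighted
# grayscale System MTF and read off its MTF50.
import numpy as np

def system_mtf(mtf_r, mtf_g, mtf_b, weights):
    """Weighted sum of per-plane MTFs, renormalized so that MTF(0) = 1."""
    w_r, w_g, w_b = weights
    mtf = w_r * mtf_r + w_g * mtf_g + w_b * mtf_b
    return mtf / mtf[0]

def mtf50(freq, mtf):
    """Lowest frequency at which the MTF falls to 0.5 (linear interpolation)."""
    below = np.where(mtf < 0.5)[0]
    if below.size == 0:
        return None                          # MTF never drops below 0.5 in the measured range
    i = below[0]
    f0, f1, m0, m1 = freq[i - 1], freq[i], mtf[i - 1], mtf[i]
    return f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)

# weights = (w_r, w_g, -0.2923)   # w_r, w_g from the luminance row of (1); blue as quoted below
# sys = system_mtf(mtf_r, mtf_g, mtf_b, weights)
# print(mtf50(freq, sys))
```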
Note that the coefficient for the blue plane is negative, in this case -0.2923. Subtracting a blurred version of an image from the image itself is a well-known post-processing sharpening trick called unsharp masking (USM). And in fact the subtraction of the blue channel in the proportion above will increase the colorimetric System MTF compared to the input-referred System MTF discussed in the previous article. Nevertheless, this mix of spatial frequencies is closest to what will actually be presented to, and thus perceived by, the Human Visual System.
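For readers unfamiliar with the technique, here is a bare-bones sketch of unsharp masking; the blur radius and strength are arbitrary illustrative values, not tuned for any particular camera.

```python
# Sketch of unsharp masking: subtract a Gaussian-blurred copy of the image
# from the image itself and add back a scaled amount, boosting higher
# spatial frequencies.  Expects a floating-point image array.
import numpy as np
from scipy import ndimage

def unsharp_mask(img, sigma=2.0, amount=0.5):
    blurred = ndimage.gaussian_filter(img, sigma)
    return img + amount * (img - blurred)
```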