Last column I introduced you to diffraction. This time, I'm going to explain why you shouldn't pay it so much attention.
How do the component blurs in an optical system combine? Usually, the following equation is a good approximation to the total amount of blur you'll get:
TotalBlur² = Blur1² + Blur2² + Blur3² + ...
In English, the total blur squared is the sum of the squares of the individual blurs.
Just to make sure you understand this equation, imagine you have a camera with film and lens resolutions of 100 line pairs per millimeter (blur circles of 10 µ each). Then the total blur will be 14 µ and the final resolution will be 70 lp/mm.
Up the film resolution to 150 lp/mm, and the final resolution will be about 83 lp/mm. If your film resolves 200 lp/mm, then you're near 90 lp/mm. That's close enough to the resolution limit of the lens that further improvements in film aren't going to produce a visible gain.
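If you'd rather check the arithmetic yourself than take my word for it, here's a minimal Python sketch that reproduces those numbers (the function name total_blur is mine, not anything standard):

```python
import math

def total_blur(*blurs_um):
    """Combine independent blur diameters (in microns) as root-sum-of-squares."""
    return math.sqrt(sum(b * b for b in blurs_um))

lens_blur = 10.0  # a 100 lp/mm lens has roughly a 10 micron blur circle

for film_res in (100, 150, 200):   # film resolution, lp/mm
    film_blur = 1000.0 / film_res  # equivalent blur circle, microns
    combined = total_blur(lens_blur, film_blur)
    print(f"film {film_res} lp/mm -> system {1000.0 / combined:.0f} lp/mm")

# Prints 71, 83, and 89 lp/mm, matching the figures above.
# Note the diminishing returns: doubling film resolution from
# 100 to 200 lp/mm buys you less than 20 lp/mm in the final image.
```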
Rule of thumb: unless your worst blur is at least twice as bad as all the others, improving any of the blurs will improve the image, although (obviously) improving the weakest link gets you the biggest gain.
With film cameras, you've got five major sources of blur to worry about: film resolution, vibration, focus error, lens aberrations, and diffraction. Until one of them is twice as bad as all the others, improving any of them helps.
Focus error is the biggie that photographers don't think about. Cameras usually have SERIOUS focus errors if you're trying to work at the scale where diffraction actually matters. The error has a variety of sources, from a mismatch between the optical distance to the focusing screen or autofocus sensor and the distance to the film/sensor plane, to the finite step size of the encoders and servomotors in electrically focused cameras. It's a bigger problem than you would imagine. In an otherwise-good system, it is often the biggest source of image blur.
Film resolution is easy to characterize; digital sensor resolution is almost impossible! For a start, there are all the reasons mentioned in the article by Ruben et al. Add in several more:
• A 2 x 2 pixel array is a bare minimum. Does fitting the Airy disk into a square of four pixels provide high-quality image data? Nope. You're going to see improvements in the quality of the fine detail up to at least a 4 x 4 pixel array (see the sketch after this list).
• Sensor pixels don't even operate entirely as discrete elements. There is charge leakage (analogous to halation in photographic film) between adjacent pixels. This degrades sensitivity, sharpness, and color fidelity, and adds noise, beyond what "theory" would tell you.
• Finally, there is the image processing that produces a viewable image. Some of you may have noticed how peculiar it is that the number of lines of resolution in many digital cameras is about the same horizontally, vertically, and diagonally! That's impossible if you're talking about a physical blur circle. This is a synthesized image. The resolution you get is not a simple mapping of pixel data.
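To put a little flesh on that first point, here's a toy numerical sketch of my own construction (it assumes NumPy and SciPy; airy_intensity and pixel_averages are my names, and this is an idealized pattern, not a real sensor pipeline). It averages an Airy disk over pixel grids of different sizes:

```python
import numpy as np
from scipy.special import j1  # first-order Bessel function

def airy_intensity(r):
    """Normalized Airy pattern intensity at radius r (first zero at r = 1)."""
    x = 3.8317 * np.asarray(r, dtype=float)  # 3.8317 = first zero of J1
    out = np.ones_like(x)                    # limit value at the center
    nz = x != 0
    out[nz] = (2.0 * j1(x[nz]) / x[nz]) ** 2
    return out

def pixel_averages(n):
    """Average the Airy intensity over an n x n grid of square pixels
    covering the central disk -- a crude model of sensor sampling."""
    edges = np.linspace(-1.0, 1.0, n + 1)
    fine = 50  # subsamples per pixel edge for the numeric average
    result = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            xs = np.linspace(edges[i], edges[i + 1], fine)
            ys = np.linspace(edges[j], edges[j + 1], fine)
            X, Y = np.meshgrid(xs, ys)
            result[i, j] = airy_intensity(np.hypot(X, Y)).mean()
    return result

for n in (2, 4, 8):
    samples = pixel_averages(n)
    print(f"{n}x{n} grid: max/min pixel value = {samples.max():.3f}/{samples.min():.3f}")

# The 2x2 grid returns four identical values by symmetry -- it tells you
# nothing about the shape of the disk. Only at 4x4 and beyond does the
# sampled pattern start to look like a peak with falloff.
```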
Put it all together, and claims that pixels have become too small because of diffraction effects lose all credence. There's too much stuff going on that muddies the picture (literally).
The same test photograph as last time. This figure is at 150% of full size [after you click on it to enlarge it —MJ]. The camera is my Fuji FinePix S100fs; pixel pitch is around 2.5 µ. Nothing much changes in terms of sharpness all the way down to ƒ/5.6. Sure, the pixels are minuscule and diffraction is getting worse, but everything else matters much more. Even at ƒ/8 the impact of diffraction is modest, although the Airy disk is twice the pixel size. It's only in going that final stop from ƒ/8 to ƒ/11 that diffraction truly dominates image clarity.
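For the curious, here's the arithmetic behind that comparison in a few lines of Python. This is my own sketch: I'm reading "Airy disk" as the first-zero radius of the diffraction pattern, and I'm assuming green light at about 0.55 µ:

```python
# Airy-pattern first-zero radius r = 1.22 * wavelength * f-number,
# compared to the S100fs's ~2.5 micron pixel pitch.
WAVELENGTH_UM = 0.55   # green light, microns (my assumption)
PIXEL_PITCH_UM = 2.5

for f_number in (4, 5.6, 8, 11):
    r = 1.22 * WAVELENGTH_UM * f_number
    print(f"f/{f_number}: Airy radius {r:.1f} microns = {r / PIXEL_PITCH_UM:.1f} pixels")

# f/5.6 -> ~3.8 microns (~1.5 pixels), f/8 -> ~5.4 microns (~2.1 pixels),
# f/11 -> ~7.4 microns (~3.0 pixels): diffraction only pulls clearly
# ahead of everything else at that last stop.
```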
The only way to figure out if diffraction is important in your photographs is to run some very, very careful tests and understand how to interpret the data. For most of you, honestly, it's a waste of your time comparable to running endless film tests instead of making nice photographs.