Eolake emailed me a good question:
"Logically it seems to me that diffraction should affect small formats much more than large ones. Since it's a physical interaction with the edge of the aperture, wouldn't the absolute amount of the diffraction be inverse with the size of the aperture in millimeters, not with the f-value?"
The absolute resolution is indeed inversely proportional to the absolute size of the aperture...but that resolution is measured as the angular separation of the subject details. Very approximately, the resolution in radians equals 1.3 times the wavelength divided by the diameter of the aperture. In other words,
r (in radians) = 1.3w/A.
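To put a number on that formula, here's a minimal sketch (the 25 mm aperture is just an illustrative value, roughly what a 100mm lens gives you at ƒ/4):

```python
# Angular diffraction limit from the formula above: r = 1.3 * w / A.
# Assumed example values: green light through a 25 mm aperture.

w = 550e-9   # wavelength of green light, in meters (~0.00055 mm)
A = 0.025    # aperture diameter, in meters (25 mm)

r = 1.3 * w / A  # angular resolution, in radians
print(f"angular resolution: {r:.2e} radians")
```

That works out to roughly 29 millionths of a radian, which means nothing by itself until you multiply by the focal length, as the next step does.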
To get the separation distance of details in a film/sensor plane, you multiply the angular separation by the focal length of the lens. In other words, the Circle of Confusion is:
C = F*r
And what is F/A? It's just the f-number! So there we are:
C = 1.3w * (f-number)
Plugging in w ≈ 0.00055 mm for green light, C works out to about 0.0007 mm times the f-number, or, turned around into resolution:
r (in line pairs (lp) per mm) = 1400/(f-number) for green light.
This is the maximum resolution a lens can produce. Real world lenses can approach this limit but not exceed it.
The term "diffraction-limited" gets used in two different ways when we're talking about lenses. We say that a lens is being diffraction limited when we reach the point where stopping the lens down further reduces resolution instead of improving it (in the absence of diffraction, resolution would always improve as we stopped down, because some aberrations diminish with reduced aperture).
We also talk about a lens exhibiting "diffraction-limited performance" when its resolution is near the diffraction limit. That's not the same thing, and the overlapping terminology causes no end of confusion. Many lenses will become diffraction-limited before they're stopped down enough to exhibit diffraction-limited performance. In fact, a poor lens may never achieve diffraction-limited performance at any aperture.
On the other hand, really, really good 35mm-format lenses may become diffraction limited at apertures as large as ƒ/4–ƒ/4.7 and exhibit diffraction-limited performance at ƒ/4.7–ƒ/7.
The edge of the aperture doesn't directly produce diffraction. Diffraction isn't caused by photons scattering off of the edges of the aperture. If that were the case, we could markedly reduce diffraction by blackening the diaphragm blades so much that almost no photons were scattered off of them. That improves flare and micro-contrast, but it doesn't do a thing for diffraction.
It's the mere existence of the edge that causes diffraction. It's a quantum mechanical "spooky action at a distance" thing. The presence of aperture blades means the incoming photon is constrained to a particular region of space. That automatically means that the degree to which the photon is moving sideways isn't precisely defined. The diaphragm hasn't warped the paths of the light rays; its presence makes those paths less predictable. The closer together the aperture edges, the less predictable the sideways spreading. In English—as you stop down, diffraction increases.
See, even if you don't believe in quantum weirdness, you deal with it every day of your photographic life!
Next time: what's the practical import of this for photographers? Is diffraction the ultimate sharpness-killer?
Featured Comment by Bernard: "Anybody who is really interested in knowing how quantum physics affects all aspects of photography should read Richard Feynman's QED: The Strange Theory of Light and Matter, which could have been subtitled 'How I Learned to Stop Worrying and Love Quantum Electrodynamics.'"