In my last column on this topic I talked about how we see tones. This time it's about detail.
For a start, let's consider the canonical 8–10 line pairs per millimeter (lp/mm) that folks like to toss out as the limit of human visual resolution. That number is both misleadingly high and misleadingly low.
To begin with, those experiments were run on young viewers with good eyes, under optimal viewing conditions and illuminance, at a standard close focusing distance of about half a meter. Change any of those conditions and the numbers go down. Poor lighting in particular makes a big difference; so does an inability to focus that close. Under many conditions that are normal for viewing photographs, you really can't see more than five line pairs per millimeter. (Conversely, higher visual acuity or closer focusing ability can push those numbers up.)
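For the numerically inclined, here's a back-of-the-envelope sketch of that relationship. It assumes the textbook figure of one line pair per arcminute for good eyes in good light (the function name and defaults are mine, not a standard); the 8–10 lp/mm figure corresponds to acuity somewhat better than that.

```python
import math

def resolvable_lp_per_mm(viewing_distance_mm, acuity_arcmin=1.0):
    """Line pairs per millimeter resolvable on a print, given the
    viewing distance and the viewer's acuity (one line pair per
    `acuity_arcmin` minutes of arc -- 1.0 is the textbook figure
    for good eyes under good light)."""
    # Width on the print subtended by one line pair at this distance:
    lp_width_mm = viewing_distance_mm * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / lp_width_mm

# At the standard close-focus distance of ~500 mm:
print(round(resolvable_lp_per_mm(500), 1))   # ~6.9 lp/mm
# Back up to a meter and the figure halves:
print(round(resolvable_lp_per_mm(1000), 1))  # ~3.4 lp/mm
```

Doubling the viewing distance halves the resolvable lp/mm, which is why the numbers quoted for print sharpness only mean something at a stated viewing distance.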
There's a big difference between our ability to resolve closely spaced line pairs and our ability to detect edges. You can easily see a distant phone line that is several times smaller than the resolution limit, or stars that are thousands of times smaller. Add in "vernier acuity": that's what lets you use a caliper or micrometer and tell when the index lines are out of alignment by even a small fraction of the engraved line width. We're very good at picking up edge qualities.
Take two resolution bar targets, one with sharp-edged bars and the other with very fuzzy-edged bars. You can resolve about the same amount of fine detail in both, but even at the limit of visual resolution most people will readily pick out the former target as "sharper" than the latter. Even though they're not seeing any more fine detail, they are responding to the transition at the edges—acutance, not resolution.
That means our eyes are sensitive to spatial frequencies about three times finer than the ones they can actually resolve, and experiments bear that out. Take two matched 8x10 prints, one that resolves 10 lp/mm and one that resolves 30 lp/mm, and ask folks to pick out the sharper one. Invariably they can, although if pressed they'll be unable to point to any specific detail that differs.
Visual resolution versus acuity. These two sets of radial bars (you might want to click on the image first to open up a larger version, and pardon the jaggies) have very different edge characteristics. Human vision picks up on that, even at the limits of resolution. Step back across the room from your monitor and look at this image. You should be able to resolve about the same level of detail in both targets (the point at which the bars merge into a gray central disk), but the target on the right should appear a little crisper even near the finest level of visible detail. That's your eye picking up edge characteristics well beyond the resolution limit.
This is why a print from a sheet film negative looks sharper than one made from a well-made 35mm negative, even though that 8–10 lp/mm number says the big negative shouldn't have a marked advantage. Similarly, I have tested inkjet printers that can reproduce in excess of 15 lp/mm and ones that reproduce 8–10. The naive number says there shouldn't be a noticeable difference in side-by-side comparison prints. Experiment proves otherwise.
On the other hand, small differences in readily visible detail (coarser than 8 lp/mm) are not detectable even in side-by-side comparisons. You can't see a 10% improvement in resolution; around 15% is the plausible limit. The difference won't become particularly noticeable until you hit 25–30%.
This has bearing on the pixel horsepower race in digital cameras. There's little point to trading in your camera for one that has 25% more pixels unless the overall image quality is better for other reasons (tonality, color, dynamic range, better optics or signal processing, or proven substantially-better resolution). You would be hard-pressed to see a sharpness difference because of the raw pixel count. If you're not making at least a one third to one half jump in the number of pixels, save your money.
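One way to see why, in a sketch of my own (not the column's math): for a fixed sensor format and aspect ratio, linear resolution scales roughly as the square root of total pixel count, so percentage gains in pixels translate to much smaller percentage gains in resolvable detail.

```python
import math

def linear_resolution_gain(pixel_count_ratio):
    """Linear (lp/mm-style) resolution gain from a change in total
    pixel count, assuming resolution scales as the square root of
    the pixel count (same sensor format, same aspect ratio)."""
    return math.sqrt(pixel_count_ratio)

# 25% more pixels buys only ~12% more linear resolution --
# below the ~15% threshold where a difference starts to be visible:
print(round((linear_resolution_gain(1.25) - 1) * 100))  # 12
# A 50% pixel jump is ~22% linear -- approaching noticeable:
print(round((linear_resolution_gain(1.5) - 1) * 100))   # 22
```

By this reckoning you'd need roughly 50–70% more pixels before the linear gain clears the 25–30% range where a sharpness difference becomes obvious, which squares with the one-third-to-one-half advice above.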