Many years ago I researched the issue of focus. I read about all the various means of determining and setting focus on various types of cameras, focusing aids and how they work, and the issues affecting accuracy and repeatability. I have to admit that I don't remember all of the technical arcana, so if I wanted to explain it all, I would have to go research it again. But learning it once was an eye-opener. It turns out that although focusing a camera is almost universally seen by practicing photographers as easy to do, it's in fact not a trivial technical problem, if we adopt suitably high standards for accuracy. Almost always, what we're doing when we focus our cameras is setting approximate focus. It's only because we're usually focusing on complex, 3-dimensional subjects, and because we're granted considerable leeway by depth-of-field, that approximate focus is usually good enough.
Carl and I just went through an interesting exercise. He had prepared an illustrated addendum to his article showing how he determined and set the custom lens focus adjustment on his K20D. The problem we had was that when he reduced the test files to web resolution, the fine distinction between in-focus lines and out-of-focus lines on the test chart literally disappeared. What he was seeing at full resolution in ACR was not possible to show at web resolution.
I noticed a different aspect of the same thing all those years ago when I did my research and tests on focus. The greater the resolution of the whole system, I found, the easier it was to detect exactly where the plane of focus was placed, and the more evident it became that objects just out of the d.o.f. were not in focus. Somewhat counter-intuitively, if I reduced the resolution of the whole system (for instance, by using faster, coarser-grained film and hand-holding the camera), it looked like more was in focus. All I was doing was making the highest levels of distinction between the plane of best focus and the closer zones of the d.o.f. disappear. To my eye, at least, this made most pictures look better, rather than worse.
It's easy to demonstrate the principle in Photoshop. Click on the picture above to bring up the larger version, and see if you can tell where the focus was placed. Not easy, is it? That's because I've applied a liberal amount of Gaussian blur to the entire picture. To my eye, it looks like everything from Vietnam Inc. to maybe Leonard Freed is resolved pretty much the same, and it's not evident where the plane of best focus actually is (except that you might reasonably assume it's in the middle there somewhere).
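The same effect can be sketched numerically. Here's a toy Python illustration (the "scene" and pixel values are invented, not taken from the photograph above): blur an entire one-dimensional "image" and the measurable gap between a crisply focused edge and a slightly soft one collapses, just as it does in the blurred picture.

```python
# Toy sketch of the principle: blur everything, and the in-focus edge and the
# slightly out-of-focus edge become nearly indistinguishable.
# All values are hypothetical; this is not derived from the photo in the article.

def box_blur(pixels, radius):
    """Blur a 1-D row of pixel values with a simple box filter."""
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def max_gradient(pixels):
    """Peak local contrast -- a crude 'sharpness' score for an edge."""
    return max(abs(b - a) for a, b in zip(pixels, pixels[1:]))

# An abrupt edge (plane of best focus) and a ramped edge (just outside it).
sharp_edge = [0.0] * 10 + [1.0] * 10
soft_edge = [0.0] * 8 + [0.25, 0.5, 0.75] + [1.0] * 9

# How much sharper the sharp edge measures, before and after a global blur:
before = max_gradient(sharp_edge) / max_gradient(soft_edge)
after = max_gradient(box_blur(sharp_edge, 3)) / max_gradient(box_blur(soft_edge, 3))

print(f"sharpness ratio before blur: {before:.2f}")  # 4.00
print(f"sharpness ratio after blur:  {after:.2f}")   # 1.00
```

Before the blur, the in-focus edge measures four times "sharper" than its neighbor; after the blur, they're effectively identical, which is why the plane of best focus can't be located in the blurred picture.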
In this version, which was sharpened rather than blurred, it's much easier to tell where the plane of best focus is—approximately on The History of the Japanese Camera. Here it's possible to see that both New York Rises and Lewis Hine are slightly out of focus—something that's invisible in the top picture.
This demonstration doesn't quite make the point, simply because the entire top picture looks obviously blurry. But, in fact, the entire bottom picture is fairly low resolution—it's a small JPEG, 72 ppi, and the camera was handheld with the lens wide open. In it, I can't tell much difference in sharpness between History, Bruce Davidson Portraits, and Wright Morris (the pink book on the far side of History). Make a really high-resolution picture of this subject, and you might even be able to tell which side of the spine of History is more in focus!
When you've achieved enough resolution so that the "sharp parts" look subjectively sharp, then having more of the d.o.f. match the best apparent focus can make more of the picture look sharp. That's why small apertures often make pictures look subjectively sharper even though, in objective terms, diffraction is degrading the sharpness of everything in the frame.
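To put rough numbers on the diffraction point, here's a quick back-of-the-envelope calculation using the standard Airy-disk approximation (diameter ≈ 2.44 × wavelength × f-number). The circle-of-confusion figure is just one commonly quoted value for APS-C sensors, an assumption for illustration rather than anything from my own tests:

```python
# Standard diffraction arithmetic: at small enough apertures, the Airy disk
# grows past the circle of confusion, so nothing in the frame can be
# critically sharp -- yet more of the frame matches the best apparent focus.

WAVELENGTH_MM = 0.00055  # green light, ~550 nm
COC_MM = 0.019           # an assumed circle of confusion, typical for APS-C

for n in (4, 8, 16, 32):
    airy = 2.44 * WAVELENGTH_MM * n  # Airy disk diameter in mm
    note = "diffraction dominates" if airy > COC_MM else "within the CoC"
    print(f"f/{n}: Airy disk ~{airy * 1000:.0f} um -- {note}")
```

By about f/16 on this assumed sensor, diffraction blur exceeds the circle of confusion everywhere in the frame—yet, as noted above, the picture can still look subjectively sharper because the whole frame is now uniformly, tolerably soft.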
But back to focus. Strictly speaking, all methods of focusing have pitfalls and potential for error, and all types of focusing aids have their characteristic limitations. That's even without taking into account the mechanical "slop" and the general imprecision of our devices (and other factors that affect apparent focus, such as film flatness or the flatness of field of our lenses). Once you add that in, it's fortunate indeed that only approximate focus is commonly required, because we'd be out of luck if we had to do better consistently!
As it was, I adopted one 'best practice' that I continue to observe habitually to this day: I don't use a lens's widest aperture at its closest focus distance. If I'm focused as close as the lens will go, I make sure to stop down at least somewhat, and if I'm using the widest aperture, I make sure I'm focused at least a little farther away than the lens's closest focus point. The reason is simple enough: there just aren't very many camera/lens combinations that are both precise enough and accurate enough to focus as close as they can go with the lens wide open. Assuming you'll get good focus in such situations just leads to frustration.
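The standard thin-lens depth-of-field formulas show why this rule of thumb pays off. This sketch assumes a hypothetical 50mm lens with a 0.45m close-focus limit and a 0.019mm circle of confusion; the specific lens and numbers are illustrative, not from any camera I tested:

```python
# Thin-lens depth-of-field sketch, using the standard hyperfocal-distance
# formulas. Lens, distances, and circle of confusion are all hypothetical.

def dof_mm(focal_mm, f_number, subject_mm, coc_mm=0.019):
    """Total depth of field (mm) for a subject nearer than the hyperfocal point."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm  # hyperfocal distance
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return far - near

# Wide open at the (hypothetical) 0.45m close-focus limit, the total d.o.f.
# is only a few millimeters -- no margin at all for focusing error:
print(f"f/1.4 at 0.45m: {dof_mm(50, 1.4, 450):.1f} mm of total d.o.f.")

# Stopping down, or backing off a little, restores a workable margin:
print(f"f/4   at 0.45m: {dof_mm(50, 4.0, 450):.1f} mm")
print(f"f/1.4 at 2m:    {dof_mm(50, 1.4, 2000):.0f} mm")
```

At the close-focus limit wide open, the whole zone of acceptable focus is under four millimeters deep—far less than the focusing error most camera/lens combinations can hold.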
In the case of AF systems, consumer cameras, especially those below the top of the line, are not terribly high-precision devices (they have to be pretty darned good, all things considered, but they're never perfect). There's a fair amount of slop in the design and construction that has to be overcome. But even so, most AF systems are "detuned" from their best theoretical performance. The reason is that the more accurate the AF and the tighter the system's tolerance, the slower the system will achieve focus and the more it will hunt. Widen the tolerance, and speed goes up while hunting goes down. The camera-makers know that the typical end-user is simply less aware of precise focus than he is of AF speed and hunting, so they compromise. In the best cameras, not only is the system itself better and more capable, but the balance between focus accuracy and AF speed is skewed more toward accuracy (despite which, the AF still has to be fast). That kind of performance, of course, comes at a cost.
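The tolerance-versus-speed tradeoff can be caricatured in a few lines of code. This is a deliberately crude contrast-style hill-climbing loop (real AF systems, phase-detect ones especially, work very differently and are far more sophisticated), but it shows why a tighter stopping tolerance costs more moves, i.e. more hunting:

```python
# Caricature of a contrast-seeking AF loop: step toward higher contrast,
# reverse and halve the step on overshoot, stop when the step is within
# tolerance. Purely illustrative -- not how any real camera's AF works.

def sharpness(position, best=37.2):
    """Hypothetical contrast score, peaking at the true focus position."""
    return -(position - best) ** 2

def autofocus(tolerance, start=0.0, step=16.0):
    """Hunt for peak sharpness; return (final position, number of moves)."""
    pos, direction, moves = start, 1.0, 0
    while step > tolerance:
        nxt = pos + direction * step
        if sharpness(nxt) > sharpness(pos):
            pos = nxt             # contrast improved -- keep going
        else:
            direction = -direction  # overshot -- reverse and refine
            step /= 2
        moves += 1
    return pos, moves

for tol, label in ((2.0, "loose"), (0.05, "tight")):
    pos, moves = autofocus(tol)
    print(f"{label} tolerance: {moves} moves, landed {abs(pos - 37.2):.2f} away")
```

The tight-tolerance run lands much closer to true focus but takes more than twice as many moves to get there—the same tradeoff, in miniature, that the camera-makers are balancing.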