...Why the new Sony A7r might not be a good workhorse for your adapted Leica lenses.
Reason 1: Mechanical slop is already a bit of a crapshoot with just one interface, and an adapter adds two more to the system, so the chances for slop are multiplied. See Roger Cicala's article "There Is No Free Lunch, Episode 763: Lens Adapters" at LensRentals. And thanks to Nigel for providing the link!
Reason 2: Bruno Masset wrote this next as a comment to the A7 post, not as a finished article, but I thought it was interesting enough that it deserved its own slot in the stack. What follows was...
Written by Bruno Masset
Methinks a lot of people who buy a full-frame mirrorless camera hoping to use, for instance, old wide-angle rangefinder Leica lenses via mount adapters are going to discover how mediocre the imaging performance of such lenses can be on a digital sensor.
The Leica M9 sensor's cover glass thickness is 0.8mm; this is much thicker than the M8's 0.5mm, but presumably offers much better infrared filtering, avoiding the "black synthetic fabric being imaged as dark purple" syndrome that plagued the M8.
There's little reason to expect that the optical stack (OLPF, IR filter etc.) in front of the Sony A7's sensor is thinner than the Leica M9's.
In the film era, lenses were designed to be stigmatic in air. In other words, back in the days when digital sensors didn't exist, lens designers expected the medium present between the lens' last vertex (the rearmost glass surface) and the imaging plane (the film) to have a homogeneous refractive index (RI) of one.
Leica's (and other manufacturers') wide-angle rangefinder lenses (those with a focal length shorter than approximately 35mm) tend to have a quite short exit-pupil-to-imaging-plane distance. The exit pupil is the apparent position of the diaphragm, looking at the lens from behind. Obviously, the exit pupil is where the light cone appears to emanate from, and thus determines the tilt of the light cone's principal ray—the cone's "axis," if you will.
In the center of the image, the principal ray will be perpendicular to the imaging plane. In the corners of the image, if the distance of the exit pupil to the image plane is short, the light cone (and its principal ray) will be quite tilted.
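To give "quite tilted" a number: with the roughly 30mm exit pupil distance Bruno uses below, the chief ray at the extreme corner of a full-frame sensor arrives at about 36 degrees from the normal. A quick check of the trigonometry (an illustrative sketch, not something from Bruno's comment):

```python
import math

# Rough chief-ray tilt at the corner of a full-frame (36x24mm) sensor,
# assuming the ~30mm exit-pupil-to-image-plane distance cited below.
half_diagonal_mm = math.hypot(36.0, 24.0) / 2.0   # ~21.6mm
exit_pupil_distance_mm = 30.0

tilt_deg = math.degrees(math.atan(half_diagonal_mm / exit_pupil_distance_mm))
print(f"Chief-ray tilt at the corner: {tilt_deg:.1f} degrees")   # ~35.8 degrees
```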
A slab of optical material—a sensor's cover glass, or an optical low-pass filter stack—has a refractive index higher than one, and will spatially shift the light rays impinging upon it—the amount of this spatial shift being dependent on the ray's incidence angle.
As a light cone is composed of rays that have varying angles, each of these rays is going to be spatially shifted by a variable amount.
The net result is that a light cone that was designed to converge to a single point—i.e., be stigmatic—in air, can have its light rays dispersed all over the place after it passes through a slab of material with refractive index larger than one.
The cone's dispersion is negligible when the incidence is perpendicular, at the center of the image.
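To make the angle dependence concrete (a minimal sketch of the standard plane-parallel-plate geometry, not Bruno's own derivation): a ray striking a cover glass of thickness t and refractive index n at incidence angle theta refracts to a shallower angle inside the glass, and lands on the sensor displaced by roughly t*(tan(theta) - tan(theta_glass)) compared with where it would have landed in air.

```python
import math

def lateral_shift_mm(theta_deg, thickness_mm=0.8, n=1.5):
    """Lateral displacement of a ray crossing a plane-parallel plate,
    relative to where the same ray would land with no plate (air)."""
    theta = math.radians(theta_deg)
    theta_glass = math.asin(math.sin(theta) / n)   # Snell's law
    return thickness_mm * (math.tan(theta) - math.tan(theta_glass))

# Note: this shift mostly just relocates the whole cone (a distortion-like
# effect); the blur Bruno describes comes from the VARIATION of the shift
# across the rays of one cone, estimated in the next sketch.
for angle in (0, 10, 20, 30, 40):
    print(f"{angle:2d} deg -> shift {lateral_shift_mm(angle) * 1000:6.1f} microns")
```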
Take a sensor cover glass 0.8mm (800 microns) thick with an RI of 1.5, an exit-pupil-to-imaging-plane distance of about 30mm (a fairly typical value for wide-angle rangefinder lenses with focal lengths of 35mm or less), and a cone aperture of ƒ/4 (that is, an exit pupil diameter of 30mm/4.0 = 7.5mm). One can easily calculate that in the image corners, the light cone's rays, instead of converging to a point, will be dispersed over an area of about 35 microns.
On a Leica M9, whose pixel pitch is about 6.8 microns, 35 microns corresponds to about five pixels!
So, what was imaged as a point when that film-era lens was used in air, could well become a fuzzy disk of a diameter of about five pixels when the lens is used with a digital sensor!
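Here is one way to put numbers on that claim: a crude tangential ray trace through a flat cover glass using Bruno's own figures (0.8mm glass, RI 1.5, 30mm exit pupil distance, ƒ/4 cone), under the added assumption that focus has been set so the image center is sharp on the sensor. This sketch lands at roughly 40 microns of corner blur, the same order as Bruno's ~35-micron figure; the exact value depends on where best focus is placed and on how the blur diameter is defined. The 6.8-micron value is the M9's published pixel pitch.

```python
import math

T, N = 0.8, 1.5                  # cover glass thickness (mm) and refractive index
P = 30.0                         # exit-pupil-to-image-plane distance (mm)
PUPIL_RADIUS = P / 4.0 / 2.0     # f/4 cone: 30mm/4 = 7.5mm pupil diameter
H = math.hypot(36.0, 24.0) / 2   # corner image height on full frame, ~21.6mm
REFOCUS = T * (1.0 - 1.0 / N)    # axial focus shift, compensated at the center

def landing_x(pupil_x):
    """Where a tangential ray from pupil height pupil_x lands on the sensor."""
    tan_air = (H - pupil_x) / (P - REFOCUS)        # aimed at the in-air focus
    theta = math.atan(tan_air)
    theta_glass = math.asin(math.sin(theta) / N)   # refraction at the glass
    return pupil_x + (P - T) * tan_air + T * math.tan(theta_glass)

hits = [landing_x(-PUPIL_RADIUS + 2 * PUPIL_RADIUS * i / 100) for i in range(101)]
blur_um = (max(hits) - min(hits)) * 1000.0
print(f"Tangential blur in the corner: ~{blur_um:.0f} microns")
print(f"That's ~{blur_um / 6.8:.1f} Leica M9 pixels (6.8-micron pitch)")
```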
The further from the image's center (where the principal ray is perpendicular, and the cover glass has thus essentially no influence), the worse the image quality will be.
Do microlenses help?
Microlenses only address the issue of a ray's incidence angle on a single pixel, increasing the pixel's sensitivity and limiting the leakage of a tilted light ray towards adjacent pixels in a Bayer array, and the resulting appearance of color shifts in the image's corners.
As microlenses cover only one pixel, they obviously have no effect on the imaging of a fuzzy disk that has a diameter of five pixels. In other words, it would be utterly naive to assume that per-pixel microlenses can address an astigmatism issue that straddles multiple pixels.
What can address the astigmatism problem to some extent, however, is digital lens correction.
The spreading of the light cone induced by the optical stack is caused by well-known physical phenomena, and its point-spread function (PSF) can be modelled as a function of the exit pupil distance, exit pupil diameter (that is, aperture value), cover glass thickness and RI, and distance from the image center.
The appropriate PSF can then be used to deconvolve the blurriness induced by the cover glass.
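To make "deconvolve with the appropriate PSF" concrete, here is a generic, minimal Wiener-deconvolution sketch in numpy, using a disk-shaped stand-in for the modelled cover-glass PSF. It is a textbook illustration of the idea only; whatever Leica actually does (if anything) is unpublished, and since the cover-glass PSF grows with image height, a real implementation would have to vary the kernel across the frame, for instance by processing overlapping tiles with locally appropriate PSFs.

```python
import numpy as np
from scipy.signal import fftconvolve

def disk_psf(radius_px, size):
    """Uniform disk PSF of the given radius, normalised to sum to 1."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
    psf = (x**2 + y**2 <= radius_px**2).astype(float)
    return psf / psf.sum()

def wiener_deconvolve(image, psf, k=0.01):
    """Frequency-domain Wiener deconvolution; k is the noise regulariser."""
    # Embed the PSF in an image-sized array with its centre at the origin
    # of the FFT grid (circular-convolution convention).
    padded = np.zeros_like(image, dtype=float)
    ph, pw = psf.shape
    padded[:ph, :pw] = psf
    padded = np.roll(padded, (-(ph // 2), -(pw // 2)), axis=(0, 1))
    H = np.fft.fft2(padded)
    G = np.fft.fft2(image)
    F = np.conj(H) * G / (np.abs(H)**2 + k)
    return np.real(np.fft.ifft2(F))

# Toy usage: blur a synthetic point source, then recover it.
img = np.zeros((64, 64))
img[32, 32] = 1.0
psf = disk_psf(2.5, 9)
blurred = fftconvolve(img, psf, mode="same")
restored = wiener_deconvolve(blurred, psf)
```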
I suspect that's the major technical breakthrough Leica—or its digital imaging technology partner, Jenoptik—achieved, and that allowed them to increase the sensor's size from the Leica M8's initial APS-H dimensions to the M9's full-frame format, while at the same time increasing the cover glass' thickness from 0.5mm to 0.8mm.
One way to check for the presence of an (unpublicized and hypothetical) PSF deconvolution that could be applied by Leica, even on the raw DNG files, would be to compute the autocorrelation of a pixel with its neighbors. As the PSF diameter must become larger as one moves away from the image center, one can expect the per-pixel autocorrelation measurements in Leica's raw files to also increase in diameter from the center towards the corners.
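A sketch of that test might look like the following. It assumes the rawpy library for reading the DNG, a placeholder filename, and an evenly lit, defocused test frame (so that scene detail does not dominate the measurement); none of those specifics come from Bruno's comment.

```python
import numpy as np
import rawpy  # assumed here for reading the DNG; any raw loader would do

def patch_autocorrelation(patch, max_lag=4):
    """Normalised autocorrelation of a 2-D patch for small pixel lags."""
    p = patch.astype(float) - patch.mean()
    denom = (p * p).sum()
    out = np.zeros((max_lag + 1, max_lag + 1))
    for dy in range(max_lag + 1):
        for dx in range(max_lag + 1):
            a = p[dy:, dx:]
            b = p[:p.shape[0] - dy, :p.shape[1] - dx]
            out[dy, dx] = (a * b).sum() / denom
    return out

# Placeholder filename; use a defocused, evenly lit frame in practice.
raw = rawpy.imread("leica_flatfield.DNG").raw_image.astype(float)
plane = raw[0::2, 0::2]   # one colour plane of the Bayer mosaic
h, w = plane.shape
centre = patch_autocorrelation(plane[h//2-64:h//2+64, w//2-64:w//2+64])
corner = patch_autocorrelation(plane[:128, :128])
print("centre lag-1 correlation:", centre[0, 1], centre[1, 0])
print("corner lag-1 correlation:", corner[0, 1], corner[1, 0])
```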
If Leica indeed performs deconvolution processing in-camera, one can expect marked image quality differences in the image corners between a Leica camera and other manufacturers' mirrorless cameras, even using the same Leica rangefinder lens.
Bruno
• • •
Nice explanation, seems to me, insofar as I can understand it. Me, I'm an empiricist...I'd just stick the lenses on the cameras and see what happens. However, after years of trying various adapters, I came to the same conclusion as Roger and Bruno: I pretty much just use cameras with the lenses that were designed to go with them. Using adapters can be fun, especially if you're reincarnating older, unused, or quirky lenses for modern duty, or repurposing lenses for one camera for use with another in order to expand your arsenal, but adapters are usually a roadblock on the path to optical optimization. So if that's what you're after, I don't recommend them.
Mike
(Thanks to Nigel and Bruno)
Original contents copyright 2013 by Michael C. Johnston and/or the bylined author. All Rights Reserved. Links in this post may be to our affiliates; sales through affiliate links may benefit this site.
Featured Comments from:
John Flower: "I'm glad I don't shoot test shots for a living, as apparently my NEX body and range of MF lenses and adapters are not up to the task. However, with the exception of certain problems at infinity focus (I haven't got around to shimming all my adaptors yet), the performance of my various lenses is as would be expected, from an LTM Voigtlander 15mm ƒ/4.5 ASPH right up to an old Zeiss 135mm, and including various Leica M and R lenses, Nikkor F-mount lenses and the odd M39 lens. There are some dogs (for example, my Nikkor 105mm ƒ/2.5 Sonnar that was so good on film is lousy on all my digital bodies) but in general there are no surprises.
"For those commenting on poor wide-angle results on the M9 (colour shift, corner softness), this was a known issue and has been addressed on the A7 and A7R at the sensor level (optimised micro-lenses); I can attest to the fact that no such problem was present in a number of other NEX bodies. Even the most cursory investigation of the vast source of full-size images taken using NEX bodies and quality lenses mounted on adaptors will show that in a practical sense, if you are willing to put up with the compromises inherent in this type of setup, the results can be outstanding."
Luke: "I put my Zeiss ZM 21mm ƒ/2.8 on the A7r we have here [Luke works at Imaging-Resource —Ed.]. I'm sad to report that color shifts were severe and covered most of the frame. There was also severe darkening of the image away from the center, way too much to simply call vignetting. I chose this lens carefully, based on the experience of other users, to avoid this problem with my early NEX cameras. It worked well, with visible but very minor color shifts and vignetting. It stayed glued on my NEX-3 for years. Well, I just saved myself a couple grand."