...Why the new Sony A7r might not be a good workhorse for your adapted Leica lenses.
Reason 1: Mechanical slop is already a bit of a crapshoot with just one interface. Since an adapter adds two more interfaces to the system, the chances for slop are multiplied. See Roger Cicala's article "There Is No Free Lunch, Episode 763: Lens Adapters" at LensRentals. And thanks to Nigel for providing the link!
Reason 2: Bruno Masset wrote this next as a comment to the A7 post, not as a finished article, but I thought it was interesting enough that it deserved its own slot in the stack. What follows was...
Written by Bruno Masset
Methinks a lot of people who buy a full-frame mirrorless, hoping to use (for instance) old wide-angle rangefinder Leica lenses using mount adapters, are going to discover how mediocre the imaging performance of such lenses might be on a digital sensor.
The Leica M9 sensor's cover glass thickness is 0.8mm; this is much thicker than the M8's 0.5mm, but presumably offers much better infrared filtering, avoiding the "black synthetic fabric being imaged as dark purple" syndrome that plagued the M8.
There's little reason to expect that the optical stack (OLPF, IR filter etc.) in front of the Sony A7's sensor is thinner than the Leica M9's.
In the film era, lenses were designed to be stigmatic in air. In other words, back in the days when digital sensors didn't exist, lens designers expected the medium present between the lens' last vertex (the rearmost glass surface) and the imaging plane (the film) to have a homogeneous refractive index (RI) of one.
Leica's (and other manufacturers') wide-angle rangefinder lenses (those with a focal length shorter than approximately 35mm) tend to have a quite short exit-pupil-to-imaging-plane distance. The exit pupil is the apparent position of the diaphragm, looking at the lens from behind. Obviously, the exit pupil is where the light cone appears to emanate from, and thus determines the tilt of the light cone's principal ray—the cone's "axis," if you will.
In the center of the image, the principal ray will be perpendicular to the imaging plane. In the corners of the image, if the distance of the exit pupil to the image plane is short, the light cone (and its principal ray) will be quite tilted.
A slab of optical material—a sensor's cover glass, or an optical low-pass filter stack—has a refractive index higher than one, and will spatially shift the light rays impinging upon it—the amount of this spatial shift being dependent on the ray's incidence angle.
As a light cone is composed of rays that have varying angles, each of these rays is going to be spatially shifted by a variable amount.
The net result is that a light cone that was designed to converge to a single point—i.e., be stigmatic—in air, can have its light rays dispersed all over the place after it passes through a slab of material with refractive index larger than one.
The cone's dispersion is negligible when the incidence angle is perpendicular, at the center of the image.
With a sensor cover glass thickness of 0.8mm (800 microns) and a RI of 1.5, an exit-pupil-to-imaging-plane distance of about 30mm (a fairly typical value for wide-angle rangefinder lenses with focal lengths less than or equal to 35mm), and a cone aperture of ƒ/4 (that is, an exit pupil diameter of 30mm/4.0 = 7.5mm), one can easily calculate that in the image corners, the light cone's rays, instead of converging to a point, will be dispersed over an area of about 35 microns.
On a Leica M9, 35 microns corresponds to about five pixels!
So, what was imaged as a point when that film-era lens was used in air, could well become a fuzzy disk of a diameter of about five pixels when the lens is used with a digital sensor!
The further from the image's center (where the principal ray is perpendicular, and the cover glass has thus essentially no influence), the worse the image quality will be.
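Bruno's geometry can be sketched numerically. What follows is a simplified two-ray model, not his actual calculation: it traces only the two meridional marginal rays of the corner light cone through a plane-parallel slab using the standard lateral-shift formula, and it assumes a full-frame corner height of 21.6mm (the half-diagonal), a figure the post doesn't state explicitly.

```python
import math

def slab_shift(theta, t=0.8, n=1.5):
    """Lateral displacement (mm) of a ray crossing a plane-parallel
    slab of thickness t (mm) and refractive index n at angle theta (rad)."""
    s = math.sin(theta)
    return t * s * (1 - math.cos(theta) / math.sqrt(n**2 - s**2))

# Geometry from the post: exit pupil 30 mm from the image plane,
# f/4 cone (7.5 mm pupil diameter), and an assumed full-frame corner
# height of 21.6 mm (the half-diagonal of a 36x24 mm frame).
pupil_dist, pupil_diam, corner = 30.0, 30.0 / 4.0, 21.6

# The two marginal rays of the corner light cone, in the meridional plane.
theta_lo = math.atan((corner - pupil_diam / 2) / pupil_dist)
theta_hi = math.atan((corner + pupil_diam / 2) / pupil_dist)

# Each marginal ray is displaced by a different amount, so the cone no
# longer converges to a point; the difference approximates the blur.
blur_mm = slab_shift(theta_hi) - slab_shift(theta_lo)
print(f"corner blur spread: about {blur_mm * 1000:.0f} microns")
```

This crude model gives a raw spread of several tens of microns; part of that spread is plain defocus that refocusing can absorb, so a somewhat smaller net figure such as Bruno's ~35 microns is plausible.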
Do microlenses help?
Microlenses only address the issue of a ray's incidence angle on a single pixel: they increase the pixel's sensitivity and limit the leakage of a tilted light ray into adjacent pixels of the Bayer array, which is what produces the color shifts in the image's corners.
As microlenses cover only one pixel, they obviously have no effect on the imaging of a fuzzy disk that has a diameter of five pixels. In other words, it would be utterly naive to assume that per-pixel microlenses can address an astigmatism issue that straddles multiple pixels.
What can address the astigmatism problem to some extent, however, is digital lens correction.
The spreading of the light cone induced by the optical stack is caused by well-known physical phenomena, and its point-spread function (PSF) can be modelled as a function of the exit pupil distance, exit pupil diameter (that is, aperture value), cover glass thickness and RI, and distance from the image center.
The appropriate PSF can then be used to deconvolve the blurriness induced by the cover glass.
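As a sketch of what such a correction might look like, here is a minimal Wiener deconvolution in the frequency domain. Everything in it is illustrative rather than anything Leica has disclosed: the PSF is a uniform five-pixel box blur standing in for the field-dependent PSF described above (a real correction would have to vary the PSF across the frame), and the SNR value is an arbitrary assumption.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, snr=100.0):
    """Wiener deconvolution: divide out the PSF in the frequency
    domain, regularized by an assumed signal-to-noise ratio."""
    H = np.fft.fft(psf, n=len(blurred))
    G = np.fft.fft(blurred)
    # Wiener filter: conj(H) / (|H|^2 + 1/SNR)
    F = G * np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(F))

# Synthetic example: a sharp edge blurred by a 5-sample PSF,
# mimicking the ~5-pixel disk from the cover-glass calculation.
signal = np.zeros(64)
signal[32:] = 1.0
psf = np.ones(5) / 5.0                      # 5-pixel box blur
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(psf, 64)))
restored = wiener_deconvolve(blurred, psf)

# The restored edge sits much closer to the original than the blurred one.
print(np.abs(restored - signal).mean() < np.abs(blurred - signal).mean())
```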
I suspect that's the major technical breakthrough Leica—or its digital imaging technology partner, Jenoptik—achieved, and that allowed them to increase the sensor's size from the Leica M8's initial APS-H dimensions to the M9's full-frame format, while at the same time increasing the cover glass' thickness from 0.5mm to 0.8mm.
One way to check for the presence of an (unpublicized and hypothetical) PSF deconvolution that could be applied by Leica, even on the raw DNG files, would be to compute the autocorrelation of a pixel with its neighbors. As the PSF diameter must become larger as one moves away from the image center, one can expect the per-pixel autocorrelation measurements in Leica's raw files to also increase in diameter from the center towards the corners.
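Bruno's proposed test can be sketched on synthetic data. The example below is entirely hypothetical: it fakes a "raw" file as white noise, blurs only one corner to mimic a field-dependent PSF, and shows that adjacent-pixel correlation rises where blur (or any smoothing left behind by in-camera processing) is present.

```python
import numpy as np

def neighbor_correlation(patch):
    """Correlation between each pixel and its right-hand neighbor:
    near 0 for raw sensor noise, rising toward 1 as blur (or smoothing
    residue from processing) couples adjacent pixels together."""
    a, b = patch[:, :-1].ravel(), patch[:, 1:].ravel()
    return np.corrcoef(a, b)[0, 1]

rng = np.random.default_rng(0)
img = rng.normal(size=(256, 256))          # stand-in for raw sensor noise

# Blur the "corner" region only, mimicking a PSF that grows off-axis.
corner = img[-64:, -64:].copy()
kernel = np.ones(3) / 3.0
for axis in (0, 1):
    corner = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, "same"), axis, corner)
img[-64:, -64:] = corner

center_c = neighbor_correlation(img[96:160, 96:160])
corner_c = neighbor_correlation(img[-64:, -64:])
print(f"center: {center_c:.2f}, corner: {corner_c:.2f}")  # corner is higher
```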
If Leica indeed performs deconvolution processing in-camera, one can expect marked image quality differences in the image corners between a Leica camera and other manufacturers' mirrorless cameras, even using the same Leica rangefinder lens.
Bruno
• • •
Nice explanation, seems to me, insofar as I can understand it. Me, I'm an empiricist...I'd just stick the lenses on the cameras and see what happens. However, after years of trying various adapters, I came to the same conclusion as Roger and Bruno: I pretty much just use cameras with the lenses that were designed to go with them. Using adapters can be fun, especially if you're reincarnating older, unused, or quirky lenses for modern duty, or repurposing lenses for one camera for use with another in order to expand your arsenal, but adapters are usually a roadblock on the path to optical excellence. So if that's what you're after, I don't recommend them.
Mike
(Thanks to Nigel and Bruno)
Original contents copyright 2013 by Michael C. Johnston and/or the bylined author. All Rights Reserved. Links in this post may be to our affiliates; sales through affiliate links may benefit this site.
(To see all the comments, click on the "Comments" link below.)
Featured Comments from:
John Flower: "I'm glad I don't shoot test shots for a living, as apparently my NEX body and range of MF lenses and adapters are not up to the task. However, with the exception of certain problems at infinity focus (I haven't got around to shimming all my adaptors yet) the performance of my various lenses is as would be expected, from an LTM Voigtlander 15mm ƒ/4.5 ASPH right up to an old Zeiss 135mm, and including various Leica M and R lenses, Nikkor F-mount lenses and the odd M39 lens. There are some dogs (for example, my Nikkor 105mm ƒ/2.5 Sonnar that was so good on film is lousy on all my digital bodies) but in general there are no surprises.
"For those commenting on poor wide-angle results on the M9 (colour shift, corner softness), this was a known issue and has been addressed on the A7 and A7R at the sensor level (optimised micro-lenses); I can attest to the fact that no such problem was present in a number of other NEX bodies. Even the most cursory investigation of the vast source of full-size images taken using NEX bodies and quality lenses mounted on adaptors will show that in a practical sense, if you are willing to put up with the compromises inherent in this type of setup, the results can be outstanding."
Luke: "I put my Zeiss ZM 21mm ƒ/2.8 on the A7r we have here [Luke works at Imaging-Resource —Ed.]. I'm sad to report that color shifts were severe and covered most of the frame. There was also severe darkening of the image away from the center, way too much to simply call vignetting. I chose this lens carefully, based on the experience of other users, to avoid this problem with my early NEX cameras. It worked well, with visible but very minor color shifts and vignetting. It stayed glued on my NEX-3 for years. Well, I just saved myself a couple grand."
Therein lies the beauty of the Micro 4/3 system. The sensor is smaller so it only uses the central area of the lens which tends to be the sharpest in many cases. It also avoids the acute incidence angle at the edges.
Posted by: Pete Teoh | Wednesday, 16 October 2013 at 04:27 PM
I think LensRentals' adapter article gets a bit overblown, mainly for the reason I stated in its comments: alignment is a big issue with wide-angles shooting at infinity or at planar targets, but much less so with three-dimensional subjects that aren't at infinity. Also, it would be wise to see whether Roger can improve his test setup, and whether someone else could perform tests to verify the results.
As for Bruno's post, he brings up a valid point. I would imagine it would be easy enough to test the Leica by taking pictures with a 6-bit-coded lens, first with the coding in place and then with it taped over, and comparing the corner sharpness of the images.
Personally, I'm getting stellar results on a Nex-7 with a ZM Biogon 35/2 and Planar 50/2, which have changed my mind on adapting lenses and also made using a Nex enjoyable.
Posted by: Oskar Ojala | Wednesday, 16 October 2013 at 04:44 PM
Oh! Em! Gee!
Two years ago I spent more money than my whole collection of 35mm gear cost for a digital body: the Sony a850. I already had a Minolta Maxxum system and 5 lenses that would fit the Sony. I’ve never been disappointed in it. But what do I know?
Now it’s worth half what I paid for it. Why? Well, first there was the a77 and a99. Now there’s the a7 and a7r.
Oh, and now I find out my old film lenses really won’t work that well on digital bodies (even without adapters) because “a slab of optical material—a sensor's cover glass, or an optical low-pass filter stack—has a refractive index higher than one, and will spatially shift the light rays impinging upon it…”
And, just a few weeks ago I bought a small Panasonic Lumix camera, only to discover that my computer won’t successfully run its accompanying software because the clock speed is too slow.
About six months ago I also upgraded my Maxxum system by buying a pristine Maxxum 7 body for $180! It is flawless. And I gotta tell ya, the Maxxum 7 and Tri-x is looking better all the time.
Posted by: David Brown | Wednesday, 16 October 2013 at 04:45 PM
I, too, discovered long ago that using adapted lenses simply does not produce any advantages. I've a long list of cameras I've strapped my costly inventory of M glass onto, all with largely the same meh results.
Not that I won't try it once again with my A7r. But I expect another "meh" coming on.
The design and electronic correspondence between today's cameras and their lenses often effectively, and subtly, eliminates the practical advantages of interbrand adaptation. Yes, there are always guys who swear their Leica 35mm looks better on their Oly or Sony than the kit lens. But I've never seen any proof of such claims. None.
Posted by: Kenneth Tanaka | Wednesday, 16 October 2013 at 05:10 PM
I used my Leica lenses on M4/3 for a while and whilst the results were good ... the experience was not. After a while, the 20mm Panasonic won through for me in almost every respect (ergonomics and image quality).
On another note, some of the older lenses just don't cut it on these modern sensors (especially in the corners). I have a pre-asph 35/2 and after about 4 months of shooting, I bought the modern aspherical version.
The deconvolution processing explanation makes sense, but I'm not entirely convinced (even though I hear the M9 does smooth DNG files at ISO400 and up).
On my CV 15, colour shifting due to this non-perpendicular-ray effect is really evident in the corners. Thankfully, there's a free app to fix that in post:
https://sites.google.com/site/cornerfix/
Pak
Posted by: Pak-Ming Wan | Wednesday, 16 October 2013 at 05:13 PM
All the slop in those lens adaptors makes you wonder how anyone ever got anything sharp out of a view camera.
As an adaptor user who's had a mix of results ranging from really good to really idiosyncratic to just plain awful, here are some observations.
Alignment with an adaptor is exactly as bad as using an extension tube but not as bad as using a tele-extender or a bellows.
Alignment of the mount itself is only important at infinity, on a copy stand, or when photographing flat subjects like paintings. In the latter two cases it's trivially easy to correct for that sort of misalignment, about the same as it was for film cameras and enlargers.
Flange depth with modern floating-element lenses or zooms is a much bigger deal. The Nikkor 28mm ƒ/2.8 AIS in particular has been super sensitive to this problem.
Screw-mount Leica and Pentax thread lenses have zero problems with flange depth or alignment in my experience. Manual-focus Canon FD breech-lock lenses should work well too, but I haven't tried them. Various bayonet mounts are just as bad as they were for third-party film camera lenses and extension tubes.
Some old lenses have astonishingly curved fields of focus. Goofy adaptors have no impact one way or another with them.
If you think that adapting old lenses is a cheap and convenient way to get the same results as new native mount lenses you are going to be very disappointed.
If you think that adapting old lenses is a cheap and *convenient way to use old lenses without hunting down the old cameras that go with them, repairing and calibrating said cameras, buying and processing film, scanning and so forth, you might be on to something.
*compared to wet plate photography for instance.
The micro-lens + coverplate + exit pupil problem is not difficult to solve optically and from the materials Sony presented it looks like they have optimized for an exit pupil fairly close to the image plane. I hope it doesn't mess things up with telephoto lenses. The Nikkor 200mm f/4 is pretty wonderful on the NEX.
Posted by: hugh crawford | Wednesday, 16 October 2013 at 05:27 PM
Further problems with adapters... I was just reading Ming Thein's article about the new FF Sonys here, and he says in a side note:
"Don’t think you can get away with adaptors: the planarity of such adaptors is going to be absolutely critical, especially with such short flange distances and resolution numbers. You’ll actually be able to see the effects of a cheap, out-of-plane adaptor – it looks a little like a tilt. (I know this because I tried Hasselblad lenses on my D800E; none of the three adaptors I obtained had sufficiently tight tolerances to avoid this problem.)"
Posted by: Eolake Stobblehouse | Wednesday, 16 October 2013 at 05:27 PM
If I understand the explanation correctly, I suspect this isn't as much of an issue for photographers who, like me, prefer to see with a longer-than-normal lens.
Posted by: Auntipode | Wednesday, 16 October 2013 at 05:32 PM
" ... adapters are usually a roadblock on the path to optical excellence."
But they can then help the artist achieve a look that optical excellence cannot deliver without filters and PS.
Posted by: darr | Wednesday, 16 October 2013 at 05:37 PM
To clarify: my comments in the previous post apply to lenses designed for SLR cameras, enlargers and long-focus rangefinder cameras. Short-focus (i.e., wide-angle non-retrofocus) lenses look like dog poo on digital, but that has nothing to do with whether there is an adaptor involved. M-mount Voigtlander 15mm and 12mm lenses look pretty bad on an M9.
Posted by: hugh crawford | Wednesday, 16 October 2013 at 05:44 PM
The thinness of the cover glass/OLPF/IR filter in Leica's digital M-mount cameras has been mentioned by Zeiss in one of their lens papers on the relative merits of symmetric lenses (older M-mount lenses), asymmetric (more modern telecentric) lenses, and digital sensors. See "Wide-angle lenses and digital sensors" on page 11 of H. H. Nasse's paper on the Distagon, Biogon and Hologon.
They show some MTF (center to corner plots) for a Leica thickness cover glass (0.5mm) and "thicker filter in front of the digital sensor" which is the more typical 2.5mm cover glass plus OLPF stack. You can see the clear difference between the two.
Keeping the stack above the sensor thin is the key critical criterion in getting the older symmetric wide M mount lenses to work well with the digital Leicas.
Note: This also seems to be the primary reason Leica omitted an OLPF (anti-aliasing filter) from their cameras. Not to make the camera "sharper" (or more susceptible to moiré), but to make it compatible with their older symmetric lenses. It seems to have started a trend, though, even among camera companies that omit the OLPF but keep a thick cover glass (e.g. the Fujifilm X-Pro 1 has no OLPF but a 2.5mm cover glass ... ugh!)
The NEX7 sensor has a 2.5mm-thick sensor cover glass and an OLPF, which is pretty typical of a sensor designed for telecentric lenses. This resulted in "odd colors" in the corners when using old (symmetric) M-mount lenses with it, as the rays hit adjacent photosites (so adding "color" when demosaicked). But this issue is reported as fixed in the NEX6. Sony haven't (AFAIK) said what they changed between the two sensors, but cover glass thickness, a weaker (perhaps thinner?) OLPF and microlens changes are all possible.
The CMOSIS sensor that Leica use in the M240 does have a microlens design that makes the microlenses at the edge of the sensor higher (and I suspect stronger) but I guess this might have more to do with compensating for vignetting from off axis rays. And a thinner cover glass.
The Micro 4/3 system (and pretty much all the other mirrorless and DSLR systems) is designed for telecentric lenses (which keep the rays closer to normal even at the edge of the frame). The smaller sensor does avoid the extreme ray angles of non-telecentric lenses, at the expense of the 2x crop factor, but it will suffer similar issues if you use an older symmetric lens on an adaptor.
The IR filter is a multilayer dielectric mirror (tuned for the IR) on top of the cover glass. It's not the thickness of the cover glass that provides the IR filter but the much thinner multilayer on top of it. It's reflective not absorptive.
Posted by: Kevin Purcell | Wednesday, 16 October 2013 at 05:47 PM
Thom Hogan has written no less than three articles on related issues about these new Sony cameras. http://www.bythom.com/
Posted by: John Krumm | Wednesday, 16 October 2013 at 05:49 PM
The five-pixel example remains a hypothesis until you know the refractive index of the glass over the sensor. Are we to believe this is a new issue for the digital camera manufacturers?
The point of adapted lenses usually includes fondness for the character of the lens. I like the imperfections of the Sonnar formula, for example. So we have a tradeoff between the character of the image and a guess at the amount of pixel-level disarray.
As for mechanical interfaces, if an adapter is bad, try another one.
Posted by: Charles | Wednesday, 16 October 2013 at 07:25 PM
Wow, I'm glad I don't play in that rarefied air of Leica lenses. What's the marginal image quality bump for every $1 spent above $1000 for a lens? But then again, I've never understood the luxury goods market.
Posted by: Art in LA | Wednesday, 16 October 2013 at 07:48 PM
Mr. Masset is only making assumptions and is incorrect about Leica's IR filter. Oddly enough, I was just reading a post from a few years ago about Leica's IR filter, from a gentleman who works in the industry:
"The Leica IR filter is a thinner version (0.5mm) of the standard plate of the cheapest possible material, the Kyocera BS7. Other manufacturers choose a more expensive plate material that's half the thickness (0.28mm is standard) and still gives better cut-off steepness, better partial refraction index and lower reflection index than the 0.5mm Leica - even in the base models. Here you have half the angle-dependent problem.
So when Leica say "we chose a thinner plate" they're really feeding you a lot of BS, true only in the sense that they could have chosen a broken beerbottle instead.
The angle-dependent efficiency of the sensor is clearly stated in the (now withdrawn and "classified", probably by Leica request) Kodak spec-sheet for the sensor - and it's "not good" to "really bad" depending on how you compare it to modern constructions. Loss at 25º incidence (with compensating microlenses) is more than twice that of comparable Canon/Sony/Nikon sensors - mostly due to the depth of the cell structure and the low fill-factor. In fact it's a lot worse than the 16x smaller cell compact-camera 1.65micron SONY BSI sensor...
The Leica has built in corner colour and vignette compensation in a firmware database, other manufacturers don't have to do this. Especially the (predictable) colour contamination between the cells would have gotten other brands' developing teams either sent back to the drawing board, or fired. Please do note that the corner colour cast follows the RGGB quad layout symmetry of the Bayer... The birefringence theory that many Leicaphiles state as the cause is purely fictional. Cell-cell leakage before the rays hit the sensor surface proper is the mathematical model that the Leica firmware correction is built after..."
Posted by: GH | Wednesday, 16 October 2013 at 07:59 PM
Sounds like a strong argument in favor of the telecentric lens designs used by DSLRs.
Posted by: Dave in NM | Wednesday, 16 October 2013 at 08:10 PM
I've tried my film-era 35mm and 50mm Summicrons on a number of non-Leica digital cameras without ever managing to convince myself of anything special about their rendition. Ultimately all they did for me was to help demonstrate just how good modern designed-for-digital lenses really are.
There's also a more prosaic reason I gave up on old -- sorry, "legacy" -- lenses. Focusing wide open and then manually stopping down to the working aperture gets tiresome awfully quickly. It's fine for a bit of fun, but for a working photographer it feels like a klunky and unnecessary extra step.
Posted by: JK | Wednesday, 16 October 2013 at 08:54 PM
Interesting but difficult for me to understand. I had trouble with the Cone of Silence and now there is a Cone of Light?
Posted by: Grix | Wednesday, 16 October 2013 at 09:16 PM
Well I have no expertise to comment on that highly technical explanation but I have a different reason to find the announcements less exciting than hoped.
I've been shooting with a Nex-7 daily for over a year and a half now and beginning to run into the problems electronic devices all seem to run into between 18 - 24 months so I was really looking forward to my next camera being one of the exciting new Sonys. So, if I could justify spending another $1700 I'd have gone for the A7. (Not only was the A7r even more expensive, but I'd have needed to buy a new computer to deal with those gigantic files, and a new hard drive).
But it turns out my e-lenses will only work with a lot of vignetting or as a 10MP crop sensor camera. If I want to buy a new lens (other than the kit lens) it seems like it'll cost me the guts of another grand (or more). Or, I can buy a $300 adapter to use my old Minolta or Nikon glass and turn my tiny street camera into a tank with an adapter and a honking big lens on the front of it - kind of defeats the purpose.
Also - as a street photographer, especially in the NY subways, I count on being unobtrusive and silent but, according to one of the early previews I've seen, the full-frame shutter is noticeably noisy.
So basically, I'd have to spend $2700 to get a noisier 24 MP camera than the one I have. I realize it would be better in many other respects (better menus and buttons, faster auto-focus, weather-sealing, etc.) but those aren't $2700 features for the way I shoot.
So maybe I'll buy another Nex-7 or a Nex-6 - seems like they'll be much cheaper alternatives that let me shoot at the same image quality I've got now, quietly, unobtrusively, and with the lenses I already have.
Adam
Posted by: Adam Isler | Wednesday, 16 October 2013 at 09:39 PM
i have to say that in my experience with the NEX-7 both of these issues are not really field relevant. my best manual focus lenses from 20-50 years ago consistently outperform the native glass i try out (the same was true when i shot 4/3 before somebody starts disparaging NEX lenses).
the astigmatism caused by refraction through cover glass is really only a problem for wide angle symmetric rangefinder lenses and it's just as much a problem for leica as it is for sony. the best solution (other than using one of leica's newer more retrofocal designs) is to minimize the refraction of the cover glass as much as possible by choosing the appropriate materials and throwing out the ones you don't really need (like AA filters), and there's a chance sony may have actually put some thought into that for the a7r. software correction of raw files can be applied to the sony as well (including PSF deconvolution). though no profiles exist yet you can bet somebody will start working on them if there is enough demand.
using an adapter doubles the error you can get due to asymmetry versus what you can get from a non adapted native lens (asymmetry of native lenses is also measurable and is one of the sources of sample variation btw). it's pretty easy to test your adapter though and tighten it (screws and springs) if necessary to make sure it performs well. there will still be some offset that is measurable on an optical bench, but it won't affect what you see from the lens on todays sensors. here's the most difficult example i can find of my own: http://www.fredmiranda.com/forum/topic/1190503/6#11598449
it's a rather heavy lens (rokkor 58/1.2, ignore the rest they aren't that good in the corners on film either) shot wide open (and all the other apertures too) at infinity on a $20 adapter. if we were going to notice any detriment in optical performance in the corners due to adapter offset it would be here, but every corner is just as sharp as the example one.
i've seen many examples from other people as well of adapted lenses outperforming native glass on both FF canon and nikon cameras and high pixel density aps-c cameras. for example the leica m 24/3.8 super-elmar soundly beating the sony zeiss 24/1.8 (particularly in the corners where a bad adapter should lead to more problems for the leica).
anyway, while these issues are certainly real i doubt they'll be any more of a serious detriment to achieving excellent performance on the a7r than they are on the m240 and in many cases i suspect the performance will be better on the sony. the experience of using the lenses will certainly be much more enjoyable to me too since i'll actually be able to frame properly and see actual dof when i shoot (i hate rangefinders).
Posted by: thomas hobbes | Wednesday, 16 October 2013 at 09:47 PM
I'm betting that within the next 5 years, we'll be seeing sensors made of flexible materials that can be curved. This will end the concerns about lack of perpendicularity of the light ray-to-sensor path. Material science engineers are headed in this direction now.
Posted by: Jamie Pillers | Wednesday, 16 October 2013 at 09:57 PM
I'll see what time tells. In the meanwhile I'll keep shooting my EM5 with my excellent primes from Oly, PanaLeica and Voigtlander. For me, the only reason to get an a7 is to be able to shoot my Leica M lenses. I very seldom go over 12x16" print.
Posted by: Marcelo Guarini | Wednesday, 16 October 2013 at 10:41 PM
I just do it and evaluate the results.
RF lenses perform best on cameras optimized to use them, particularly short focal length lenses ... The Leica M digitals and the Ricoh GXR A12 Camera Mount are the best choices for them. SLR lenses can do very well on a lot of mirrorless bodies, but it's on a one by one basis for some short focal lengths. Either RF or SLR lenses 50mm or longer work well on every digital body I've fitted them to (assuming mount register compatibility), including the M9, GXR, Nikon, Canon, FourThirds, and Micro-FourThirds bodies.
My strategy is to use M-bayonet and LTM lenses on the M9 and GXR. On Micro-FourThirds and FourThirds, I use native lenses up to 50mm focal length (my most used range) and adapt a small selection of Micro-Nikkors and Leica R lenses for when I want telephoto reach or have special purpose uses in mind.
It works out very nicely this way.
Posted by: Godfrey | Thursday, 17 October 2013 at 01:32 AM
I think this is nothing new, this business of adapter performance. It is very well known, for example, that wide-angle M-mount lenses have problems when adapted to NEX cameras. I have a Fuji X system, and Fuji makes a high-quality M-to-Fuji-X-mount adapter that is rock solid and has no play. It also recognizes the focal length of the lens, and has provision to fine-tune lens corrections.
Of course, it is much more expensive than cheaper versions. You do get what you pay for. Fuji were very clever in doing this, and I think Sony will do a similar high-quality adapter if they see that as a significant market option. Fuji were also very clever in making a top-quality wide angle for the Fuji X mount, the lovely 14mm lens. This lens prevents people from trying to use wide-angle M-mount lenses on the Fujis.
Posted by: Paulo Bizarro | Thursday, 17 October 2013 at 04:28 AM
What about the refractive index of film? The sensitive chemistry isn't laid on top of the film, it's embedded in it at various depths. So the refractive index argument applies to film too.
Posted by: Chris Malcolm | Thursday, 17 October 2013 at 06:32 AM
I guess that the number of angels that can be accommodated on a pinhead will never be conclusively established. Michael Reichmann on LL seems to think that the Leica lenses work fine on these new "must-have" wonder-cameras, although he doesn't mention which adaptor he's using as far as I recall. He's tried the Novoflex adaptor with the Nikon 14-24 and it's dandy, apparently. Maybe he got a good copy of the adaptor.
Posted by: Roy | Thursday, 17 October 2013 at 07:39 AM
Funny, I read the above post and thought it the exact reason why one should buy the Sony A7R over the overpriced Leica M9! The Sony A7R, without an OLPF, will no doubt have a thinner optical stack than the M9, and will thus show less than the five-pixel blur estimated in the post. Proof will be in the tests.
Also, if you really care to add a $5K lens to a camera, you really should contract someone to make a specific mount adapter for your camera and your lens! Most of the problems people see arise because no two cameras or lenses are the same!
Posted by: DavidB | Thursday, 17 October 2013 at 08:01 AM
In light of Luke's featured comment, it looks like the problems faced by covering a full-frame sensor from such a short flange-to-sensor distance are currently greater than the ability of micro-lenses to correct it in all cases. That said, the beam tilt through a ZM 21/2.8 is probably close to a worst-case scenario: look at the size and shape of the rear element, and consider also that it probably barely clears the shutter (other similar designs, such as various 35mm Biogons, won't even clear the shutter). So I believe that there is still a fair bit of testing required before we can write off the use of lenses via adaptors on the A7/A7R.
As an aside re Jaimie's comment on curved sensors, both the sensors themselves and lens designs for them have already been patented, and by Sony, no less.
Posted by: John Flower | Thursday, 17 October 2013 at 09:59 AM
"Please do note that the corner colour cast follows the RGGB quad layout symmetry of the Bayer... The birefringence theory that many Leicaphiles state as the cause is purely fictional. Cell-cell leakage before the rays hit the sensor surface proper is the mathematical model that the Leica firmware correction is built after..."
Great and accurate assessment.
"What about the refractive index of film?"
The emulsion is at most a few tens of micrometers thick, not millimeters, and the refractive index of the gelatin is close to unity.
Posted by: Semilog | Thursday, 17 October 2013 at 09:59 AM
@ Oskar Ojala
> Personally, I'm getting stellar results on a Nex-7 with a ZM Biogon
> 35/2 and Planar 50/2
Typical normal — e.g. the Planar 50 — and tele lenses, even for rangefinder cameras, have a quite long exit-pupil-to-image-plane distance. The rays are therefore less tilted, and the aberrations induced by the cover glass are much smaller, especially on an APS-C sensor like the NEX-7's, compared with a full-frame sensor.
Consider these two dimensions:
1) the length between the base of the light cone emerging from a lens, and the apex of said light cone. The base of this cone is located at the exit pupil, and its apex is located on the imaging plane. The length of the light cone is the "exit pupil to image plane distance"
2) the diameter of the light cone's base. This is simply the apparent diameter of the exit pupil — in this case, the apparent diameter of the opening formed by the aperture blades when the lens is set at f/2.8
Set your Planar 50mm and the Biogon 35mm to the same aperture, say, f/2.8, and look at them from behind.
By the very definition of a lens' f-number, the ratio of the light cone's length to the light cone base's diameter must be 2.8.
Suppose, for the sake of argument, that the apparent diameter of the exit pupil of the Planar 50 set at f/2.8 is 18mm.
Then, necessarily, the light cone's length, and therefore the Planar 50's exit pupil to image plane distance, must be 2.8 * 18mm = 50.4mm
Now, suppose, again for the sake of argument, that the apparent diameter of the exit pupil of the Biogon 35 set at f/2.8 is 13mm
The Biogon 35 would then have an exit pupil distance of 2.8 * 13mm = 36.4mm
These numbers are for illustration purposes only, but indicate how one could tell, simply by comparing the apparent sizes of the exit pupils when the lenses are set to the same aperture, which lens has the shorter exit pupil distance and is more likely to suffer from cover glass and OLPF-induced aberrations.
Anyway, based on your experience, it seems that the Biogon 35 has an exit pupil distance that is long enough that the tilt of the principal ray it emits towards the corners of an APS-C-sized sensor is pretty limited.
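Bruno's pupil arithmetic is easy to check with a few lines of Python. The pupil diameters below are the same illustrative guesses used above, not measured values, and 21.6mm is the half-diagonal of a full-frame sensor:

```python
import math

def exit_pupil_distance(apparent_pupil_diameter_mm, f_number):
    """Cone length / cone base diameter = f-number, so the exit-pupil-
    to-image-plane distance is simply pupil diameter times f-number."""
    return apparent_pupil_diameter_mm * f_number

# Illustrative figures from the text, not measurements:
planar_50 = exit_pupil_distance(18.0, 2.8)   # 50.4 mm
biogon_35 = exit_pupil_distance(13.0, 2.8)   # 36.4 mm

def corner_tilt_deg(exit_pupil_distance_mm, half_diagonal_mm=21.6):
    """Tilt of the principal ray at the corner of a full-frame sensor."""
    return math.degrees(math.atan(half_diagonal_mm / exit_pupil_distance_mm))

print(corner_tilt_deg(planar_50))  # ~23.2 degrees
print(corner_tilt_deg(biogon_35))  # ~30.7 degrees
```

The shorter pupil distance translates directly into a steeper corner ray, which is the quantity the cover glass aberrations depend on.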
=================================================
@ David Brown
> Two years ago I spent more money than my whole collection of 35mm
> gear cost for a digital body [..] Oh, and now I find out my old film
> lenses really won’t work that well on digital bodies
If your lenses could be used on a Minolta Maxxum and Sony A850, they are necessarily SLR lenses, not rangefinder lenses.
My comment was about lenses that have a short exit-pupil-to-imaging-plane distance, as is often the case with wide-angle rangefinder lenses designed in the film era.
SLR lenses, even if they are wide-angle, necessarily have quite long exit pupil distances, as their optics must, at the very least, physically clear an SLR camera's reflex mirror box.
SLR lenses, as well as rangefinder lenses with focal length >= 50mm, are thus of no concern in my discussion.
=================================================
@ Auntipode
> If I understand the explanation, for photographers who prefer to
> see with a longer than normal lens, as I do, I suspect it isn't as
> much an issue.
Prezactly. Lenses with a focal length >= 50mm, even those designed for rangefinder cameras, tend to have a quite long exit pupil distance, which minimizes the tilt of the light rays even on a full-frame sensor's corners, and this reduced tilt minimizes the aberrations induced by the cover glass or OLPF stack.
=================================================
@ Kevin Purcell
> The thinness of the cover glass/OLPF/IR filter over the Leica
> digital M mount cameras has been mentioned by Zeiss in one of
> their lens papers.
Very interesting paper indeed. Thanks for the pointer!
Quoting from page 12 of that paper (emphasis mine):
"Lenses with a very large beam tilt react in a much more sensitive manner to a change of refractive index in the image space [..] If the filter plate is not considered in the design of the lens, the edge definition will suffer. The effect of the additional path through the glass grows exponentially with the beam inclination"
The excerpt above highlights the issue that piqued my curiosity:
1) if the effect of the additional path through glass grows exponentially, how come Leica successfully managed to increase the thickness of the sensor's cover glass from 0.5mm (on the Leica M8) to 0.8mm (on the M9)?
2) increasing the sensor's size from the M8's APS-H to the M9's full-frame will necessarily increase the tilt of the rays in the image corners; in theory, lenses should then react in a "much more sensitive manner" — i.e. the IQ must presumably degrade in the corners of the larger sensor.
3) old rangefinder lenses designed in the film era certainly did not consider the effect of a filter plate or sensor cover glass. Yet, Leica's clients presumably expect (or hope) such lenses to perform at least adequately even on a full-frame Leica M9.
Hence my suspicion that some kind of technological breakthrough — e.g. in-camera deconvolution — enabled Leica to adequately support their film-era lenses, thereby alleviating the need to quickly rework their optics to make them more compatible with digital sensors — e.g. by making them more telecentric.
=================================================
@ Kevin Purcell
> This also seems to be the primary reason Leica omitted an OLPF
> (anti-aliasing filter) from their cameras [..] it seems to have
> started a trend even though camera companies that omit it use a
> thick cover glass (e.g. Fujifilm X-Pro 1 has no OLPF but a 2.5mm
> cover glass... ugh!)
Omitting the OLPF reduces aberrations, but an additional benefit is that it improves the system's spatial frequency response, and hence, the image's achievable resolution.
As I explained before, an OLPF based on birefringence applies a cosine-shaped envelope to the sensor's frequency response.
An ideal OLPF, on the other hand, should look like a brick wall in the frequency domain, with no attenuation right up to the Nyquist frequency, and total attenuation beyond Nyquist. Ideal brick wall LPFs tend to be acausal, and are generally not realizable in the physical domain.
The optical attenuation in the high spatial frequencies caused by the birefringent OLPF's cosine-shaped envelope can, in theory, be reversed by the downstream signal processing — see it as a de-emphasis, followed by a re-emphasis.
In that sense, it would be conceptually similar to, e.g., the RIAA equalization of analog vinyl records, where pre-emphasis is followed by de-emphasis.
Still, the OLPF's de-emphasis irretrievably degrades some high-frequency signals, due to noise and the limited precision of the pixel encodings. It may thus make sense to dispense with the OLPF altogether so that the frequency response is flat up to the Nyquist frequency, thus preserving fine image detail.
This, I guess, is the rationale of Fuji's LPF-less approach. Fuji's pseudo-random X-Trans pixel layout can, in some sense, be considered a noise-shaping method, where the aliasing noise — a.k.a. moiré — caused by frequency components above Nyquist is spread over a larger frequency band to make it less quasi-periodic, and therefore less immediately perceptible.
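For readers who want numbers: a common textbook model of a two-spot birefringent OLPF has an MTF envelope of |cos(pi·f·d)|, where d is the beam displacement. With d equal to one pixel pitch the response nulls exactly at Nyquist, but has already lost about 30% at half-Nyquist — the attenuation a de-emphasis/re-emphasis pipeline would have to claw back. This is a generic model, not the specification of any particular camera's filter:

```python
import math

def olpf_mtf(f_cycles_per_mm, split_mm):
    """Textbook MTF envelope of a two-spot birefringent OLPF with
    beam displacement `split_mm`: |cos(pi * f * d)|."""
    return abs(math.cos(math.pi * f_cycles_per_mm * split_mm))

pitch_mm = 0.0068              # assumed 6.8 um pixel pitch, M9-like
nyquist = 1 / (2 * pitch_mm)   # ~73.5 cycles/mm

print(olpf_mtf(nyquist, pitch_mm))        # ~0: total attenuation at Nyquist
print(olpf_mtf(0.5 * nyquist, pitch_mm))  # ~0.707 at half Nyquist
```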
=================================================
@ Kevin Purcell
> The IR filter is a multilayer dielectric mirror (tuned for the
> IR) on top of the cover glass. It's not the thickness of the
> cover glass that provides the IR filter but the much thinner
> multilayer on top of it. It's reflective not absorptive.
Unfortunately, the transfer function of dielectric filters is quite dependent on the ray's incidence angle.
A dielectric stack that reflects infrared when the ray is at normal incidence can start reflecting rays even in the visible spectrum — e.g. red — when the ray is tilted.
The resulting angle-dependent spectral content alteration can lead to color casts, which is an undesirable risk with lenses that can emit light cones with a wide range of tilt angles — e.g. lenses with a short exit pupil distance.
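The blue-shift described here is usually modelled, to first order, as lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2). The effective index n_eff = 1.8 below is an assumed, typical value chosen for illustration; real stacks vary:

```python
import math

def shifted_cutoff_nm(cutoff_normal_nm, incidence_deg, n_eff=1.8):
    """First-order model of a dielectric filter's cutoff blue-shift
    with incidence angle; n_eff is an assumed effective index."""
    s = math.sin(math.radians(incidence_deg)) / n_eff
    return cutoff_normal_nm * math.sqrt(1 - s * s)

# A 700 nm IR cutoff at normal incidence moves to ~672 nm at 30 degrees
# of tilt, i.e. the filter starts nibbling at deep red:
print(shifted_cutoff_nm(700, 0))
print(shifted_cutoff_nm(700, 30))
```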
Dielectric filters are composed of layers whose thickness is of the order of the wavelength of the light rays involved, and their total thickness is but a few microns.
The assertion that IR rejection on a Leica sensor is entirely done using thin, micron-thickness dielectric filters is thus also inconsistent with the fact that Leica decided to increase the cover glass thickness from 0.5 to 0.8mm.
This, IMHO, is why Leica uses an IR-absorbing glass, and the reason why Leica increased its camera's sensor cover glass thickness from the M8's 0.5mm to the M9's 0.8mm, thereby eliminating the cause of complaints e.g. as regards the rendering of black synthetic fabrics.
Posted by: Bruno Masset | Thursday, 17 October 2013 at 10:17 AM
The refraction due to cover glass should be consistent, though, not random. Your argument for a 5-pixel dispersion seems to be based on random refraction.
Posted by: David Dyer-Bennet | Thursday, 17 October 2013 at 11:52 AM
Good evening to you all from Mumbai.
Makes me wonder what we are striving for. Great images or great image quality? The two are very different and don't always go hand in hand. In fact, most shots worth taking are technically challenged anyway. They pose problems such as being too far away, having the wrong lens mounted, low light levels/slow shutter speeds, not being able to stop down for maximum sharpness :), ISO 3200, not being able to see the frame...etc., etc. I think getting bad corners is hardly one of them.
Right now I am browsing some great shots taken in 2012 and I find that most of them have been shot wide open...mostly because that was the only way to take the shot in the first place.
I think technically perfect images can be a bit sterile. Block some shadows. Add some vignetting. We are talking about art, aren't we?
I routinely and deliberately flare my lenses. I love those color aberrations too. What a look. I think being able to use those old lenses is a great thing. They will bring some of the character they are known for into the image. Well, to me, aberrations = character when it comes to old lenses.
I know this is more of a photojournalistic point of view...but these small cameras beg to be used for this kind of photography anyway. You buy these toys for that. And you buy them because they are full frame.
We have our Nikons for the studio.
Posted by: Anurag Agnihotri | Thursday, 17 October 2013 at 11:53 AM
Dear Chris,
The total emulsion thickness in modern films is barely a tenth the difference in thickness between the different cover plates that Bruno is writing about. So, even if the refractive index argument applied to film as well (and it doesn't) the size of the effect would be one tenth as great, or just a few microns in terms of on-film resolution.
So, no, it's not something you have to pay attention to with film and it doesn't relate to Bruno's concerns (about which I have no opinion).
A possibly-interesting side tidbit, though. Towards the end of the film era, using the sharpest camera lenses with the sharpest 35mm films, one COULD see the effect of emulsion thickness on image sharpness. That is, at the ultimate limits of achievable resolution, the plane of focus was thinner than the emulsion thickness of the films. With color films consisting of multiple layers stacked on top of each other, one could look at the resolutions achieved in the different layers and see this phenomenon, if one knew what to look for.
Not of any practical import, but entertaining.
pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com
======================================
Posted by: ctein | Thursday, 17 October 2013 at 03:06 PM
@ Charles
> The five-pixel example remains a hypothesis until you know the
> refractive index of the glass over the sensor. Are we to believe
> this is a new issue for the digital camera manufacturers?
A common type of optical-quality IR absorption glass — Schott's KG3, which is probably a ZK7-type glass with ionic implants — has a refractive index of around 1.5, and I thus selected that RI value as a realistic example.
AFAIK, there's no public information indicating that Leica uses some "exotic" optical material with a very low RI — e.g. fluorite — to seal the sensor's package.
The exit pupil issue doesn't really affect other camera manufacturers, as SLR lenses tend to have quite long exit pupil distances due to the simple fact that the lens' optics must lie beyond the reflex mirror.
=================================================
@ GH
> Mr. Masset is only making assumptions and is incorrect about
> Leica's IR filter. Oddly enough, I was just reading a post from
> a few years ago about Leica's IR filter, from a gentleman who
> works in the industry:
>
> "The Leica IR filter is a thinner version (0.5mm) of the standard
> plate of the cheapest possible material, the Kyocera BS7."
Consider these factors, which laypeople might not be aware of, but that might well be relevant when selecting the material to seal a sensor package:
1) the chemical compatibility of the selected IR absorption glass with e.g. the (epoxy ?) adhesive used to bond the glass to the sensor's (ceramic ?) package
2) the thermal expansion ratio compatibility between the selected cover glass material and the sensor's package
3) the thermal resistance of the cover glass material, as the circuit board carrying the glass-covered sensor may actually go through a fairly hot reflow oven to solder the PCB's components.
Laypeople and random Internet forum denizens tend not to be aware of such factors, in particular because they are irrelevant for DSLRs: there's a high probability that DSLR sensor packages, unlike Leica's sensors, are covered with plain glass, instead of IR-filtering glass.
The fact that IR conversions of DSLRs — e.g. for astronomy or IR photography applications — are possible at a fairly low cost is a strong indication of this. It's highly unlikely that those shops that offer DSLR IR conversion services go to the trouble of removing a bonded cover glass from a sensor's package. Where would they perform such a task anyway? How would they prevent the contamination of the sensor's surface? Do such conversion shops have access, say, to a semiconductor factory-class clean room?
It's thus far more likely that with most DSLRs, the IR filter is an element that's distinct from the sensor package, and that can be fairly easily removed and replaced with normal optical glass to make the camera sensitive to IR.
Therefore, it's unlikely that considerations like IC chip package bonding material compatibility, thermal expansion coefficient compatibility e.g. with a ceramic chip package, or reflow soldering temperature resistance / optical properties stability of IR glass material would be present, even remotely, in the list of parameters a random Internet forum denizen would consider when pontificating about the particular choice of an IR-filtering optical material by Leica or Kodak.
=================================================
@ GH
> Loss at 25º incidence (with compensating microlenses) is more
> than twice that of comparable Canon/Sony/Nikon sensors - mostly
> due to the depth of the cell structure and the low fill-factor.
CCD sensors do not require the per-pixel transistors (select, reset, buffer etc.) that are present on a typical CMOS sensor.
A CCD chip's layering (metal interconnects and insulating layers above the semiconductor substrate) thus tends to be thinner than with CMOS sensors.
A consequence of the thinner layer stack above CCD sensors is that they have wider ray acceptance angles, as the photodiode sits closer to the chip's surface, instead of being buried in the canyon formed by the various interconnecting layers.
In fact, the thickness of a CMOS sensor's interconnect layers, and the resulting restriction of acceptance angles is one of the factors that have pushed the development of back-side illumination for CMOS sensors.
I'd also be interested to see evidence for your peculiar assertion that the CMOS sensors used by Canon/Sony/Nikon have an acceptance angle reaching 25º.
It so happens that DxO has tested the incidence of lens aperture on a sensor's sensitivity.
Quoting from DxO's site:
"Some sensors can have a loss of more than 1 Ev at f/1.2."
This means that with some sensors, a lens that emits a f/1.2 light cone doesn't generate any more photoelectrons in a pixel than a lens that is 1EV slower than f/1.2 — i.e. f/1.7.
Imagine a thin circular lens that has a diameter of 41.6mm, and a focal length of 50mm. The light cone it emits thus has a base diameter of 41.6mm, and a length of 50mm. By the very definition of a lens' f-number, such a lens is thus a 50mm/41.6mm = f/1.2 lens.
The apical angle (angle at the apex) of the f/1.2 light cone is thus simply 2 * arctan ( .5 / 1.2 ) = 45.24º.
The acceptance angle required for a pixel located at the cone's apex to successfully catch all the light contained in such a light cone must thus be at least 45.24º/2 ~= 22.62º
The apical angle of a f/1.7 light cone is 2 * arctan( .5 / 1.7 ) ~= 32.8º ; the acceptance angle required to entirely catch such a light cone is 32.8º / 2 = 16.4º
The implication of DxO's measurements is that the outer part of the f/1.2 light cone that lies beyond the f/1.7 light cone is, in fact, not generating many photoelectrons — i.e. the cone's outer part is not "seen" by the pixels.
This strongly hints that with many CMOS sensors, the pixel's acceptance angle doesn't reach much beyond 17º.
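The acceptance-angle arithmetic above fits in one helper (same thin-cone geometry as in the text):

```python
import math

def cone_half_angle_deg(f_number):
    """Half the apical angle of an f/N light cone: atan(0.5 / N)."""
    return math.degrees(math.atan(0.5 / f_number))

print(cone_half_angle_deg(1.2))   # ~22.6 deg: needed to catch all of f/1.2
print(cone_half_angle_deg(1.7))   # ~16.4 deg: needed to catch all of f/1.7
print(cone_half_angle_deg(0.95))  # ~27.8 deg: the Noctilux / Nokton case
```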
Note that unlike these CMOS sensors, Panasonic's LiveMOS sensors, used on some Micro 4/3 cameras, and the CCD sensors used by Leica seem to have acceptance angles much larger than 17º, and might well reach into the 27º+ territory necessary to support e.g. lenses with a f/0.95 aperture, like the Leica Noctilux 50mm f/0.95 and the Voigtländer Nokton 17.5mm, 25mm and 42.5mm f/0.95.
Neither the Noctilux nor the Noktons have electrical contacts that communicate the aperture selected on the lens to the camera body on which they are mounted and thereby influence the camera's metering system.
As there have been no loud complaints on the photography forums that opening these lenses from f/1.4 to f/1 or f/0.95 fails to produce the expected exposure change on the relevant cameras (Leica and MFT), one surmises that LiveMOS and CCD sensors — unlike, apparently, the CMOS sensors tested by DxO — have pixels with acceptance angles wide enough to catch the very wide light cone emitted by f/0.95 lenses.
=================================================
@ thomas hobbes
> the astigmatism caused by refraction through cover glass is really
> only a problem for wide angle symmetric rangefinder lenses and it's
> just as much a problem for leica as it is for sony.
The NEX E-mount being fairly recent, Sony, unlike Leica, doesn't have to consider compatibility with legacy lenses designed in the film era when introducing E-mount cameras. Sony thus has fewer restrictions as to the thickness of the filter stack that it can put in front of the sensor.
To prevent aberrations, Sony's lens designers simply have to take into account the optical properties of their digital cameras' filter stack.
Sony could thus, in theory, design wide angle symmetric lenses with a short exit pupil distance if the optical stack's aberration was the only factor to take into account in their raytrace calculations.
Of course, as tilted rays also engender pixel acceptance angle issues and inter-pixel crosstalk and color contamination issues, one surmises that lens designers of the digital era will tend to prefer telecentric-ish lenses with fairly long exit pupil distances.
The just-announced Sony/Zeiss Sonnar FE 55mm f/1.8, for example, appears to be a retrofocus — a.k.a. Distagon — design, despite its normal-ish focal length and the short flangeback of Sony's E-mount, which would seem to have allowed a more conventional, non-retrofocus — e.g. double Gauss, Planar-like — design.
Retrofocus designs tend to have an entrance pupil that's smaller than the exit pupil; a larger exit pupil is generally indicative of an increased distance between the exit pupil and imaging plane, and therefore of reduced principal ray tilt. This may well have been a consideration for the Sonnar FE 55mm's designer(s).
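The link between a large exit pupil and a long exit pupil distance can be made explicit with a standard first-order relation: at infinity focus, the exit-pupil-to-image-plane distance is roughly the focal length times the pupil magnification (exit pupil diameter over entrance pupil diameter). The magnification value below is an invented illustration, not a specification of the FE 55mm:

```python
def exit_pupil_distance_mm(focal_mm, pupil_magnification):
    """At infinity focus, exit pupil distance ~ focal length times
    pupil magnification (first-order approximation)."""
    return focal_mm * pupil_magnification

print(exit_pupil_distance_mm(55, 1.0))  # symmetric design: ~55 mm
print(exit_pupil_distance_mm(55, 1.4))  # retrofocus-leaning: ~77 mm
```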
=================================================
@ Chris Malcolm
> What about the refractive index of film? The sensitive chemistry
> isn't laid on top of the film
135-format film has typically a thickness of about 1/8mm — i.e. about 125 microns.
The sensitive chemistry IS laid on top of the support base, and generally accounts for less than 10 microns of the film's thickness. Compared to the thickness of a sensor's cover glass — e.g. 800 microns — the light cone dispersion effects that depend on the principal ray's incidence angle within a 10-micron-thick emulsion layer would be vanishingly small, on the order of a fraction of a micron — i.e. smaller than the grain size of most films...
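The displacement through a plane-parallel slab is linear in its thickness, which is the crux of the comparison; the sketch below takes the same refractive index for emulsion and glass purely for simplicity:

```python
import math

def lateral_shift_mm(theta_deg, thickness_mm, n=1.5):
    """Displacement of a ray crossing a plane-parallel slab, relative
    to its unrefracted path: t * (tan(theta) - tan(theta'))."""
    t1 = math.radians(theta_deg)
    t2 = math.asin(math.sin(t1) / n)
    return thickness_mm * (math.tan(t1) - math.tan(t2))

# A ~10 um emulsion displaces a 35-degree ray 1/80th as far as a
# 0.8 mm cover glass does (the shift is proportional to thickness):
print(lateral_shift_mm(35, 0.8) / lateral_shift_mm(35, 0.010))  # 80.0
```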
=================================================
@ David Dyer-Bennet
> The refraction due to cover glass should be consistent, though,
> not random. Your argument for a 5-pixel dispersion seems to be
> based on random refraction.
The fact that a physical phenomenon depends on several variables doesn't necessarily make it "random" or unpredictable or unmodellable. The simple laws of geometric optics that are involved here, and their consequences, are definitely not something that is "random".
Ignoring the constringence (dispersion) effects, which are quite negligible given the limited thickness of a cover glass, the refractive index (RI) of an optical material can be considered constant for the purposes of this discussion.
The fact that the RI is constant doesn't mean, however, that the effect of the cover glass on all lenses will be constant.
As mentioned, the magnitude of the effect will depend, among others, on:
- the exit pupil's distance
- the exit pupil's diameter, that is, the f/stop the photographer has chosen
- the cover glass' thickness
- the distance from the center of the imaging plane; the cover plate induces no aberration at the center of the imaging plane, whilst the aberrations are going to be quite large in the image's corners.
The 5-pixel dispersion figure was a numerical example, based on realistic parameters, intended to illustrate the magnitude of the problem in a realistic scenario.
Obviously, the magnitude of the dispersion might be smaller — e.g. if the lens is stopped down beyond the f/4 value chosen for the example — or larger — e.g. if the exit pupil distance is shorter than the 30mm figure chosen in the example.
The calculation of the refraction of a tilted light cone through a plane-parallel optical slab is quite trivial, and can be reproduced by anyone who knows what the "sine" of an angle means, and understands Snell's law.
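As a sketch of that calculation, here is the marginal-ray spread for the example parameters cited (30mm exit pupil distance, f/4, 0.8mm glass at RI 1.5, full-frame corner at 21.6mm off-centre). The raw spread overstates the visible smear, since part of it is ordinary defocus that refocusing removes; subtracting the on-axis spread as a crude estimate of the refocusable part leaves a residual in the same ballpark as the 5-pixel figure. A rough model, not a raytrace:

```python
import math

def lateral_shift_mm(theta_rad, t_mm=0.8, n=1.5):
    """Ray displacement after a plane-parallel slab (Snell's law):
    t * (tan(theta) - tan(theta')), with theta' = asin(sin(theta)/n)."""
    tp = math.asin(math.sin(theta_rad) / n)
    return t_mm * (math.tan(theta_rad) - math.tan(tp))

def cone_spread_um(field_mm, pupil_dist_mm=30.0, f_number=4.0):
    """Spread between the two marginal rays of the light cone reaching
    an image point `field_mm` off-centre."""
    half_pupil = pupil_dist_mm / f_number / 2
    t_lo = math.atan((field_mm - half_pupil) / pupil_dist_mm)
    t_hi = math.atan((field_mm + half_pupil) / pupil_dist_mm)
    return 1000 * (lateral_shift_mm(t_hi) - lateral_shift_mm(t_lo))

corner = cone_spread_um(21.6)   # ~109 um raw spread at the FF corner
centre = cone_spread_um(0.0)    # ~67 um on-axis, refocusable defocus
print((corner - centre) / 6.8)  # residual of ~6 pixels at 6.8 um pitch
```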
Posted by: Bruno Masset | Thursday, 17 October 2013 at 05:28 PM
The fact that IR conversions of DSLRs — e.g. for astronomy or IR photography applications — are possible at a fairly low cost is a strong indication of this. It's highly unlikely that those shops that offer DSLR IR conversion services go to the trouble of removing a bonded cover glass from a sensor's package. Where would they perform such a task anyway ? How would they prevent the contamination of the sensor's surface ? Do such conversion shops have access, say, to a semiconductor factory-class clean room ?
Actually the low pass filter is indeed all on a piece of glass that sits over the sensor. The Bayer filter is printed on a substrate on top of the sensor chip itself. The microlenses are printed on top of that layer. The general design for most CCD/CMOS sensors with an IR-cut filter/AA filter is the same. For IR conversion they typically simply remove the cover glass from the sensor and replace it with one that has an IR-only filter, or they just use plain glass, which results in a full-spectrum camera. They use slightly different filters IIRC for astro. You can actually remove the cover glass and scrape off the microlens/Bayer filter layer and end up with a monochrome sensor. In light of this article, it makes me wonder if that was some of the motivation for the Leica Monochrom.
As far as clean rooms go, these shops typically tear down and fix DSLRs. Dust free environments are critical for this kind of work. I would imagine the sensor is blown with air extensively before the cover glass is replaced and that it is tested for dust entrapment. This is actually one way to fix DSLRs that have dust trapped below the low pass filter without replacing the sensor.
Posted by: Zos Xavius | Thursday, 17 October 2013 at 07:46 PM
Daaammn. Bruno's comments are the best comments I have ever read on TOP. Thanks for writing all that, Bruno! And thanks for publishing it, Mike. There is more knowledge and info in his comments than in most full blown articles on all photo sites/blogs.
Posted by: Ed | Thursday, 17 October 2013 at 08:54 PM
Mike - in addition to a "satire alert", you might need to include a "technical alert" on your posts. As a unit of measure, can I suggest "Cteins"? ;-)
A 1 Ctein article is mildly technical and something rated at 3 Cteins is bordering on technobabble.
Posted by: Sven W | Thursday, 17 October 2013 at 09:45 PM
"SLR lenses, as well as rangefinder lenses with focal length >= 50mm, are thus of no concern in my discussion."
Thank you, Mr. Masset. That clears that up! :-)
Posted by: David Brown | Friday, 18 October 2013 at 10:10 AM
I agree with most of the post, and the theory behind it is convincing.
However, I don't think that Leica does any in-camera deconvolution in the M9: I used my Summicron 35 ASPH extensively on the M9, taking pictures in RAW and comparing the results with and without the in-camera lens correction.
I am pretty sure that the only differences I found were in the vignetting and in the color cast on the borders. I did not notice any difference in sharpness (and the lens is *very* sharp).
I could assume that Leica have been using the 6-bit code for the deconvolution even when it was disabled, but I happen to have several older pictures taken with the same lens, without the 6-bit flange (it was added later), with no in-camera correction selected, and they are very sharp up to the borders.
I also used my Elmarit 90 without 6-bit coding several times on the M9 mistakenly set to the 35mm Summicron ASPH, and there were no apparent artifacts at the borders caused by an incorrectly applied deconvolution.
Finally, the M9 does not have a precise idea of the working aperture (it estimates it by comparing TTL light with an external sensor), and the deconvolution function to apply would need to depend on the precise aperture to be effective, because the amount of diffusion due to the glass in front of the sensor depends on the lens aperture.
I also used the same lens on the Fuji X-E1 with the Fuji adapter, and the results, even on an APS-C sensor, were terrible: the borders were smeared, and no information was preserved under 10 pixels. Diffusion decreased with aperture until it was invisible at f/8.
So, from my observations, the Leica M9 does not perform in-camera deconvolution, the 0.8mm glass in front of the sensor has no visible effect on the borders of the full-frame sensor, and the 2-2.5mm glass on the APS-C sensor of the Fuji X-E1 makes the image unusable.
I know that these observations don't fit together very well, but I don't have a good theory yet.
Posted by: Roberto Giaccio | Saturday, 19 October 2013 at 05:28 PM
"just one interface. As an adapter adds two more to the system,"
Uh, I think it adds just one, doesn't it?
Doug C
Posted by: Doug C | Sunday, 20 October 2013 at 08:35 PM
@ Roberto Giaccio
> from my observation, the Leica M9 does not perform in-camera
> deconvolution
My understanding of Leica's 6-bit coding system is that it only affects the color shading and vignetting correction, and has no effect on image definition / smearing.
There's also little reason to assume from the outset that Leica's method of using a non-TTL sensor to estimate the aperture value set on the lens isn't able to deliver average exit pupil size estimates that are "good enough" for a (still purely hypothetical) deconvolution performed on what is, after all, a consumer camera, not a precision scientific instrument like e.g. an astronomical telescope.
Physics — in particular, Snell's law — dictates that a 0.8mm cover glass must have a significant smearing effect; the magnitude of this effect in realistic scenarios is quite trivial to calculate.
If such a smearing effect can't be observed, it makes more sense to conjecture that a corrective means — e.g. a deconvolution algorithm — might be present in the system under test, rather than to assume e.g. that Leica has developed some magical, refraction-less cover glass optical material, to which fundamental laws of optics therefore don't apply.
A deconvolution process will increase the pixel's autocorrelation with its neighbors. In other words, if a deconvolution process is present, a pixel's signal is necessarily going to influence the signal level of adjacent pixels, "smearing" into its neighbors.
Such a smear would obviously be difficult to distinguish from the smear — a.k.a. point spread function — caused by the cover glass.
How could one thus identify in the image a smear that's more likely to be caused by the autocorrelation widening that accompanies a deconvolution process, rather than by the cover glass?
One way would be to generate a reference, very fine image detail — preferably one pixel wide — on the sensor, without any kind of aberration / smearing caused by the lens and the cover glass. The sensor signal should then be injected into the camera's pixel processing pipeline, to see if that very sharp, reference feature comes out smeared in the raw DNG file, or not.
As physics dictates that the cover glass must generate smearing, it's obviously a bit difficult to project a reference optical pattern through that cover glass — e.g. by taking a picture of a test chart with a very sharp lens — and expect it to arrive non-smeared on the sensor.
Fortunately, "generating" a smear detection test pattern on a sensor doesn't necessarily require optical projection: one could exploit some intrinsic physical characteristics of a semiconductor-based image sensor:
1) The first characteristic that comes to mind is a pixel's intrinsic noise. A pixel with a strong noise signal, for instance, is going to be smeared by the deconvolution process into its neighbors. One can thus expect the autocorrelation measurements of pixel values in raw DNG files to widen as one moves from the image center towards the corners, where the smear / PSF to be corrected, and hence the deconvolution radius, increases.
2) Another sensor characteristic doesn't necessarily require autocorrelation calculations on a large number of pixels, and can thus be more easily checked visually: a sensor's bad pixels.
Most image sensors contain a number of "bad" pixels. These defective pixels generally occur in isolation — i.e. "contiguous clusters" of bad pixels are rare, especially as the presence of such a cluster would, in general, be grounds for the sensor manufacturer to discard that sensor as defective and unfit for sale to its clients.
If, for example, a bad pixel is stuck at maximum brightness, a deconvolution process would tend to smear that high brightness into adjacent pixels.
One would thus expect a hypothetical deconvolution to produce, in Leica's raw DNG files, a fuzzy contiguous blob around each bad pixel found in the sensor's periphery, whilst bad pixels at or near the image center — where little or no deconvolution is needed — would remain sharp and isolated.
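[Ed.: a toy version of that blob test, with a hypothetical separable sharpening kernel standing in for the deconvolution — a single stuck pixel in an otherwise dark frame turns into a 3×3 blob:]

```python
import numpy as np

# A dark frame with one hot (stuck-bright) pixel.
img = np.zeros((9, 9))
img[4, 4] = 1.0

# Hypothetical separable "deconvolution" kernel, applied row- then column-wise.
k = np.array([-0.2, 1.4, -0.2])
tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

# Before: exactly one nonzero pixel.  After: a 3x3 blob around it.
print(np.count_nonzero(img), np.count_nonzero(np.abs(out) > 1e-6))  # → 1 9
```

[In a real raw file one would compare bad pixels near the corners, where any deconvolution radius would be largest, against those near the center.]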
> I also used the same lens on the Fuji X-E1 with the Fuji adapter,
> and the results, even on an APS-C sensor, were terrible
Snell's law dictates that a slab of optical material — regardless of whether it's just optical glass, or a birefringent OLPF stack — will induce a spatial shift of light rays that will depend on the rays' incidence angle.
Lens designers must obviously take the effects of such a slab into account in their ray tracing, so that the tilted light cones emitted by a non-telecentric lens remain stigmatic.
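[Ed.: the size of that angle-dependent shift follows directly from Snell's law. A sketch assuming a 0.8 mm slab — the M9 cover-glass thickness quoted above, used purely as an illustrative value — and an assumed refractive index of 1.5:]

```python
import numpy as np

def lateral_shift(theta_deg, t_mm=0.8, n=1.5):
    """Sideways displacement of a ray crossing a plane-parallel glass slab.

    t_mm and n are illustrative values (0.8 mm slab, index 1.5),
    not measured figures for any particular camera.
    """
    theta = np.radians(theta_deg)
    theta_r = np.arcsin(np.sin(theta) / n)            # Snell's law
    return t_mm * np.sin(theta - theta_r) / np.cos(theta_r)

for angle in (0, 10, 25, 40):
    print(f"{angle:2d} deg  shift = {lateral_shift(angle):.3f} mm")
```

[The shift is zero at normal incidence and grows steeply with ray angle — which is why the steeply tilted corner rays of a short-exit-pupil lens suffer most.]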
The Fujifilm X-A1 has a 16MP APS-C sensor with a color filter array that uses the conventional, moiré-prone Bayer layout. The X-A1 thus presumably has — just like the 16MP APS-C cameras made e.g. by Sony and Nikon — a birefringent OLPF to prevent the appearance of moiré.
Fujifilm's X mount lenses must be compatible with all X mount cameras, regardless of whether they have a Bayer filtered sensor and an OLPF, or an X-trans sensor and no OLPF.
The implication is that Fujifilm's OLPF-less cameras — e.g. your X-E1 — must have, in front of the sensor, a fairly thick layer of optical material whose optical path length matches that of the X-A1's birefringent OLPF.
A similar reasoning applies to Sony's OLPF-less 36MP A7R, which must maintain compatibility with all the E mount lenses designed for Sony's OLPF-equipped cameras. The A7R thus presumably has a fairly thick slab of optical material in front of its sensor.
There's thus, IMHO, little reason to expect that film-era rangefinder lenses with short exit-pupil distances will smear any less on the A7R than on the OLPF-equipped Sony A7.
Posted by: Bruno Masset | Sunday, 20 October 2013 at 09:03 PM