
Monday, 18 May 2009

Comments


The pictures you posted compare the $826 Zeiss to a $300 Canon lens. Readers should know that Canon also offers the more expensive ($1,235) and presumably higher quality EF 35mm f1.4L USM autofocus lens.
Prices are from the B&H website.

Speed,
The illustration for this post is just one of dozens from Sean's review, which include multiple comparisons. No need to be upset.

Mike

Pete Myers has also written about this lens over at Red Dog Journal, as part of his series about transitioning to digital.

His first comments here:
http://www.reddogjournal.com/PM-2.php

Here's the second column, a little less thrilled with digital and/or this Zeiss 35:
http://www.reddogjournal.com/PM-4.php

Just an interesting tidbit to consider.

I just read the Pete Myers post, and it is the first time I have heard the assertion that "The CA needs to be removed from the image before demosaicing."

Is anyone else claiming this? I can think of some good reasons to argue the opposite, mainly that it would be easier to remove the sensor's artifacts before image manipulation. This sounds so wrong that there must be some reasoning behind it that I am unaware of.

On the other hand, I like the term "angry bokeh"; it reminds me of an electric guitarist who said that a particular combination of pickups and tubes sounded "petulant."

I've been using this lens in ZS (M42) mount for a few months now so that I can use it on both my Canon EOS and FD bodies, and I've been very pleased with it. Color rendition is excellent, and it's sharp where it needs to be sharp and soft where it needs to be soft. As someone with no interest in autofocus, I appreciate the sturdy build and precise feel of the focusing helical, which I don't get from the Canon EOS lenses.

"Is anyone else claiming this? I can think of some good reasons to argue the opposite, mainly that it would be easier to remove the sensor's artifacts before image manipulation. This sounds so wrong that there must be some reasoning behind it that I am unaware of."

It seems to me that you can't get much earlier in the chain than "before demosaicing."
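
To make that concrete, here's a minimal sketch of what a pre-demosaic lateral-CA correction could look like, assuming an RGGB mosaic and a simple magnification-mismatch model. The function names and the scale factors are invented for illustration; they're not taken from any actual raw converter.

import numpy as np
from scipy.ndimage import map_coordinates

def rescale_about_center(plane, scale):
    # Resample a 2-D plane with a small magnification change about the
    # image center, the usual simple model of lateral CA.
    h, w = plane.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    # To magnify the output by `scale`, sample the input closer to the center.
    src = [cy + (yy - cy) / scale, cx + (xx - cx) / scale]
    return map_coordinates(plane, src, order=1, mode='nearest')

def correct_ca_before_demosaic(raw, r_scale=1.0005, b_scale=0.9995):
    # Rescale only the red and blue photosites of an RGGB mosaic, leaving
    # the green samples untouched, before any demosaicing happens.
    out = raw.astype(np.float32).copy()
    out[0::2, 0::2] = rescale_about_center(out[0::2, 0::2], r_scale)  # R sites
    out[1::2, 1::2] = rescale_about_center(out[1::2, 1::2], b_scale)  # B sites
    return out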

Talking about MF lenses: if you've never tried the Pentax 35mm K f/2 from '76-'77 (8 elements in 7 groups), I hope you will find one and test it just for fun.

It was made during a short collaboration between Pentax and Carl Zeiss... It has a long barrel too (as does the K 28mm f/2).
Pure jewels. I make 40% of my images with this 35mm.

Congrats on your website... the first one I check every day.

Would it be possible to identify CA before demosaicing?

Dear Hugh,

I found the comment curious, although I couldn't say that it was wrong. The problem is that he isn't making technically precise statements, so I'm not sure what's going on. "Chromatic aberration" is a term that's regularly misused today to describe any kind of color fringing that appears in a photograph. There are many sources of color fringing that aren't due to chromatic aberration; without knowing the source of this problem, it's hard to say whether he's right or not.

In addition, he mentions using some experimental RAW converter. Who knows what that was doing? I can say that doing any significant amount of sharpening BEFORE correcting color fringing is asking for trouble. It's similar to the problem you can get oversharpening an image for output: you get "sparklies" along edges that simply should have been sharpened, because modest excursions from the norm get highly exaggerated there.

There also seems to be something wonky with his workflow; I can't conceive of any way that the "Backseat Betty Revisited" photo should have required over 60 hours of work, even if he were going in and cleaning the picture up pixel by pixel (which I routinely have to do on film scans that are many times bigger, and it's incredibly tedious but doesn't take anything like 60 hours).

I can say that using ACR to correct the color fringing in my Fuji S100fs photos, which exhibit extremely severe "chromatic aberration," works very well. And it works better the less sharpening I do during the RAW conversion stage.

Whatever's happening to him, I have the feeling it doesn't have much relevance for the rest of us. I'm not denying his experience; I am opining that not many people will recapitulate it.


~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 
=====================================

Regarding a monochrome sensor vs. a Bayer array:

My understanding is that the sensor pixels are all the same, but the Bayer pattern is produced by a filter. Can this filter be removed? I know that IR filters can be removed, but they're purely optical. The Bayer filter is in a 1-to-1 correspondence with the pixels, so presumably it's on or very close to the silicon (or whatever sensors are made of). Still, I'm wondering if, in a proper clean room with all the right skills and equipment, it could be removed. If so, it might be cheaper to convert an existing Bayer-array sensor to monochrome than it would be to produce one from scratch, because starting from the high-volume Bayer-pattern sensor would provide economies of scale.

Then all we need is for some manufacturer to decide that there's a market for monochrome cameras (or backs).

It appears, though, that Myers is hoping that dividing the Bayer pixel count by 4 will get him to where he wants to be. I'm sure he wants more than 6 MP, which will take a few more years. Perhaps by then a camera with an 80 MP sensor will be possible, and cheaper than a Bayer sensor converted to monochrome using the method I'm imagining (see above).
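
To put rough numbers on that divide-by-four idea, here's a toy sketch (my own simplification, not anything Myers describes) that bins each 2x2 Bayer cell into one monochrome sample:

import numpy as np

def bayer_to_quarter_res_mono(raw):
    # Collapse each 2x2 RGGB cell into one luminance sample: the crude
    # "divide the pixel count by 4" reading, so a 24 MP mosaic comes out
    # as a 6 MP monochrome image. Weighting R, G, G and B equally is a
    # deliberate simplification, not a proper luminance formula.
    raw = raw.astype(np.float32)
    return (raw[0::2, 0::2] + raw[0::2, 1::2] +
            raw[1::2, 0::2] + raw[1::2, 1::2]) / 4.0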

But we may never see 80MP in a 35mm format, simply because there won't be enough demand.

--Marc

I suspect that if you had enough information about the lens you could do a lot of neat things.

Mmmmm, so you're working with a Pentax 15mm DA Limited? You wouldn't happen to be using it on a K-7, would you?

Don't answer that for another 1 day 8 hours....

;-)

One of the sayings I learned as a young software engineer was, "A poor craftsman blames his tools." Not that Myers is necessarily a poor craftsman, but that he hasn't mastered his new tools.

His image has a pretty wide dynamic range. Perhaps he would have benefited from taking two images, one open two stops more, and using something like Zero Noise to bring back some detail in the shadow areas. Or, some strategically-placed strobes, or reflectors, etc. (I prefer to manipulate the camera's exposure rather than bring extra gear.) I also have to wonder if his raw processing is causing him trouble downstream. Seeing the image straight from the camera with nothing besides default processing would be helpful.
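
Just to sketch the kind of two-frame shadow blend I mean (this is not how Zero Noise itself works internally; the weighting and the numbers below are placeholders of my own):

import numpy as np

def blend_shadows(base, plus2, threshold=0.25, softness=0.1):
    # Take shadow detail from the frame exposed two stops brighter and
    # highlights from the base frame, feathering between the two.
    # Both inputs are assumed linear, scaled 0-1, and already registered;
    # the threshold and softness values are arbitrary illustration numbers.
    plus2_matched = plus2 / 4.0  # +2 stops means 4x the light
    luma = base.mean(axis=-1, keepdims=True) if base.ndim == 3 else base
    weight = np.clip((threshold - luma) / softness, 0.0, 1.0)
    return weight * plus2_matched + (1.0 - weight) * base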

Digital cameras aren't just analogs (pun intended) of the film cameras they replace. They require some understanding of the underlying technology to get the most out of them... just as an understanding of film and how it reacts to exposure is necessary to get the best from film photography. Unless you're happy with mediocrity, you can't just use the same exposure techniques and rules of thumb that worked so well with chromogenic color film.

Yo, Mike, can you be even more head-over-heels with this lens than you are with that Pentax DA 35 mm macro? Really? Isn't that a form of infidelity?

Hi Mike, it'd be great if you gave a little writeup on the DA15 Limited lens and your experience with it :)

Dear Mike:

Your worst nightmare has finally come true:

http://acquine.alipr.com/

And yes, it gave henri a 1-star rating.

Ctein, I have also seen Mr. Myers mention his exceedingly long and laborious workflow before (on Michael Reichmann's site, to be specific, see link below) and it caused me to wonder - I might spend a fair bit of time on an image, but anything exceeding two hours is from me tinkering or playing with various adjustments, not working steadily through a pre-defined workflow. As such, I tend to find his images have an "over-cooked" appearance, which is not often seen in black & white.

http://www.luminous-landscape.com/essays/making-images.shtml


Re the post by Pete Myers re CA.
I do not have a clue as to when CA should be removed, but I would like to comment on one aspect of it.
It is my understanding that with CA (specifically, lateral CA) the problem is not only the apparent color error. Because the light wavelengths are not focusing at the same point, the area of the image where the error appears is also not as "sharp." I do not know if software correction deals with that.

"If so, it might be cheaper to convert an existing Bayer-array sensor to monochrome than it would be to produce one from scratch, because starting from the high-volume Bayer-pattern sensor would provide economies of scale."

Marc - First, as far as I know, the filters are in fact manufactured right on top of the sensor. They are essentially part of the sensor chip, not an addition to it the way you might be thinking.

Second, nothing involving a clean room is ever cheap. And in this case, I think the setup to de-layer the filters off the sensor (if that's even possible) would be complicated, likely to not work very well, and probably very expensive to set up in the first place. It is possible to peel layers off of semiconductors, but it's usually only done for debug, because you're likely to introduce lots of issues.

It would probably be better/cheaper to just change a current fab to leave out the Bayer filtering.

The big inhibitor to more demand for the Zeiss SLR lenses is the lack of autofocus. Certainly there are many applications where manual focus is preferred, but the majority of the market would prefer having this feature available, at least. Does anyone know if there are plans on the horizon for a Zeiss series with AF?

Would it be silly to ask why people want autofocus? Am I the only person who is concerned with hyperfocal distance and not centering the main subject? My only conclusion is that either there are a lot of sports photographers out there, or people are just using their (very) expensive DSLRs to take pictures of their kids--badly.

Dear Jay,

You have the chromatic aberrations mixed up. Lateral chromatic aberration occurs when the lens produces images of different magnification for different wavelengths. All the channel images are focused in the same plane, so they are all sharp, but there is misregistration between them that results in fringing that gets worse the further off-axis you get. Software is very effective at realigning the images and producing a sharp result.
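
As a back-of-the-envelope illustration (the magnifications below are invented for the example, not measured from any real lens):

def lateral_ca_fringe_mm(image_height_mm, m_red=1.001, m_blue=0.999):
    # Lateral CA modeled as a magnification mismatch: each channel puts the
    # same object point at a slightly different image height, so the
    # red-blue misregistration (the visible fringe) grows linearly off-axis.
    return abs(m_red - m_blue) * image_height_mm

# With these made-up numbers the fringe is zero on-axis and 0.04 mm
# (several pixels on a typical sensor) at 20 mm off-axis, which is why
# software that simply rescales the channels can realign them so well.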

Longitudinal chromatic aberration is probably what you were thinking of. That occurs when the lens focuses different wavelengths at different distances, so that there isn't a common plane of focus. If this is severe, color halos can appear because one color will be focused to a sharp point while another will be a fuzzy blur. This is not corrected by defringing software that is supposed to correct for "CA."

Those are not the only sources of "CA," though. For example, coma in one channel and not another can produce color fringing that is somewhat but not entirely correctable with the fringing software (see my Fuji S100fs review).

All three of these aberrations are lens-based and appear in film images just as they do in digital cameras. There are other sources of color fringing that are specific to digital cameras; e.g., the infamous "purple halo" problem, et cetera.

As I said previously, many photographers (and test sites, sad to say) don't make the distinction and lump all of these together as "chromatic aberration," even though that is not the correct term (only the first two are chromatic aberration).

It's extremely unlikely that the lens in question exhibits serious chromatic aberration. It could conceivably have some coma (not uncommon in wide-angle lenses); there could be other digital-camera-related sources of color artifacts involved. We don't know. That makes speculating on what is happening with Pete's images and why they're giving him such grief pointless.


~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 
=====================================

Mike, I'd also like to see your initial thoughts on the 15mm Ltd. How 'bout it? =)
If the 35mm Zeiss were a 24 or 25, I'd be more interested.

Ctein: I understand the distinction you're making here:

"All three of these aberrations are lens-based and appear in film images just as they do in digital cameras. There are other sources of color fringing that are specific to digital cameras; e.g., the infamous "purple halo" problem, et cetera."

I'm wondering if sensors can exaggerate lens-based CA. Are digital sensors likely to make (real, lens-based) CA a worse problem than it would be on film, particularly in the corners of the frame? This seems plausible because sensors don't record light that is striking them at an oblique angle as well as film does. In the corners, this could cause them to exaggerate the problems with magnification, focus, or coma that you described. Do you know if this occurs?

"Yo, Mike, can you be even more head-over-heels with this lens than you are with that Pentax DA 35 mm macro? Really? Isn't that a form of infidelity?"

G,
Different focal lengths. Well, not literally, but effectively. I've only used the Zeiss on full-frame cameras, and Pentax only has reduced-frame cameras, where the 35mm is 52mm-e.

Mike

"Does anyone know if there are plans on the horizon for a Zeiss series with AF?"

@Tom

They're already being made, they're called CZ lenses for Sony (nee Minolta) a-Mount.

Dear Andy,

It's well established that off-axis light falloff becomes more serious with micro-lensed sensor arrays than with film. Other than that, I haven't seen any evidence one way or another on whether digital sensors make lens aberrations worse or not.

I have seen much opinion expressed about this. None of the opinions I've read have any facts associated with them, such as controlled comparison experiments or even a convincing physical analysis. So, I have no opinion on the matter. It is neither plausible nor implausible; it's just a blank space in my knowledge.


~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 
======================================


Since I don't subscribe to RR, I'm wondering: did Mr Reid apply his famous "fruit from the grocery store" test with this lens?


This is the test where he carefully evaluates lens color rendition and artifacts like IR sensitivity using a bowl of fruit, and then happily feeds his kids the fruit, thus killing two birds with one stone.

Regarding the option of autofocus: I think a lens designed to autofocus well tends not to focus manually as easily as a lens designed for manual focus, so I'm pleased that these lenses are manual focus only. And I probably wouldn't want to pay for a feature I don't plan to use, even if they could design a lens that offers both fast, precise autofocus and fast, precise manual focus.

ZS lenses will take Chinese-made aftermarket M42 adapters that are chipped so that they can be used with focus confirmation on autofocus cameras that offer that feature, though. I've tested one out on my Canon 40D, and it's quite accurate. I suspect the ZE mount lenses for EOS will also support focus confirmation.

