
Friday, 20 April 2012



"...(It indicates that the Mark III's improvements are mostly in the video realm, which aren't reflected in the DxO scores.)"

I don't own the 5Diii, but a friend who does is most happy with its significantly improved AF capabilities compared to the 5Dii. These improvements, along with other results, are well demonstrated in these two objective reviews on LL by real camera users...


Also, while DxO tests seem perfectly valid for what they measure regarding sensor performance, it's worth noting things that aren't measured with regard to overall camera performance (that may translate to image IQ), e.g., resolution, internal image processing, metering, white balance, etc.

I like to read and understand the tests. But, in the end, different cameras of course serve different needs, and for me the final print is where I draw conclusions about image quality.

Impressive tests. But why go to all the trouble of gathering all those accurate data and then allocate a crude number score? One camera is 2 points better than another, and a third is 14 points better again - what exactly does that mean?

Thanks for the link, Mike.

The megapixel myth was flawed from the beginning, because it didn't take output size into account.

If I have a 12mp sensor and a 24mp sensor of the same size and sensor technology, of course the 12mp sensor printed to 9x14 will be less noisy than the 24mp sensor printed to 13x20 (their approximate print sizes at 300ppi), but what good does that information do when choosing between cameras?

The more useful comparison is to output each sensor to the same size, whether it be an 8x10 print, a 16x20 print, a web jpeg, etc. When doing this, one will find that the higher-megapixel camera is as good as or better than the lower-megapixel camera at any comparable output size, and that makes the choice easy.

The real disadvantages of more megapixels are slower throughput (fps), higher cost, and more computer storage required...not image quality.
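The 300ppi arithmetic above can be sketched in a few lines (a toy calculation of my own, assuming a 3:2 sensor and treating "12mp" as exactly 12,000,000 pixels):

```python
import math

def print_size_inches(megapixels, aspect=3 / 2, ppi=300):
    """Long and short print dimensions (inches) for a sensor of the
    given megapixel count, printed at the given pixels per inch."""
    pixels = megapixels * 1_000_000
    short_px = math.sqrt(pixels / aspect)   # short edge in pixels
    long_px = short_px * aspect             # long edge in pixels
    return round(long_px / ppi, 1), round(short_px / ppi, 1)

print(print_size_inches(12))  # (14.1, 9.4)  -> roughly the 9x14 above
print(print_size_inches(24))  # (20.0, 13.3) -> roughly the 13x20 above
```

Same-size output just means choosing the ppi per sensor instead of holding it fixed, which is why the comparison flips.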

I still want a 5D Mark III more than a 36mp camera. I like the Canon look, it works for me.

Mike, thanks for pointing out this report.
I'm a long-time DxO Optics Pro and FilmPack user, but several insights into the workings of DxO were news even to me.
The DxO detailed findings always deserve attentive reading. The only thing I never quite figured out is their summary scoring: it looks a bit contrived. But their relative assessment of the D800/E seems in broad agreement with the preliminary findings of Michael Reichmann and Lloyd Chambers, and very much so with Thom Hogan's predictions.

I certainly prefer to believe DxO's ratings over what's presented as "conventional wisdom" - if the results show that some aspects of image quality improve despite the increased number of megapixels, that's fine by me.

It's a shame that I had a bad experience with DxO Optics Pro: I found bugs in licensing and file-size/memory restrictions and their support team was useless.

It's a good article; the only point I would add to it is that when you have a proprietary problem (vendor-specific RAW formats), you don't need proprietary software, whether it be ACR or DxO presented as an "alternative" - you can get great RAW conversions with open-source utilities, including taking lens profiles into account as well.

Umm ... what does "to buck" mean?

One point makes zero difference, but it's impressive that my K-5 competes so well with these megapixel champions. But if you want the latest, and video is your thing, I don't doubt the MkIII will do excellent work.

"Umm ... what does 'to buck' mean?"

To oppose directly and stubbornly; to go against. (AHED)


Dear John and Chris,

The aggregate score is "DxOMark for Dummies."

It's there because there were people who complained that the detailed technical analyses were too difficult for them to understand. So they came up with summary scores which condensed each category of data into a single metric, and the aggregate score is a weighted combination of those scores.

It's meaningless for folks who can understand the more detailed data.

One way to think of it is, "Here's how the votes of 10,000 typical photographers, each with their own individual needs, would rate this camera's sensor." It's pretty well irrelevant to any specific photographer (like thee and me) unless we just happen to embody the average characteristics.

BTW, you can find out exactly what the numbers mean if you're willing to drill down through their technical papers. They explain how they're derived and what a difference of X points actually means.

But, do you really care? I don't.

pax / Ctein

DxO. A very interesting article. I'm just not sure how Edward Steichen, Alfred Stieglitz, Edward Weston, Minor White, ad infinitum (almost) managed to produce the works they did without DxO? :-)

I always imagined that "to buck" was somehow derived from a horse bucking, as in a "bucking bronco."

That might be a bit new-world centric, or at least Places-where-saddles-have-pommels centric.

I wonder what the DxO scores for the cameras that shot the evenly illuminated, richly detailed vintage images at shorpy.com would be? (yes, I know it couldn't be done - at least not "right")

Makes *me* wonder how valuable those scores are in a majority of imaging scenarios. I'm thinking, "not really that much".


Well, we have finally reached that day we all predicted would come eventually. The day when someone actually wrote an article comparing two cameras, and all they used was data from DxO.

Looking at that data, one can only conclude that the actual hands-on comparisons of the two cameras in the real world that have been done around the web - with real photographers taking real photographs and then looking at the results on screen, and in some cases in print - must be incorrect, because they don't verify the DxO results. Obviously, those real photographers don't understand what they are doing or what they are seeing.

Unlike, of course, everyone who looks at the graphic results of DxO's proprietary algorithms and methods with which they measure.... what is it they actually measure specifically again?

And what the real-world results show is that, contrary to expectations based on pixel count, DxO rankings, or astrological signs, the IQ of the two cameras is very, very close. And that, in the opinions of many reviewers, it is other factors of these cameras' operation that might be more important distinctions for photographers.

Obviously, Canon needs to improve the performance of their sensors. If only to score higher on the DxO tests.

I think you're guilty of extreme misreadings in about five different ways. The article wasn't comparing two cameras; DxOMark doesn't review cameras at all--only sensors; DxOMark doesn't even try to measure practical image quality--it seeks to quantify the theoretical imaging potential of a given sensor; and they certainly weren't surveying what reviewers think or the results of users. Finally, sensor quality ≠ image quality ≠ picture quality. If I were you I'd heave that giant chip off your shoulder before it breaks your back!


Dear Ed and Roger,

One more time...

the DxOMark tests are the modern equivalent of the technical film tests of yore. Nothing more or less than that. The sort of tests that produced those technical data sheets from Kodak, Fuji, Agfa and Ilford beloved by some photographers... and ignored by many.

You never have to look at a Kodak data sheet to make great photos. Many great photos were made on films far inferior to current films.

That does not make such data sheets or film improvements pointless.

If such technical tests are of no value to you, then ignore them. But, please don't try to argue they are of no value, period. That's an argument you can't possibly win.

pax / Ctein

I find it a bit ironic that Guichard argues that digital's dynamic range is really better than film's because you need some minimal signal-to-noise ratio (which he picks as 20dB) for the dynamic range to be useful. So what threshold does DxO actually use for their "landscape" (dynamic range) score? An un-useful engineering-style 0dB signal-to-noise ratio!
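To make the two thresholds concrete, here's a toy noise model (my own illustration, not DxO's procedure; the 60,000 e- full well and 3 e- read noise are invented numbers) showing how much the chosen SNR floor changes a dynamic-range figure:

```python
import math

def min_signal(snr_db, read_noise):
    """Smallest signal (electrons) that still reaches the requested SNR,
    assuming total noise = sqrt(signal + read_noise**2)."""
    t = 10 ** (snr_db / 20)  # dB -> linear SNR
    # Solve S / sqrt(S + r^2) = t for S (quadratic in S, positive root).
    return (t**2 + t * math.sqrt(t**2 + 4 * read_noise**2)) / 2

def dynamic_range_stops(full_well, read_noise, snr_db=0.0):
    """Dynamic range in stops from the SNR floor up to full well."""
    return math.log2(full_well / min_signal(snr_db, read_noise))

print(dynamic_range_stops(60000, 3, snr_db=0))   # ~14.0 stops ("engineering" DR)
print(dynamic_range_stops(60000, 3, snr_db=20))  # ~9.1 stops (20dB "useful" floor)
```

The point of the sketch is just that the same hypothetical sensor loses several stops of rated dynamic range when you demand a 20dB floor instead of 0dB, which is why the choice of threshold matters so much.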

Funny how things go in cycles. First megapixels ruled, and to speak of pixel quality was heresy. Today the wisdom is reversed and megapixels are out of fashion... for now. I wonder if the same will happen with depth of field. Razor-thin DoF is now de rigueur, brought on by large sensors in photography and now following through to professional video... will large DoF buck that trend any time soon?

Dear Mike and Ctein

Do you remember when the DxO testing started?

Everyone was on board: while the results were interesting, definitely useful, and somewhat opaque (since some proprietary, unexplained analyses were involved), we all understood that DxO was testing the isolated sensor, and that this was *not* necessarily correlated with overall camera performance.

Indeed, there were questions about exactly what was being measured (pixel level? sensor level? etc.), and there are still questions:

If I recall correctly, the new 5DIII sensor scores *lower* than the Canon s95 sensor on what appears at first blush to be a very important measurement - I think it was one of the dynamic range measurements.

Think about that, and what it tells us about exactly how much we understand about what DxO is measuring. Because the measurement was probably correct(!) How deep a discussion of pixel-level information would it take to figure out why that measurement could be correct *and* at the same time not tell us much about the actual dynamic-range performance of the 5DIII sensor? More importantly, what does that tell us about what the average digital photographer is going to come away with?

One thing that we all agreed upon at the beginning of the DxO measurements was that they needed to be put into perspective. And that is EXACTLY what this article fails to do. My problem is with the article, NOT with the DxO testing. So, Ctein, when you say:

"If such technical tests are of no value to you, then ignore them. But, please don't try to argue they are of no value, period. That's an argument you can't possibly win."

I agree - but that's not the argument I am trying to make.

The article is only about comparing two *cameras* and only about their DxO measurements. Yes - cameras. (Yes - they point out they are talking about the sensors, but this is not used consistently and this is not the impression a less than *very* observant reader would come away with.)

Cameras, not sensors, are what they have photographs of, and that is the vernacular the article uses ("The D800" vs. "The 5DIII"). That is the only impression that the average CNET reader - this is for CNET, remember, not some specialized digital-photography technical forum - is going to walk away with.

This article is EXACTLY what we all said we hoped we would never see when we first were introduced to the DxO ratings. And here it is. And you guys are defending it, as if it is doing anyone a favor.

Mike - you said it yourself:

"...DxOMark doesn't review cameras at all--only sensors; DxOMark doesn't even try to measure practical image quality...they certainly weren't surveying what reviewers think or the results of users. Finally, sensor quality ≠ image quality ≠ picture quality."

All true! And all exactly the problem! Since this is *not* a true comparison of image quality, one has to ask - why publish the article? There is no way that 99% of the readers of CNET are going to look at that article, and come away with an impression other than the D800 is a much better imager than the 5DIII, indeed, that the D800 is a much better camera than the 5DIII.

And that is a disservice to CNET readers.

"The article is only about comparing two *cameras* and only about their DxO measurements."

What article are you talking about? The main article linked in this post, Stephen Shankland's "How DxO Labs tests hot cameras like Canon's latest SLR" on CNET, is NOT "only about comparing two cameras." It hardly compares the Canon 5D Mark III to the Nikon D800 at all. The article recounts a visit to DxO Labs and talks about its origins, its personnel, and its testing methods. Or are you somehow talking about some other article, or two other cameras??


Dear Roger,

There's an unavoidable linguistic confusion. It is both impossible and useless to test the bare sensor. What you (the photographer) really get to work with is the sensor embedded in a camera whose electronic circuitry massages the sensor output in all sorts of ways before it hands you RAW data. Really, any “sensor” test is a test of the camera containing that sensor. And that will lead to some confusion.

The thing to remember is to read in context. When DxOMark tests a “camera”, they are only running the equivalent of a film test. They are not testing any of the other myriad camera aspects that enter into our choice of what camera to buy and why. Really, you just have to take that for granted, unless you want them to insert a massive expository lump every time they dare to use the word “camera.”

Honestly, if the reader can't figure that out from all that's been written, I think that's their problem. You can only hold people's hands so much, and at some point they have to take responsibility for engaging their brain and figuring out what is actually meant instead of trying to interpret words purely literally and out of context.

I'm not “defending” the CNET article. I don't care enough about it to defend it. I'm only saying that I think you are making a mountain out of a molehill here. If some reader is so unknowledgeable and shallow in their thinking that they come away from an article like that convinced that they should buy a Nikon over a Canon, do you really think they'll be getting a bad camera or, more importantly, that they will wind up unhappy with their choice? I sincerely doubt it.

pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 

Dear David,

You've brought up something important here, which is that there is no one right way to characterize a measurement. Thanks! This is a subject worth talking about.

To give some pertinent parallels in silver halide measurements, consider a printing paper's contrast grade. This is derived from the “exposure range” of the paper, but that range isn't the total luminance range the paper can record, it's the luminance range that produces tones between 10% darker than minimum white and 90% of maximum black. There are some good practical reasons for doing that; the toe and shoulder shapes of the characteristic curves can vary hugely between papers while the midrange contrast stays largely the same. Also, tonal discrimination gets pretty lousy at the extrema. This is much like the threshold that Guichard is talking about.

Similarly, film ISO was (originally) based upon the exposure needed to produce a density of 0.2 above base plus fog. Again, a somewhat arbitrary threshold based upon the point at which you start to see good tonal discrimination.
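Both thresholds above are instances of the same recipe: pick two points on the characteristic curve and measure the log-exposure span between them. A toy sketch of that recipe (my own illustration, not the ISO/ANSI procedure; the sample curve is made up):

```python
def interp_x(xs, ys, y_target):
    """x where the piecewise-linear curve through (xs, ys) first reaches y_target."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if y0 <= y_target <= y1:
            return x0 + (x1 - x0) * (y_target - y0) / (y1 - y0)
    raise ValueError("target density outside curve")

def exposure_range(log_e, density, lo=0.1, hi=0.9):
    """Log-exposure span between lo and hi fractions of the total density range."""
    d_min, d_max = density[0], density[-1]
    lo_d = d_min + lo * (d_max - d_min)
    hi_d = d_min + hi * (d_max - d_min)
    return interp_x(log_e, density, hi_d) - interp_x(log_e, density, lo_d)

# A made-up straight-line "curve": density rises 0.7 per unit of log exposure.
print(exposure_range([0, 1, 2, 3], [0, 0.7, 1.4, 2.1]))  # ~2.4 log-exposure units
```

Change the lo/hi cutoffs and the rated range changes, with no change at all in the underlying curve - which is exactly the toe-and-shoulder problem described below.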

At the same time, what's happening at the extremes that fall outside of those measurement ranges is really important. A long-toed film records some detail much deeper into the shadows, although with poor contrast. Many professional photographers were convinced that Vericolor was really slower than the ISO Kodak assigned to it. That wasn't the case. What you saw, if you looked at the full characteristic curves in the data sheets, was that Vericolor had a very short toe. Which meant that below that 0.2 level it maintained good contrast and tonal separation (valuable in things like formal portraits), but it hit the floor faster than most other films.

Similarly, the real toes and shoulders in the characteristic curves for print papers have a great deal to do with their “look.” In fact, more often than not, matching the right paper extrema to the black-and-white film you're using is what makes or breaks a fine black and white print. Lots of black-and-white printers favored papers with relatively short toes, because they were used to using films that had substantial rolloff in their shoulders. The short-toed paper put snap and sparkle back in the highlights. Then the TMAX films came along, with shoulders that would maintain their contrast and keep on going into the stratosphere, and they printed terribly on those papers; they looked good with a paper with a long toe.

The same conceptual problems exist in characterizing sensors. Which is more useful: a truncated “practical” exposure range, or the full exposure range that shows you everything the sensor can squeeze out? Me, I think they're both valuable, and I would love it if DxOMark chose to publish exposure-vs-signal curves like the one in that article, although it would be overkill for 99.9% of the readership. Maybe it's something that's in the works and just not ready for rollout yet. Standards and testing are always a work in progress.

Another problem is industry-wide standards, which the article touched upon. People have to agree upon a measurement system. It could very well be that Guichard's remark was essentially a "political" one: there may be a raging argument going on within the standards groups over whether to continue to use the technically correct but less practically useful zero-dB threshold or set it to something higher.

pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 

Dear Steve,

I’m always confused about what exactly comments such as yours are supposed to tell me. The technical side of photography does exist, and I would be surprised if the photographers you mentioned weren’t on occasion discussing technical matters of the equipment they were using.

I mean, Ansel Adams published a three-volume book about the technical side of photography, didn’t he?

Take care,

Hi Mike

The article which got my knickers in a twist was the second one you listed: http://news.cnet.com/8301-17938_105-57415773-1/canon-5d-mark-iii-underwhelms-on-sensor-test/

Hi Ctein

I'm just rankled to see an entire article devoted only to the difference between two camera sensors as measured by DxO. This is *not*, IMO, the way to use this data or to present a comparison, because the results may not measure a significant difference between the systems in the field.

This article is precisely what we all feared, when DxO testing first hit the scene, would be the next illogical step to be taken when data like this was generated - that we would be treated to comparos based on isolated DxO data, not actual camera performance. Well, that day has certainly arrived.

That said, don't get me wrong: I appreciate the DxO testing and ratings, and I actually believe that in this case they *have* identified a clinically significant (dynamic range) advantage of the D800 sensor over that of the Canon 5DIII (demonstrated, as it turns out, elsewhere by real use in the field, with actual photographs and highlight and shadow recovery in post processing).



