Introduction: The foregoing discussion of a possible B&W-only camera in these environs (in this post and then this one) has deliberately ignored an issue that is nevertheless of some concern...namely, implementation. How good could a B&W-only sensor be? Would it have any appreciable advantages over conversions from a conventional color sensor?
My observation has been that one potential problem of calling for a niche device is that the first manifestation of the device type can be taken as the litmus test of the whole concept, when its acceptance or rejection by the market might in fact depend far more on the implementation of that idea in that specific device. One thing I took for granted in the foregoing posts on this topic is that a B&W-only sensor would provide pleasing B&W output! But that's not necessarily a given—I could see a B&W-only camera coming along that I would hate simply because the output didn't look good. So how hard would it be to get good B&W out of a dedicated digital sensor? Since much of this is theoretical to me, and my comments mostly speculative, I thought I'd ask Ctein to join me in talking it over.
Mike: Ctein, there are a number of issues that we can discuss in turn. But to begin with, it's often been repeated on the web that there are probably some inherent advantages to be gained in discarding the color filters on the photosites and the anti-aliasing filters common to Bayer-array sensors. The assumptions are a) that a dedicated B&W-only sensor would be sharper than a Bayer sensor with the same number of pixels, and b) that it would have greater light sensitivity because those filters cut down the amount of light reaching the photosites. Yet the few direct experiments that are available on the web seem to conclude, generally, that while these effects are real, they're not as decisive as some might wish. What are your thoughts?
Ctein: Mike, I'm a little reluctant to get dogmatic about the effect of getting rid of the color filters, because I've never played with a camera with the filters removed nor seen a particularly good test of such. But I've got some notions and I'll get to them in a moment.
The anti-aliasing filter is a little easier to talk about. Getting rid of it produces a modest improvement in resolution, and I emphasize modest. When one reads the resolution numbers online, they look hugely impressive: improvements of hundreds of lines of resolution across the field. Except the total resolution is thousands of lines. A difference between, say, 2100 and 2300 lines of resolution is essentially invisible in real-world photography. Differences smaller than 15% are truly ignorable; personally, I wouldn't pay attention to a difference of less than 25% in practice. So, it's not world-changing.
Getting rid of the Bayer filters should have a much bigger effect. In theory. In practice it seems to be nowhere near as big as that. Some of the reasons I understand. In an ideal world, a Bayer-free camera would produce roughly twice the number of resolved pixels, based on the performance of current cameras and conversion software. In the real world, that's only true if the lens is so much better than the sensor that it contributes essentially no unsharpness to the image. If the lens is already a pretty good match for the sensor, you get less than half that benefit. It depends on the lens. Furthermore, other sources of unsharpness, in particular focus accuracy, become more important. So, if I have to hazard a guess (emphasizing that I have not actually tested this), I would guess that stripping off the Bayer filter and the anti-aliasing filter probably improves linear resolution by 25–30%. Which is not chopped liver, but it's not as radical as people might guess.
Improvement in light sensitivity isn't as big as people think because the peak transmission through the color filters in the Bayer array is already pretty good. Mostly what you get is the difference between them being perfect filters that transmit 100% of the light in the wavelengths you want and 0% everywhere else, and what they actually do. Ought to gain you a good half a stop, maybe as much as a stop. Nice. Again, not radical.
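The arithmetic behind that half-stop-to-one-stop estimate is easy to sketch. This is a minimal illustration; the transmission figures below are plausible assumptions for purposes of the example, not measured values for any real Bayer dye set:

```python
import math

def stops_gained(transmission: float) -> float:
    """Stops of sensitivity gained by removing a filter layer that
    passes the given fraction of light (1.0 = a perfect filter)."""
    return math.log2(1.0 / transmission)

# If the Bayer dyes already pass 50-70% of the light in their
# pass-bands, stripping them gains a stop at most:
for t in (0.5, 0.6, 0.7):
    print(f"transmission {t:.0%} -> {stops_gained(t):.2f} stops")
```

An effective transmission in the 50–70% range puts the gain right in the half-stop-to-one-stop band Ctein describes.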
Mike: And then there's the problem of aesthetic appearance. People—well, let's be honest, digital photographers—often talk about "the look of film" and say things like "you want digital B&W to look like film B&W, and it's just not going to happen. Neither one is better or worse, they're just different." Well, I beg to differ on both counts. For one, really good digital B&W is rare; most digital B&W looks anywhere from tolerable to horrible. Then again, I was pretty picky with film B&W too—the majority of it wasn't good. Secondly, as you and I and most film B&W photographers well know, there is no such thing as one single look for "film black and white." You can get all kinds of looks out of film and paper—the film curve and the paper curve interact with each other, and the film developer and how it's used affects the film curve, plus there are lots of secondary characteristics such as film grain and paper color (both image color and paper base color). The late Phil Davis codified this sensitometrically in a way that was comprehensible to photographers in his BTZS writings and his Plotter/Matcher computer program. The shorthand term was "F-D-P," for film/developer/paper.
Similarly, there is no such thing as a single look for digital B&W either. But to my eye, the single worst aesthetic flaw of digital is highlight clipping. It often looks bad in color, but it's ten times worse in black and white. And you almost can't escape it, because even if you avoid large areas with no information there are often small areas in the pictures where one or more channels fall off the histogram. F-D-P often rolls off the highlights gently, such that you can almost always get a little information out of even heavily overexposed areas—and the transition to paper white is gradual and gentle—whereas digital tends to shelve abruptly in a particularly ugly way. I, you, Oren and Carl can't quite come to an agreement about dynamic range when it comes to film vs. digital, but even if we concede that digital has adequate dynamic range, digital still has that clipping problem. As I say, it's not so bad in color, because color takes over much of the informational duty in a color photograph; but in a B&W photograph, much of the technical charm and beauty is contained in the delicate highlight gradations, which all too often just aren't there in digital.
Ctein: Not unexpectedly, this is where you and I start to part company. With the better digital cameras, which these days offer a 12+ stop exposure range, I will continue to argue that the problem most black-and-white photographers run into is that they're not internalizing how the medium works. Black-and-white digital isn't like black-and-white film; it's like slides. You have to expose for the highlights. That's what I do. Yes, it causes a modest increase in overall noise, but that's nothing I can't largely compensate for in post-exposure processing, if it's actually objectionable (and as often as not, it's not). This whole "expose as far right as possible" business is crap. Noise isn't a dominant problem in digital photography any more; blown highlights still are.
The other big mistake is using the default curves in raw conversion, which are close to straight lines. A film with that characteristic curve looks lousy, too. Think of all the complaints people made about the T-MAX films and their bulletproof highlights before they figured out how to tame that characteristic curve (which left to its own devices was a straight line soaring into the stratosphere). Put a nice shoulder on your raw converter curve; you'll be a lot happier.
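What "putting a shoulder on the curve" means can be sketched numerically. This is a toy rolloff, not any particular raw converter's implementation; the knee position and the exponential form are illustrative assumptions:

```python
import math

def shoulder(x: float, knee: float = 0.8) -> float:
    """Tone curve with a gentle shoulder: linear below `knee`, then an
    exponential rolloff that eases toward white instead of clipping.
    `x` is a linear value in [0, 1]; the slope is 1.0 at the knee, so
    the transition is smooth."""
    if x <= knee:
        return x
    return knee + (1.0 - knee) * (1.0 - math.exp(-(x - knee) / (1.0 - knee)))

# Near-white values compress instead of slamming into 1.0:
for x in (0.5, 0.85, 0.95, 1.0):
    print(f"{x:.2f} -> {shoulder(x):.3f}")
```

The point is the shape, not the particular function: highlight contrast falls off progressively, so the transition to white is gradual rather than an abrupt shelf.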
Now, I will admit that I haven't done gobs of black and white digital work, but I've done enough to add a half dozen such pieces to my portfolio, and nobody's looked at them and jumped on them for blown-out highlights. Mind you, there may be lots of other reasons why they don't like them [smile].
Of course, to some extent this does boil down to taste and sensitivities. Opposite to you, I'm actually bothered even more by hard, blown highlights in my color photographs. They offend my eye even worse.
Mike: Well, there are some problems with the Curves control in Photoshop it seems to me...you're limited by the hard boundaries of the x-axis and can't construct a gently tapering shoulder or toe without changing the slope of the straight-line section. In any event, I do see a lot of photographers doing what you suggest and underexposing to avoid highlight clipping. If they don't then fix the curve, the results are rather alarming, with the middle values depressed as dilute Rodinal used to do, but to a more extreme degree. Quite wrong to my eye. Again, maybe not so much in color, because as you imply we're more used to the look from slide films, though I never liked it. But in B&W, it very much doesn't do it for me.
I wish I could provide visual illustrations here, but as always, it seems like it would be very unfair of me to find pictures by strangers on the web and hold them up here as examples of what not to do.
I remain skeptical not that the camera engineers have the technology to construct a pleasing B&W sensor but that they will have the pictorial taste to do so. I sometimes speculate whether some special technology could effect the fix, such as, say, a two-layer Foveon arrangement or some application of Fuji's dual-photosite sensor (I'm forgetting what it's called). But I'm not a sensor engineer so this is really just musing on my part.
But now tell me—would optical color filters on a luminance-only sensor work the same way as they do with film, to influence the spectral sensitivity? I might be thick but I really have no idea. Originally filters were meant to correct the orthochromatic tendencies of early films and make them more uniformly panchromatic, but of course photographers soon enough learned to emphasize certain spectral sensitivities for pictorial effect as another tool in the toolkit. In my early days shooting Plus-X and the much-lamented Verichrome Pan (a favorite film, not despite but because of its old-fashioned look—alas, the papers for it are gone now too), I left a K2 (medium yellow) filter glued to every lens I had. Loved that look.
Ctein: I don't know what you mean by "can't construct a gently tapering shoulder or toe." I do it all the time. Yes, there are hard stops at the ends...but there are with darkroom print papers, also. The toe and shoulders don't stretch out to infinity; there are distinct exposure levels that produce D-min and D-max. The trick is that the characteristic curve sneaks up on them with low enough contrast in those regions that you don't notice when you've hit the stop.
Be that as it may, we're wandering off into a hyper-technical discussion that we agree we don't need to be having now. I'll turn this subject into a column for next week.
Back on topic, simply underexposing and not making the curves correction is going to look lousy. That's a failure of technique on the part of some photographers, not the medium. It would be like someone reading Adams or Davis just enough to get the idea of altering their exposure to match the subject luminances and then not modifying their development, printing technique, or choice of print papers accordingly. Of course it's bad; to put that on the medium is blaming the tools not the craftsperson.
By the way, that Alpha 900 you bought is no slouch. It has a 12-1/2 stop exposure range. A Phase One back has a 13-stop range and the Fuji camera you alluded to has a 13-1/2 stop range. 12-1/2 stops is a huge amount. I really doubt you had that much to work with in black-and-white film unless you were routinely doing N–3 development or using extreme compensating developers.
Anyway, I think we've beaten this to death. I'll flog it some more on my own next week [smile].
Yes, optical lens filters work fine with a sensor, just as they do with film. Mind you, you may need a different filter to get the same visual effect—although both the black and white film and the sensor are panchromatic, it doesn't mean they weight different parts of the spectrum the same way. For example, a 25A filter on a monochrome sensor may not produce exactly the same visual result as that same filter used with film. But some filter will.
This problem exists to a lesser degree with black-and-white films; there can be a stop difference in relative spectral sensitivity between different films. Probably worth mentioning that one of the reasons you liked the look of a medium yellow filter on film is that very crudely it made the film's panchromatic response more similar to what you saw with your eye.
Note that simulating on-lens filters in Photoshop is much cruder than using a real filter on the camera. Physical filters can have fairly sharp cut-offs and precisely shaped spectral distributions that are valuable to the photographer. The filter simulations in Photoshop can only mix different amounts of the broadly-filtered color channels in the camera file.
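The limitation Ctein describes is easy to see in code: a software "filter" can only take a weighted mix of the three broad channels the camera already captured; it cannot reshape the spectral response itself. A minimal sketch, where the weights are illustrative (a crude yellow-ish mix) and not calibrated to any real filter:

```python
import numpy as np

def mix_to_mono(rgb: np.ndarray, weights=(0.5, 0.35, 0.15)) -> np.ndarray:
    """Channel-mixer B&W conversion: a weighted sum of the camera's
    broad R, G, B channels. Boosting R+G and cutting B gives a
    yellow-filter-ish look, but only within the three pass-bands the
    Bayer dyes already defined."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # keep overall exposure constant
    return np.clip(rgb @ w, 0.0, 1.0)

# A blue-sky patch darkens under the yellow-ish mix, as with a K2 on film:
sky = np.array([[0.35, 0.55, 0.90]])
print(mix_to_mono(sky))
```

A real on-lens filter, by contrast, attenuates each wavelength before capture, so it can have sharp spectral cut-offs that no after-the-fact mix of three broad channels can reproduce.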
Mike: Shh, you'll upset those who love the "post" processing nature of the Photoshop "filtering"—the loss of which is a standard argument against a B&W-only sensor. I don't know that they're right. It's fun, true, but in practice you don't need an infinite slider of filter colors, and I'd rather have the bump in sensitivity with an unfiltered chip and then add optical filters as I please. To your larger point, that it's the craftsman and not the tool, I have to register modest disagreement in passing before we leave that subject. Because of course you're talking about workarounds, and what I'm advocating is a purpose-built option that doesn't require workarounds—a tool that doesn't require somersaults and cartwheels on the part of its user. One that works well for everybody right out of the box, without special expertise. But as you say, enough of that; we have a couple more topics to cover.
One of them is chroma noise in the shadows. I'm unsure how much of this is inherent in the photosite with low exposure levels plus amplification and how much is a byproduct of the color conversion algorithms. But I think it's well accepted that you gain something in the noise department when working with conversions, because color noise of the same or similar value becomes invisible or near-invisible when it's converted from colors to tones of gray. The question is, would a purpose-built B&W sensor have still better low-value noise characteristics than a converted Bayer array? I think it's possible, although, again, I don't know. I do know that when I experimented with B&W with the D700, I was able to tolerate B&W conversions from files that were unacceptable in color because of the noise. I didn't even particularly like noise reduction on those files, because the slight "grain" gave the image a little bite, and looked nice.
Ctein: Don't get me wrong, I think in-Photoshop color filtering is better than no filtering at all, just not as good as on-lens filtering.
I have to say that for me all photography is about workarounds. From day one in the darkroom, I've never used a medium or material that acted perfectly in accord with my preferences. And I truly don't think the level of workarounds I'm talking about are jumping through major hoops, unless you consider something like the darkroom dictum of "expose for the shadows and develop for the highlights" to be jumping through hoops.
No somersaults, no cartwheels, just simple straightforward sensible practice that's actually in accord with the medium's characteristics. Have we ever demanded any less of decent craft in photography?
Moving on...I totally agree that noise is a lot less bothersome when it's less colorful. In fact, when I'm faced with noisy shadows in a file, whether from a digital photograph or scan of one of my negatives, I'll usually apply a very hefty dose of noise reduction in the chroma channel only. Come to think of it, I usually do a lot more chroma noise reduction than luminance noise reduction. It gets rid of a lot more of what I find to be the annoying cruft. Similarly, I can tolerate 1–2 stops higher speed settings in my cameras if I'm doing a black-and-white conversion.
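Chroma-only noise reduction of the sort Ctein describes can be sketched as follows: split the image into a luma plane and color-difference planes, smooth only the latter, and recombine, leaving the luminance "grain" untouched. This is a crude toy (Rec. 601-style weights and a wrap-around box blur, both illustrative), not any real raw converter's method:

```python
import numpy as np

def chroma_nr(rgb: np.ndarray, passes: int = 3) -> np.ndarray:
    """Denoise chroma only: smooth the color-difference planes while
    leaving the luma plane untouched."""
    w = np.array([0.299, 0.587, 0.114])     # luma weights (illustrative)
    luma = rgb @ w
    chroma = rgb - luma[..., None]          # color-difference planes
    for _ in range(passes):                 # crude separable box blur
        chroma = (np.roll(chroma, 1, 0) + chroma + np.roll(chroma, -1, 0)) / 3
        chroma = (np.roll(chroma, 1, 1) + chroma + np.roll(chroma, -1, 1)) / 3
    return np.clip(luma[..., None] + chroma, 0.0, 1.0)
```

Because only the color-difference planes are blurred, the luminance structure (and its grain-like texture) survives while the colored speckle washes out.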
I strongly agree that you're right on this; noise will be less bothersome. But how much of that will be inherent in the sensor and how much will be the simple result of being in monochrome, I don't know.
Mike: It stands to reason that color Bayer arrays have had the advantage of several decades of continuous improvement and thousands of iterations with tons of R&D money thrown at them. A first-attempt modern B&W-only sensor would be a baby-step by comparison, with only a few modest antecedents. Give a B&W-only sensor a few generations and the benefit of some engineering refinement, and it might start to separate itself from its color counterparts for its intended purpose. Nevertheless, any actual product will have to sink or swim on its own; it would have to compare favorably right out of the box with converted Bayer-array output, or the Conventional Wisdom would slam the concept entirely. Not entirely fair, but that situation is what it is.
We'll have the philosophical argument of purism vs. workarounds some other time. As you have been known to say, put down the can opener and step away from the worms....
Ctein: I don't think you're right that Bayer array sensors have any development advantage over monochrome ones. I can't think of a useful way to make a monochrome sensor that wouldn't embody the same sensor technology that is in the current Bayer array sensors. It's actually easier to make a monochrome sensor; you fab things exactly the same way as you would if you were going to make a Bayer and then you don't put the filter layers on top.
On the other hand, I can imagine optimizations being handled differently. A monochrome sensor gets you more inherent resolution than a Bayer array sensor; maybe you don't take advantage of that. That is, your monochrome version of a 16–20 megapixel camera might use a 10–12 megapixel sensor. Same effective final resolution in the images, but bigger pixels mean more light sensitivity. I suspect you can do similar trade-offs to extend exposure range. But those are tweaks and refinements. Fundamentally, out of the gate, a monochrome sensor can be fairly compared to a Bayer array sensor, I think.
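The pixel-size trade-off Ctein mentions is simple arithmetic: at a fixed sensor size, per-pixel light gathering scales with pixel area, and pixel area scales inversely with pixel count. A minimal sketch (ignoring read noise, fill factor, and microlens differences):

```python
import math

def area_ratio(mp_a: float, mp_b: float) -> float:
    """At fixed sensor size, pixel area scales inversely with pixel count:
    going from mp_a megapixels to mp_b gives each pixel mp_a/mp_b the area,
    so the ratio in favor of the lower count is mp_a/mp_b inverted."""
    return mp_b / mp_a

def stops(ratio: float) -> float:
    """Convert a light-gathering ratio to photographic stops."""
    return math.log2(ratio)

# Halving the pixel count (20MP -> 10MP) doubles pixel area: one stop
# more light per pixel before any other refinement.
print(f"{stops(area_ratio(10, 20)):.1f} stop(s)")
```

A 4x area increase, as in the featured comment's microscopy example, would work out to two stops by the same photon-counting arithmetic.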
Mike: Well, I'm glad you think so. One less thing to worry about, if such a thing ever comes along. Or let's be optimistic and say "when."
We're getting a bit long now, so maybe we should mention marketing. I'm disappointed, of course, that Leica took a pass on a monochrome camera around the time of the introduction of the M8; but maybe that's a good thing, because as a prestige brand and priced like a Veblen good, maybe a Leica monochrome digital would have sold poorly based on likely selling price alone and discouraged other companies from trying the idea. I'd think the ideal situation would be for a company with big resources like Canon or Nikon to make a simplified version of one of their mid- or even entry-level DSLRs; I've mentioned several times the appeal (to me anyway) of a B&W version of something like the little T3i. But that might be thinking the wrong way—of all the camera companies right now, Canon seems the one that's happiest hunkering right where it is, which is another way of saying it just doesn't seem much interested in innovation at this point, beyond the main currents of development where the terms of competition are clear. Sony's being much more adventuresome. Maybe a mirrorless company like Olympus or Panasonic will be the first. I'd buy a monochrome version of something like the GF1 in a flat second, I have to admit.
Which brings me to probably the most out-there statement I'll make: I think it would sell well, at least in the context of a niche product, as long as it doesn't come with a price premium (or too much of a price premium). I have absolutely no basis for saying this except my constant daily immersion in the zeitgeist of the hobby, and yet I feel quite confident saying it. Maybe that's just optimism. But I don't think so.
I do admit that the selling points would have to be technical. But just doing some monkey's-butt educated guesses, I'd think the following might not be outside of the realm of possibility: A 10- to 12MP sensor with the sharpness and resolution of a 16–20MP Bayer array but a 1- to 2-stop advantage in high ISO—that's taking into account both the technical advantages of a larger pixel-pitch and also the perceptual advantages of lower perceived noise. The necessity to use optical filters would be one cost, possibly better DR another advantage, and the handling of highlights would be the biggest question mark according to me and not as much of an issue according to you. Have I got that in the ballpark? Don't let me put words into your mouth.
Ctein: Well, I think you're more of a market optimist than I, but I don't think it's beyond the realm of possibility. Most likely such a camera would have an IR-blocking, panchro-balancing filter built in, so external filters wouldn't be necessary, just a desirable aesthetic tool, as with film. I'm not quite as sure of the speed gain with that. But, overall, no big disagreements.
Mike: Nice talking to you, Ctein. I hope one day we'll get to run real-world trials on an actual product and find out. Thanks.
Original contents copyright 2011 by Michael C. Johnston and/or the bylined author. All Rights Reserved.
Featured Comment by Tim F: "It strikes me as odd to minimize the potential advantages of black-and-white sensors. I do research science for a living. On a daily basis I drive two microscopes that each cost multiples of what my house cost (the 'bigger' scope costs about 4x). (Both are made by Nikon, for what it's worth. I would not say that my beloved and missed FE2 had no influence at all on that decision). In imaging research nobody would bother to discuss the question. It is just known that color CCD arrays hold a tiny, dim candle to any monochrome array of equal size. A quick side-by-side comparison is all that it takes to convince a new scientist to skip cheap, color toys and build a color image with filters instead.
"The simple reason is that nobody takes advantage of losing the Bayer array by increasing pixels. Most people doing serious microscopy use 1024x1024 arrays (or else 512^2 for seriously dim or fast imaging). Sort of like Ctein's point, lens resolution becomes a limiting factor rather than pixel res after a reasonable amount of magnification.
"CCD makers for research improve monochrome sensors by making the pixels bigger, just like Olympus would do if they made a B&W Pen. It would have 12 million pixels, each of which has about 4x more area than those in my E-P1, and the low light capability would be mind-boggling. Or they could have about 20 million pixels about twice as large and the low light ability would merely be jaw-dropping. You could take pics at a bar that look like midday, though they would need to offer a ND filter set for people who want to sync a strobe in daylight.
"As to whether it will happen, no. Can you convince a manufacturer that the number of people ready to spend over $1,000 on the absolute minimum shell necessary to carry this chip and manual controls would even fill the room at a press conference? The economy of scale is not with you here.
"If they do make one, though, I would seriously think about picking one up on eBay."