My holiday column talked about the near-term improvements we could expect in digital imaging technology. This time, we're going to go far out. There are technological possibilities that are way beyond anything we've got now. Mind you, this won't be "wantum physics," Greg Benford's lovely name for the made-up science that solved problems on the good Starship Enterprise. There will be no funneling of beams of anti-expositrons through a tachyonic memory to get extra-long exposure ranges. (Note to the physics-impaired: the preceding sentence has no fact-based content whatsoever.) Everything I'll describe is already existing science and engineering; it's just so far away from production-level technology that there's no telling when we'll get it or just what form it will be in.
• Full-spectrum-recording sensors—these sensors not only detect individual photons but measure their energies as well. The sensor is panchromatic; there are no colored filters that throw away most of the light. In combination with photon counting, each pixel could report both the amount of light and its spectral distribution. That's huge. Performance is just as good whether you're doing black and white or color, and your color fidelity is now limited solely by how good your models of human color vision are.
• Pixelless sensors—maybe you don't even need physical pixels. There are tricks one can play with excitons and surface plasmons (yes, those are real; no, please don't ask me to explain) that allow one to determine where on a sensor the photon was detected. No need to create physically isolated pixels; the sensor reports not only the arrival of a photon but also its coordinates.
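To make the idea concrete, here's a minimal sketch (my own illustration, not anything from a real sensor's data sheet) of what a single detection from such a sensor might look like: one record per photon, carrying coordinates and energy. The class name, units, and fields are all assumptions; the only real physics is the relation λ = hc/E, which lets energy stand in for color.

```python
from dataclasses import dataclass

HC_EV_NM = 1239.84  # Planck constant x speed of light, in eV*nm

@dataclass
class PhotonEvent:
    """One detected photon from a hypothetical pixelless, full-spectrum sensor."""
    x: float          # detection coordinates on the sensor surface, in microns
    y: float
    energy_ev: float  # measured photon energy, in electron-volts

    @property
    def wavelength_nm(self) -> float:
        # lambda = hc / E: the measured energy alone recovers the photon's color
        return HC_EV_NM / self.energy_ev

# A green photon (~550 nm) landing somewhere on the sensor:
ev = PhotonEvent(x=512.3, y=640.8, energy_ev=2.25)
print(round(ev.wavelength_nm))  # 551
```

An exposure then becomes nothing more than a long list of these events, and everything downstream—pixel boundaries, color rendering, speed—is a choice made in software.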
Combining all three of these sensor qualities—photon counting, full-spectrum recording, and pixelless detection—would probably be problematic, because each is somewhat antagonistic to the others for reasons of bandwidth, response times, and signal-to-noise ratios. It's hard to see how current electronics would let you make a full-frame pixelless photon-counting full-spectrum sensor. Attosecond electronics might make that possible, but they're far enough off that I don't want to speculate.
A compromise, though, is feasible. Imagine a camera sensor with relatively large physical pixels, 10×10 or even 20×20 µm in size. Each pixel counts photons individually and determines their energy and position (don't worry, we're not anywhere close to the Heisenberg limit). How that information gets used becomes an artistic and photographic choice.
If you're doing really low-light black-and-white work, you could decide to throw away the spectral information and collectively count all the photons detected in the physical pixel. It's the digital equivalent of loading up your camera with high-speed, coarse-grained black-and-white film. At higher light levels, you might choose to start sorting those photoelectrons by position; the more light, the more you could computationally subdivide the physical pixel into image pixels without excessive noise. Essentially, you "dial out" your coarse-grained high-speed film and dial in finer-grained, slower, sharper film. Similarly, color and color fidelity become adjustable artistic parameters.
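That speed-for-grain trade can be sketched in a few lines. The code below is my own toy illustration, not a real sensor pipeline: it takes one physical pixel's list of photon positions and subdivides it into a k×k grid of image pixels, choosing k so that each image pixel still averages some target photon count (shot noise scales as the square root of the count, so fewer photons per bin means a noisier image). The function name and the target-per-bin threshold are assumptions.

```python
import random

random.seed(0)
PIXEL_UM = 20.0  # one large physical pixel, 20x20 microns on a side

def soft_bin(events, target_per_bin=100):
    """Subdivide one physical pixel's photon hits into a k x k grid of
    image pixels, picking k so each image pixel still averages about
    target_per_bin photons (keeping shot noise tolerable)."""
    k = max(1, int((len(events) / target_per_bin) ** 0.5))
    counts = [[0] * k for _ in range(k)]
    for x, y in events:
        i = min(int(y / PIXEL_UM * k), k - 1)
        j = min(int(x / PIXEL_UM * k), k - 1)
        counts[i][j] += 1
    return counts

def uniform_hits(n):
    return [(random.uniform(0, PIXEL_UM), random.uniform(0, PIXEL_UM))
            for _ in range(n)]

dim = uniform_hits(80)       # dim scene: few photons reach the pixel
bright = uniform_hits(6400)  # bright scene: thousands of photons

print(len(soft_bin(dim)))     # 1 -> "fast film": all 80 photons pooled
print(len(soft_bin(bright)))  # 8 -> 8x8 grid of finer image pixels
```

In the dim case the whole physical pixel acts as one big, fast, coarse grain; in the bright case the same hardware delivers an 8×8 patch of sharper, slower "film," all decided after the exposure.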
What you'd get would be a kind of "soft" film: image-quality characteristics that were previously fixed in physical film and sensors become variables that you control. Fast or slow, black-and-white or color, fine-grained or coarse—these are no longer determined by the physical medium but by your artistic choices.
In closing, remember that Ubergeek technology is not inherently in conflict with traditional craft and art. They can go nicely hand-in-hand. Consider the sheep of things to come...