My holiday column talked about the near-term improvements we could expect in digital imaging technology. This time, we're going to go far out. There are technological possibilities that are way beyond anything we've got now. Mind you, this won't be "wantum physics," Greg Benford's lovely name for the made-up science that solved problems on the good Starship Enterprise. There will be no funneling of beams of anti-expositrons through a tachyonic memory to get extra-long exposure ranges. (Note to the physics-impaired: the preceding sentence has no fact-based content whatsoever.) Everything I'll describe is already existing science and engineering; it's just so far away from production-level technology that there's no telling when we'll get it or just what form it will be in.
• Photon-counting sensors—instead of accumulating charge in a well and reading it out, each pixel detects and counts every individual photon as it arrives. That's considerably more sensitive than current charge-collection designs, with a substantially better signal-to-noise ratio.
• Full-spectrum-recording sensors—these sensors not only detect individual photons but measure their energies as well. The sensor is panchromatic; there are no colored filters that throw away most of the light. In combination with photon counting, each pixel could report both the amount of light and its spectral distribution. That's huge. Performance is just as good whether you're doing black and white or color, and your color fidelity is limited solely by how good your models of human color vision are.
• Pixelless sensors—maybe you don't even need physical pixels. There are tricks one can play with excitons and surface plasmons (yes, those are real; no, please don't ask me to explain) that let one determine where on a sensor the photon was detected. No need to create physically isolated pixels; the sensor reports not only the arrival of a photon but also its coordinates.
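To make those ideas concrete, here's a toy sketch, in Python, of the kind of record a sensor with all three properties might emit. Everything in it is invented for illustration; no real device reports data this way yet.

```python
from dataclasses import dataclass

@dataclass
class PhotonEvent:
    """One detected photon, as an idealized future sensor might report it."""
    t: float          # arrival time, seconds into the exposure
    x_um: float       # position on the sensor, micrometers
    y_um: float
    energy_ev: float  # photon energy; visible light spans roughly 1.7-3.1 eV

# An exposure becomes a stream of these events. Pixels, color, and
# "film speed" all turn into post-processing decisions.
exposure = [
    PhotonEvent(t=0.00102, x_um=1083.2, y_um=440.7, energy_ev=2.25),  # ~550 nm
    PhotonEvent(t=0.00105, x_um=1083.9, y_um=441.1, energy_ev=1.91),  # ~650 nm
]
```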
Combining all three of these sensor qualities would probably be problematic, because each is somewhat antagonistic to the others for reasons of bandwidth, response times, and signal-to-noise ratios. It's hard to see how current electronics would let you make a full-frame pixelless photon-counting full-spectrum sensor. Attosecond electronics might make that possible, but they're far enough off that I don't want to speculate.
A compromise, though, is feasible. Imagine a camera sensor with relatively large physical pixels, 10×10 or even 20×20 µm in size. Each pixel counts photons individually and determines their energy and position (don't worry, we're not anywhere close to the Heisenberg limit). How that information gets used becomes an artistic and photographic choice.
If you're doing really low-light black-and-white work, you could decide to throw away the spectral information and collectively count all the photons detected in the physical pixel. It's the digital equivalent of loading up your camera with high-speed, coarse-grained black-and-white film. At higher light levels, you might choose to start dividing up those photon counts by position; the more light, the more you could computationally subdivide the physical pixel into image pixels without excessive noise. Essentially, you "dial out" your coarse-grained, high-speed film and dial in finer-grained, slower, sharper film. Similarly, color and color fidelity become adjustable artistic parameters.
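Here's a toy illustration of that dial, again with invented numbers; a sketch of the idea, not any real camera's processing:

```python
import numpy as np

def develop(events, pixel_um=20.0, subdivide=1):
    """Bin photon events from one physical pixel into a subdivide x subdivide
    block of image pixels. subdivide=1 pools everything (fast, coarse "film");
    subdivide=4 trades sensitivity for resolution (slower, finer "film")."""
    sub = pixel_um / subdivide
    counts = np.zeros((subdivide, subdivide), dtype=int)
    for x, y, _energy in events:
        i = min(int(y // sub), subdivide - 1)
        j = min(int(x // sub), subdivide - 1)
        counts[i, j] += 1
    return counts

# 400 photons landing at random within one 20x20 um physical pixel:
rng = np.random.default_rng(0)
events = np.column_stack([rng.uniform(0, 20, 400),      # x, micrometers
                          rng.uniform(0, 20, 400),      # y, micrometers
                          rng.uniform(1.7, 3.1, 400)])  # photon energy, eV

print(develop(events, subdivide=1))  # one big, low-noise pixel: [[400]]
print(develop(events, subdivide=4))  # 16 image pixels, ~25 counts apiece
```

Pool all 400 photons and the shot noise is about ±5%; split them sixteen ways and it's about ±20% per image pixel. That's the classic speed-versus-grain trade, chosen after the exposure instead of at the film counter.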
What you'd get would be a kind of "soft" film: image-quality characteristics that were previously fixed in physical film and sensors become variables that you control. Fast or slow, black-and-white or color, fine-grained or coarse: these are no longer determined by the physical medium but by your artistic choices.
In closing, remember that Ubergeek technology is not inherently in conflict with traditional craft and art. They can go nicely hand in hand. Consider the sheep of things to come...
The sheepses are cool. Particularly like the part with Ouverture 1812. :-)
As to the rest, even if we dispense with the handwavium and concentrate on the extrapolated science stuff*, what you're talking about is... wait for it...
Art filters.
If there is a technology that can record individual photons, I guess it will be used for full-visual-experience immersive holo-recordings.
Things like coarse/fine-grain film imitations are nothing but a particular choice that divorces the recording from reality and makes it more arty. Just like the Art Filters try to do now, at a much lower technological level. Hopefully, with photon recording there would also come holo memory cubes and quantum computing, so there would be no tedious wait for the camera to process a desired setting.
* Just like we should** with the real science fiction. SF is not just little robots, little ray guns and little spaceships. Curse you, Hollywood!
** The fact that I like smart fantasy more is beside the point. Nobody's perfect.
Posted by: erlik | Monday, 23 March 2009 at 04:02 PM
Not sure what all that stuff meant, but according to the Popular Science/Mechanics rags of my childhood, I was supposed to be going to work in my family hovercraft by now. And I'm still waiting for my personal jet pack, too. They did, however, come through on the personal computers.
So until tachyonic technology becomes available, I guess I'll stick to film.
Eric
Posted by: Eric Mac | Monday, 23 March 2009 at 04:07 PM
All vote now for a monthly Ctein TOP guide to particle physics and other things we barely understand!
Enlightening as usual - but I'm sure I'm not the only person desperate to know about excitons and plasmons.
Posted by: Charly | Monday, 23 March 2009 at 04:12 PM
Amazing - LED Sheep
Posted by: Riley | Monday, 23 March 2009 at 04:12 PM
Ctein... I think you have been spending too much time on the internet. Time to get out of the house, breathe some fresh air, and take some pictures with whatever technology you have at hand...
Posted by: James | Monday, 23 March 2009 at 05:25 PM
Would #3 resolve the issues digital sensors have with light rays not hitting straight on?
Would the throwing away of information in the combination sensor have to happen at the moment of capture, or could it be something done in post?
Posted by: Jason | Monday, 23 March 2009 at 05:51 PM
"Soft" film and the capacity to change the detail/image quality vs. effective ISO trade-off on the fly? The ability to optimize the sensor for color vs. black & white on the fly?
Sign me up!
Sounds to me like such a device would also be far more upgradable by firmware improvements down the road than current digital cameras.
Posted by: Geoff Wittig | Monday, 23 March 2009 at 06:31 PM
I knew Ctein had written this before I reached the end of the first sentence.
Good read though.
Mike
Posted by: Mike | Monday, 23 March 2009 at 06:41 PM
While you are recording the arrival of individual photons, how about getting vector information as well as energy information?
Then you could get rid of those problematic lenses and do it all in software.
Re plasmons:
this is fun http://www.abc.net.au/science/articles/2003/06/12/872457.htm
Posted by: hugh crawford | Monday, 23 March 2009 at 06:41 PM
"There will be no funneling of beams of anti-expositrons through a tachyonic memory to get extra-long exposure ranges."
You joke, but that's not too far off from reality. In the fiber-optic communications world, we have Erbium-doped fiber and Raman amplification. Both are quite close to your fake technology.
Ken
Posted by: Ken N | Monday, 23 March 2009 at 06:51 PM
So far most of the comments seem to have missed a critical point. This is about way more than arcane science and "wantum" dreaming. Current digital photographic technology is heading straight for a wall ... a very BIG wall. As more and more pixels get packed into the same size sensors, we already know about the "noise" problems it creates. We can already see the backlash against the lunacy of the megapixel race that's been going on in the industry for the last ten years. But the upper limiting factor is not signal noise, it's the size of a photon - you know, those weird little scientific whatsits that actually make up that stuff we call "light." As pixels get smaller and smaller, sensor manufacturers are forced to stuff photons down a smaller and smaller hole - thus the need for higher amplification, and greater signal noise in the data stream (OK, OK, that's a simplification, but you get the idea ... don't you?). If you follow that road to its logical conclusion, regardless of what tap dance the manufacturers come up with in the software they use to process sensor data, they will eventually reach a point where the pixel size approaches that of a photon and they can shrink a pixel no further. They will have hit "THE WALL" as far as pixel density goes.
What Ctein is talking about is a possible way out of this dilemma before it arrives. If there are no "pixels" per se in the sensor, the size of a photon becomes irrelevant and you are into a whole different universe of digital imaging that goes way beyond "Art Filters."
PS for James: I looked up excitons and plasmons on Wikipedia. The technical discussion is incomprehensible unless you are a quantum mechanic - and don't even look at the math - but the synopsis is halfway understandable.
Posted by: John | Monday, 23 March 2009 at 06:58 PM
Ctein, the major point of what you wrote (or maybe what I found more important to me) is that we will be able to decide on the megapixelage of each photograph *after* we've taken it (if shooting RAW).
There are some people who stick with their "old" 6MP DSLRs because they find them to perform better in low light, while some want more pixels to make larger prints of their landscape shots. With the advances you've described we would have a single sensor that could deliver great sensitivity at high ISO (at lower MP counts), or great resolution at low ISO (at higher MP counts).
That's a win-win situation, which makes me believe it's still far away. We're not ready for a win-win situation--the forums would grind to a halt with nothing to complain about.
What am I saying? We'll *always* find something to complain about :-)
Posted by: Miserere | Monday, 23 March 2009 at 07:35 PM
This may ruin my 'lurker' status at TOP, but I can't resist uncloaking long enough to comment that there must be a multitude of androids out there who [that?] have finally had all of their dreams fulfilled...
Posted by: Reed | Monday, 23 March 2009 at 08:07 PM
@James: Perhaps he should just spend more time with sheep instead? Thanks Ctein for making my day, month, year...
Posted by: expiring_frog | Monday, 23 March 2009 at 08:11 PM
Mmm, synchronicity. I read the blog via my Thunderbird feeds, so the video link didn't show up. When I read the "sheep" comment I immediately thought of that video and was about to post the link to it as a comment, but it seems you're way ahead of me...
Matthew
Posted by: Matthew Allen | Monday, 23 March 2009 at 08:25 PM
Oh Reed!! You are killing me. That took me a few seconds. I applaud you.
Posted by: John Willard | Monday, 23 March 2009 at 11:25 PM
Thank you for the great post. About the art filter comment: more than filters, I think the post is talking about the use of the sensor and of the acquired data, data that per se wouldn't have any meaning until you choose one representation among mutually exclusive possibilities. Like sacrificing resolution for sensitivity, something that's possible right now with a pocket Fujifilm, I think.
Giorgio
Posted by: Giorgio | Monday, 23 March 2009 at 11:27 PM
James, actually, Ctein spent his time getting a physics degree from Stanford University. It's the rest of us who spend too much time on the internet...
Posted by: Mani Sitaraman | Monday, 23 March 2009 at 11:56 PM
Most people do not realize this, but conventional photography has had since around 1825 to mature. Digital photography is in its infancy; people are impatient.
I was a whiz at physics, so keep 'em coming.
Posted by: misha | Tuesday, 24 March 2009 at 12:58 AM
Dear Charly,
I'd love to! Problem is that with only three columns a month, I'm already massively backed up on topics I want to write about. Adding a monthly column on the physics of photography to the to-do list just ain't feasible.
-------
Dear Reed,
Oh, bravo sir, bravo!!!
-------
Dear Mani,
Actually, it was Caltech. Stanford wouldn't have me.
-------
Dear Matthew,
Finding a way to work that video into my next column was an act of both inspiration and desperation. How could I not?!
-------
Dear Jason,
The sensor is always collecting all the information; I imagine that there would be a "JPEG" mode where these image quality choices are preselected to reduce postprocessing time and storage requirements, but in "RAW" mode you'd get to decide all the stuff at your leisure. The folks who are unhappy now with the size of RAW files are going to be screaming bloody murder...
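Here's a rough back-of-the-envelope guess at why; every number below is an assumption, just to get an order of magnitude:

```python
# Crude estimate of per-frame "RAW" size for an event-list sensor.
image_pixels = 24e6        # a 24-megapixel image's worth of sensor area
photons_per_pixel = 2e4    # assumed mid-tone exposure, order of magnitude
bytes_per_event = 8        # packed timestamp + position + energy

size_tb = image_pixels * photons_per_pixel * bytes_per_event / 1e12
print(f"~{size_tb:.0f} TB per frame")   # roughly 4 TB
```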
~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com
======================================
Posted by: ctein | Tuesday, 24 March 2009 at 02:17 AM
Dear Geoff, Hugh & Ken,
In the technologies I'm aware of, there would be a maximum quality in terms of spectral and spatial resolution that couldn't be exceeded no matter how clever the firmware or software got. The precision with which you can detect position and energy is hard-wired into the design of the sensor.
Doesn't mean there aren't other technologies out there that might do it differently. I just don't happen to know about them.
In the same vein, I don't know of any sensor technology that allows collecting vector information from photons of visible light energies, but there might be one out there.
I just love nonlinear optics. If I could figure out how to even remotely tie it into a column on photography, I'd write about it. Haven't figured out how, yet. Ya never know; my first holiday column was about the Standard Model.
But, metamaterials are on my shortlist. In fact, they were going to be part of this column, 'cept I ran out of room.
~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com
======================================
Posted by: ctein | Tuesday, 24 March 2009 at 02:35 AM
Ctein,
Just out of curiosity, when you say that these technologies are in the future, do you mean that they are in that mythical 3-5 year range that keeps getting pushed back, or do you mean something more concrete like the next decade or two?
Posted by: Peter | Tuesday, 24 March 2009 at 08:24 AM
Has anyone actually developed sensors that position photon impacts within the sensor?
Attosecond electronics are not only very far off, but I suspect they may be impossible. It takes too long for an electron to move from one atom to an adjacent atom; that's closer to a femtosecond. Maybe with non-charge-based electronics, or with a different information-carrying particle altogether.
Roughly 6x10^12 photons/second of visible sunlight impinge on a square millimeter, so about 1.2x10^8 photons/second should be hitting a 20 µm² patch of sensor, at most, which is roughly one photon every 8 ns. Easily doable with today's electronics, even assuming we go to a resolution of 8 ps to account for the stochastic nature of the process, ensure we don't have two photons to deal with, etc.
If we go to a 35mm full-frame sensor, that's one photon every ~200 attoseconds. Ouch. Even ignoring sunset photography, we're well out of range, down in the sub-femtosecond regime. If multiple excitons can be teased apart and located within a sensor, it might be possible to measure them, but even then we're looking at thousands or millions of excitations to decode in a very short time.
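Here's the arithmetic in runnable form, using the same assumed flux figure:

```python
flux_per_mm2 = 6e12                  # assumed visible photons/s/mm^2, direct sun

rate_patch = flux_per_mm2 * 2e-5     # a 20 um^2 patch is 2e-5 mm^2
print(f"small patch: one photon every {1e9 / rate_patch:.0f} ns")        # ~8 ns

rate_ff = flux_per_mm2 * 36 * 24     # full frame, 36 x 24 mm = 864 mm^2
print(f"full frame: one photon every {1e18 / rate_ff:.0f} attoseconds")  # ~193 as
```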
There's the math, for anyone who's curious.
Posted by: Micheal Leuchtenburg | Tuesday, 24 March 2009 at 11:25 AM
It's fun to see these sci-fi looks at digital photography's "future" (and sometimes its present) that Ctein occasionally constructs. Some of this stuff may very well end up on the shelves of tomorrow's Best Buy stores. Fifteen years ago I might have eagerly slurped this up and begged for more.
But now, with all of the camera gear I'll ever need (or even want), I seem to have little interest in the medium itself. My own interests lie solely in images, whether film, digital, paint, inscribed, ....
Posted by: Ken Tanaka | Tuesday, 24 March 2009 at 12:12 PM
Dear Peter,
I mean further off than the stuff I talked about in my holiday column.
You can't put a timetable on stuff this far away from mass-production technology.
pax / Ctein
Posted by: ctein | Tuesday, 24 March 2009 at 12:30 PM
Quantum camera snaps objects it cannot 'see'
"A normal digital camera can take snaps of objects not directly visible to its lens, US researchers have shown. The "ghost imaging" technique could help satellites take snapshots through clouds or smoke....
The new technique also uses a light source to illuminate an object. However, the image is not formed from light that hits the object and bounces back. Instead, the camera collects photons that do not hit the object, but are paired through a quantum effect with others that did."
Full article here.
http://www.newscientist.com/article/dn13825-quantum-camera-snaps-objects-it-cannot-see.html
Posted by: John A. Stovall | Tuesday, 24 March 2009 at 02:52 PM
Here are the 'New Scientist' picks for new photography technologies.
The future of photography
http://www.newscientist.com/article/dn14735-the-future-of-photography.html?full=true
Posted by: John A. Stovall | Tuesday, 24 March 2009 at 02:56 PM
I really have my doubts that this is the "Shapes of Some Things Long to Come."
#1 - I see no eventual difference between photon counting and accumulating charge. As with charge accumulation, photon counting will have some error percentage, which will lead to signal-to-noise problems similar to today's. As for it being more sensitive, it's again the same thing as working on our current sensors to make them more sensitive.
#2 - This may be possible; it's just that full spectral information might be too much data for each pixel site. Foveon-type sensors are more practical, and are my guess for the path technology will take after it gives up its megapixel obsession.
#3 - Again, there will be some error in those coordinates, which will limit the maximum resolution you can get out of the sensor. Though it will be interesting to see whether "setting it to less resolution" turns out to be more effective than pixel-binning techniques for getting better high-ISO pictures from the same sensor.
I may be wrong, but that's just my take on it.
Posted by: Aman Gupta | Tuesday, 24 March 2009 at 03:59 PM
There's no point in doing anything that attempts to capture resolution at the individual-photon level, for several fundamental reasons. Here are just two.
1. If you were able to nail down the coordinates of where a photon landed with any kind of accuracy, you would necessarily have to trade away other information about that photon related to its place in the EM spectrum, whether or not it landed during the exposure, etc. (Remember, the Heisenberg Uncertainty Principle doesn't have to be written in terms of location and momentum... it can be in terms of frequency.) So there is a necessary quantum-level trade-off between knowing position and frequency.
2. If you're talking about a system that includes any noise at all (as when you referred to signal-to-noise ratio), then operating at the level of individual photons is always going to be below your noise floor. There's no value in pretending that photons are being counted individually when the error bars on your count reach orders of magnitude beyond that energy level.
Posted by: SVR Photography | Tuesday, 24 March 2009 at 05:11 PM
Dear Micheal,
Yeah, position-detecting sensors are "already existing science and engineering," but far, far from commercial.
Thanks for the calculation on photon flux! Those are numbers for direct sunlight, right? In which case, they should be factored down for lens transmittance, which goes as roughly 1/(2f²). In other words, for an f/2 lens, there's an eightfold reduction in light; call it an order of magnitude with absorption losses within the lens.
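To put rough numbers on that, treating 1/(2f²) strictly as a rule of thumb:

```python
def lens_factor(n):
    # rule-of-thumb transmitted fraction, ~1/(2*N^2); ignores absorption,
    # vignetting, and the like
    return 1 / (2 * n * n)

flux = 1.2e8   # photons/s on the 20 um^2 patch, from Micheal's numbers
for n in (2.0, 2.8, 4.0):
    print(f"f/{n}: ~{flux * lens_factor(n):.1e} photons/s at the sensor")
# f/2 passes ~1/8 of the light; with absorption losses, call it 1/10.
```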
In any case, simple photon counting is well within the realm of existing electronics for modest sensor areas. Position/energy sensing may be a different matter. My biggest concern on the position sensing is how fast excitons and surface plasmons propagate, which I don't know off the top of my head. So, like you, I'm concerned about teasing apart overlapping excitations. It should be doable with enough electronic time resolution and computing power, but "enough" could be a rather large number.
The throwaway remark about attosecond electronics was triggered by an article I read in Photonics Spectra a couple of months ago. It was by researchers who are building instruments to take attosecond-range data from electronic devices, on the reasoning that such instruments will be needed to develop attosecond electronics. They were obtaining measurements with better than 100-attosecond resolution, and they seem to think it's plausible to build electronic devices that operate in that time domain. I took them at their word; I figure they wouldn't go to all the trouble of developing the instrumentation if they didn't think there was a good reason for doing so. They were definitely interested in the industrial side rather than the pure-research side.
Truly mind-boggling.
pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com
======================================
Posted by: ctein | Wednesday, 25 March 2009 at 06:28 PM
Dear Aman,
Single-photon counting is already used in scientific instrumentation and has been for some time. The trick is integrating it into a many-megapixel sensor. It is in fact considerably more sensitive than current photographic camera sensor designs, and it also has a substantially better signal-to-noise ratio, which is why it's the design choice for extremely sensitive instrumentation.
You're quite right on point #3; there are definite hardware limitations to positional resolution (and, for that matter, spectral resolution). The practical limits are pretty high, high enough anyway to make it useful for photographic purposes.
If your "conventional" sensor is as efficient at counting photons as a dedicated photon counter, and if it has a fill factor near unity, you are correct that there isn't any difference between the "pixel-free" sensor and binning data from a conventional sensor.
Fond as I am of the concept of the Foveon sensor, it is not actually a particularly good detection device, and there are serious questions about how far the technology can be pushed. While I do think that we will eventually see some kind of "color-sensing" pixel replace the Bayer array, I wouldn't even want to try to guess what the specific technology would be. Personally, my hunch is that Foveon will get historical credit, well deserved, as the "proof of principle" device but will turn out to be a technological dead end.
pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com
======================================
Posted by: ctein | Wednesday, 25 March 2009 at 06:28 PM
Dear SVR,
1) As I indicated in my post, the product of the uncertainties in the complementary measurements, in any practical device photographers would care about, doesn't come anywhere close to h-bar. Heisenberg simply is not an issue. In a realistic device, you're not likely to be measuring position to better than a large fraction of a micron, nor energy to better than 10%. That is well within the safe region. (A back-of-the-envelope check appears at the end of this reply.)
2) Numerous devices already exist for doing single photon counting with high accuracy and extremely low noise floors. The trick is implementing that capability in a many-megapixel device, but there are no fundamental physical or engineering limits involved. In fact, when you're doing photon counting, there are a number of tricks you can apply that don't work when you're simply collecting charge in aggregate.
For example, one design works by applying a potential gradient to the generated photoelectrons to accelerate them to the collection electrode (it can be an avalanche-breakdown device, but it doesn't have to be). When a highly absorbing sensor is used, so that all the photoelectrons are generated very near the surface of the device, almost all thermoelectron noise can be rejected electronically, because those electrons are generated deeper in the device and don't get accelerated as much before they reach the collection electrode. The spurious-electron noise in this device is extremely low.
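A minimal simulation of that depth-discrimination trick; the thickness, generation depths, and threshold are all made-up numbers, just to show the principle:

```python
import numpy as np

rng = np.random.default_rng(1)
thickness_um = 5.0      # assumed sensor thickness

# Photoelectrons are generated very near the illuminated surface;
# thermal electrons are generated uniformly throughout the bulk.
photo_d = rng.exponential(0.05, 1000).clip(max=thickness_um)
thermo_d = rng.uniform(0, thickness_um, 1000)

def collected_energy(d_um):
    # normalized energy gained falling through the remaining potential
    return (thickness_um - d_um) / thickness_um

threshold = 0.95        # accept only near-full-energy pulses
print(f"photoelectrons kept:    {np.mean(collected_energy(photo_d) > threshold):.0%}")
print(f"thermal electrons kept: {np.mean(collected_energy(thermo_d) > threshold):.0%}")
# prints ~99% vs ~5%: the deep-generated thermal electrons get rejected
```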
There are also electronic tricks one can apply downstream to reject noise; these rely upon knowing the pulse shape produced by a photoelectron detection, and most noise has a very different pulse shape. The point being that the error bars on photon counting can be driven to a usefully low level, and the technique has substantial advantages over simple charge collection.
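Here's the back-of-the-envelope check promised under point 1, taking photon energy and arrival time as the conjugate pair; the specific values are just illustrative:

```python
h = 6.626e-34                 # Planck constant, J*s
hbar = h / (2 * 3.14159265)
c = 2.998e8                   # speed of light, m/s

E = h * c / 550e-9            # a 550 nm (green) photon: ~3.6e-19 J
dE = 0.10 * E                 # energy measured to 10%
dt = 1e-9                     # arrival timed to 1 ns; generous for a camera

print(f"dE*dt  = {dE * dt:.1e} J*s")
print(f"hbar/2 = {hbar / 2:.1e} J*s")
print(f"margin: ~{dE * dt / (hbar / 2):.0e}x above the quantum limit")
```

Five-plus orders of magnitude of headroom; Heisenberg really isn't the problem here.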
pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
======================================
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com
======================================
Posted by: ctein | Wednesday, 25 March 2009 at 06:29 PM
Just saw the video (sorry :~( ). All I can say is OMG! Man, am I stupid.
Posted by: Christopher Lane | Wednesday, 25 March 2009 at 07:59 PM
Does this sort of technology depend at all on having a crystalline substrate, or do you think there's any reasonable chance that eventually someone like printed-systems.de could print large pixels that just do the counting and energy measurement onto film? In response to an inquiry I made once, Schneider told me that their large-format lenses (which are designed for film) would be happy with ~15-micron pixels. It'd be cool to see the electronics adapt to the lenses for a change.
Posted by: John Banister | Thursday, 26 March 2009 at 11:47 AM
Dear John,
All known technology for doing this requires extremely precise devices (both physically and electronically). That's not compatible with a flexible substrate or with current printing-style electronic fabrication.
pax / Ctein
Posted by: ctein | Thursday, 26 March 2009 at 02:47 PM