In response to Mike's post "The End of Cameras?", psu posted that "A tablet based digital view camera could be a cool thing."
I quickly thought of several ways that one could actually build such a device. Three approaches use off-the-shelf stuff available today; the fourth is going to have to wait for some technology I wrote about in one of my "futurist" columns. In any case, I will throw them out for your edification and amusement.
Since I like the form and I really like the screen in the iPad, I'm going to build this around an iPad case and screen. Okay, I might have to add a millimeter or three to the thickness, but maybe not. I really think I can keep the form factor.
The whole trick to making this work is getting around the problem of the long lens-to-sensor distance in a view camera. I mean, it's not a very tough problem if you're willing to let your iPad be 6 inches thick, but doesn't that kind of defeat the purpose? So how do you get really short lens-to-sensor distances? You use really short focal length lenses. But, then, how do you get the coverage for a view camera type sensor?
The answer? You don't even try.
Integrated camera modules are getting really, really small, a few millimeters on a side. The very latest ones are truly integrated structures; there is no separate assembly of the sensor with the lens; everything, chips and lens elements alike, is fabricated, micro-machined, aligned, and stacked in a fab line very similar to what is used to make chips. With no separate mechanical housing and alignment jigs needed, these integrated cameras can be made very accurately and precisely, with surprisingly good optics. High-quality ƒ/4 lenses are trivial; even high-quality ƒ/2 lenses aren't hard. From the view camera perspective, these are ultrafast optics.
To turn this into a view camera you turn to image synthesis or computational photography. On the backside of the iPad, we embed, oh, say, a 10 x 10 cm array of integrated camera modules. That's several hundred of these little cameras. Each camera makes its own photograph. Each photograph, in and of itself, is nothing to write home about. Throw them all into a data processing stream, and it's a whole 'nother story.
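Just to show "several hundred" isn't hand-waving, here's the back-of-envelope arithmetic. The module pitch is my assumption, not a spec; I'm taking "a few millimeters on a side" and allowing a little spacing:

```python
# Back-of-envelope: how many integrated camera modules fit in a
# 10 x 10 cm array on the back of the iPad?
array_side_mm = 100        # 10 cm square patch
module_pitch_mm = 5        # assumed: module plus spacing, ~"a few mm on a side"

per_side = array_side_mm // module_pitch_mm  # modules per row
total = per_side ** 2                        # modules in the full array
print(per_side, total)  # 20 400
```

Four hundred cameras, give or take, depending on how tightly you pack them.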
By themselves, integrated camera modules don't have very good exposure ranges or noise characteristics. Synthetic imaging can drastically improve both. For example, you could put 4–5 stop neutral density filters over alternating cameras in the array. Another thing you can do to markedly improve picture quality is to reduce the pixel count. Remember, we have hundreds of these sensors; they can be a few hundred kilopixels each, with individual pixel sizes comparable to what you'd get in good professional cameras. You're still getting out 50 megapixels or more at the end.
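The ND-filter trick is just exposure bracketing in space instead of time. A minimal sketch of how the merge might work, assuming a 4-stop (16×) filter and a 10-bit sensor (both my numbers for illustration): wherever the unfiltered camera clips, substitute the filtered neighbor's value scaled back up.

```python
import numpy as np

# Sketch: merge one plain frame with a neighboring camera wearing a
# 4-stop ND filter (16x attenuation -- assumed for illustration).
FULL_SCALE = 1023          # assumed 10-bit sensor
ND_FACTOR = 2 ** 4         # 4 stops = 16x less light

def merge_pair(plain, nd):
    plain = np.asarray(plain, dtype=np.float64)
    nd = np.asarray(nd, dtype=np.float64)
    out = plain.copy()
    saturated = plain >= FULL_SCALE          # clipped highlights
    out[saturated] = nd[saturated] * ND_FACTOR  # recover them from the ND frame
    return out

scene = np.array([100.0, 800.0, 5000.0, 16000.0])   # true luminance
plain = np.minimum(scene, FULL_SCALE)               # clips at 1023
nd = np.minimum(scene / ND_FACTOR, FULL_SCALE)      # 4 stops more headroom
print(merge_pair(plain, nd))  # [  100.   800.  5000. 16000.]
```

The merged frame holds four more stops of highlight range than either camera alone.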
At this point there are a couple of ways you can go; at this conceptual level I can't say which is the best:
- Take a page from the Gigapan camera rig. Give each camera in the array a long lens, so it only captures a small part of the field. Mount them so each camera captures a different part of the field of view, with plenty of overlap of course. Then stitch. Mechanically this may be the trickiest, but computationally it's the simplest.
- Mechanically simplest is to give each camera a full field of view and then combine the images from all of them to produce what is called a "superresolving" image; this was used on Mars to produce photographs with 2–3 times higher resolution than the cameras were inherently capable of. The idea is that when you make repeated photographs of the same scene you don't get perfect alignment of images from frame to frame. By doing differential calculations on the frames, you can extract sub-pixel resolution. It's a well understood technique. In this case, you're applying the method to several hundred photographs. Each of them is not of very high quality, but the synthesized image is.
- Go the full-blown computational photography approach, à la what Adobe and others have been investigating. Each individual camera doesn't so much collect a usable image, or piece of one, as collect data which can be integrated and synthesized to produce a high quality photograph. Ultimately this gets you the most image quality and flexibility from the hardware, but the computation requirements are formidable.
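The superresolution idea in the second approach is easy to demonstrate in miniature. Here's a toy 1-D shift-and-add sketch: each "camera" samples the same scene on a coarse grid, but at a different sub-pixel offset, and interleaving the frames by their offsets recovers the fine grid. A real pipeline has to estimate the shifts and deconvolve the pixel aperture; this sketch assumes the shifts are known exactly.

```python
import numpy as np

scene = np.sin(np.linspace(0, 2 * np.pi, 16))  # "high-res" ground truth
factor = 4                                     # coarse grid is 4x sparser

# Four low-res frames, each shifted by a different sub-pixel offset
frames = [scene[offset::factor] for offset in range(factor)]

# Shift-and-add: interleave the frames back onto the fine grid
restored = np.empty_like(scene)
for offset, frame in enumerate(frames):
    restored[offset::factor] = frame

print(np.allclose(restored, scene))  # True
```

With known offsets and noiseless frames the reconstruction is exact; in practice noise and shift-estimation error are what limit you to the 2–3× gains seen on Mars.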
Any of these approaches requires many gigaflops of data massaging; computational photography can push into teraflops without breathing hard. It's not a question of whether you could build that much computing power into an iPad case; it's whether you could do it without the case getting uncomfortably warm to the touch and your battery life going from hours to minutes. Time will solve that problem, but in the meantime, an interim approach would be for the iPad view camera to only display a "draft" rendering of the finished photograph. A dedicated GPU or DSP can pump out 6 GFLOP/sec/watt, so draft rendering would be well within the capabilities you could build into a tablet camera. To do a proper development job on it, you'd dump your data to a desktop machine and let it crunch away at its leisure. Kind of like Adobe Camera RAW processing on crack.
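Running the power numbers makes the draft-versus-full split obvious. The 6 GFLOP/sec/watt figure is from above; the workload sizes are my assumptions for illustration:

```python
# Power budget at 6 GFLOP/sec/watt (figure from the text).
gflops_per_watt = 6.0

draft_gflops = 20.0   # assumed: quick "draft" rendering pass
full_tflops = 2.0     # assumed: full computational-photography job

draft_watts = draft_gflops / gflops_per_watt         # tablet-friendly
full_watts = full_tflops * 1000.0 / gflops_per_watt  # desktop territory
print(round(draft_watts, 1), round(full_watts))  # 3.3 333
```

A few watts for the draft is fine in a tablet; a few hundred watts for the full job is why you hand it off to the desktop.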
I am reasonably certain these designs are buildable today. If you threw me some money and an R&D team, I could even optimize them and figure out which would be the best way to go. Frankly, I'm not interested in doing that. I just like playing with the ideas. But if one of you 25,000 or so readers actually has the wherewithal to pursue this, be my guest. Take any of these ideas and run with them, and more power to you.
I do request, if you actually come up with a product out of this, that you send me Number 2 off the production line (obviously you will keep Number 1 for yourself). Also, it would be nice if you paid me 1% of the gross sales revenues.
But, really, that's between you and your conscience. You're under no legal obligation, just a karmic one.
Then there's the fourth option. It's still well in the future, but I've written about it before:
Put a big swatch of image-collecting fabric on the backside of the iPad case as your sensor.
It's not at all clear when we're going to get that one, but the first three? Off the shelf, or close enough for jazz.
Ctein's weekly column appears on TOP on Wednesdays.
Note: Links in this post may be to our affiliates; sales through affiliate links may benefit this site.
Original contents copyright 2011 by Michael C. Johnston and/or the bylined author. All Rights Reserved.