
Thursday, 24 September 2020

Comments


What about this concept?
https://www.dpreview.com/news/7241299090/mft-alice-concept-camera-promises-smartphone-ai-with-interchangeable-lenses

Re: realizing it's a dream within the dream. I have this sporadically recurring dream motif of witnessing a plane or jet crash. This almost always happens: within the dream I think to myself "You know, I have this dream a lot. Am I dreaming again?" And then in the dream I look around and pay close attention to what's happening and invariably conclude "Nope. This time it's real."

Off topic, I know, but since I’m currently “email impaired” (for SENDING, anyway) I thought your readers would love this, as I do:
Look up BabelColour on Twitter and find the ASTOUNDING restoration work of Stuart Humphryes, particularly of ancient Autochromes, most from before 1920. I found this via someone's retweet and immediately bookmarked it on my iPhone - I check it every day!
Stuart passed 100K followers about a day ago and has since then added over 3,000 more, richly deserved!

Nice shot from that iPhone. I recently purchased an iPad Pro 11 (late 2018). Every app and photo from my iPhone XR was transferred to the iPad nearly automatically. I was also impressed with how nice and detailed iPhone photos look on that sharp Retina screen. I built aluminum cages for both machines to tripod-mount them and attach mics and LED lights.

Although the iPad looks a bit silly on a tripod, the viewing experience is simply amazing. I am even editing photos from cameras via a card reader on the iPad. It's a new learning curve for sure, but I see changes in the way I may be shooting in the future.

I give it 10 years before I'm going to look as anachronistic carrying my camera in a backpack and setting it up on a tripod as Atget did hauling his glass plates around Paris.

I "dream" (sorry...) of having a "real" camera with all of the features packed into my iPhone. Why can't I have those, too? It would be such a useful set of tools to have; instead, we get camera-gadgets and Lightroom-mobile and all sorts of kludges that make everything way more complicated than it needs to be. I don't need to suffer for my art like that.

I guess Leica came close with the T series, but they seem incapable of developing the kind of software that would make such a camera seamless and a breeze to use.

Fuji have awesome in-camera processing. I still remember my first shots from an X100S. Beautiful. I tried to process a raw image to look like the JPEG and couldn't.

Of course, normal people do that. (If I'm normal, that is. Some would question that.)

I dreamed last night in monochrome. I woke up upset at the loss of color, and began the day in a funk. Which lifted. Up until now, I don't think I've ever dreamed in monochrome.

As for dreaming of the habit/practice/dependence that you've fought hard to overcome: when I quit my two-to-three-pack-a-day smoking addiction, for several years afterwards I would have dreams of being able to smoke "just one" and not re-entrap myself. That was in the era when smoking in the workplace was still accepted. Interestingly, after being a non-smoker for a few years, I discovered that I was allergic to tobacco smoke. When someone lit up in a meeting, my eyes immediately began to water, profusely, followed by uncontrolled sneezing.

The main problem is that the camera companies use copyright law to prevent any third parties from developing software for their cameras. Sony freaked out to the extent that they disabled Sony cameras from even running Sony's own apps.

Can you imagine what an imaginative developer could do with the information from the in-body image stabilization system? Using that and a series of short exposures of varying lengths, you could radically reduce image noise and any remaining camera blur, to the point where the effective resolution is greater than that of the physical sensor. Hasselblad probably has a patent on the ability to refocus based on camera movement and the Pythagorean theorem, but I'm pretty sure that has expired by now and it would be easy to implement.
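The stacking half of that idea really is a few lines of code. A minimal sketch, assuming the short exposures have already been aligned (say, using the IBIS gyro data); `average_stack` is a made-up name, and frames are flattened pixel lists for simplicity:

```python
def average_stack(frames):
    """Average N aligned exposures per pixel. Random sensor noise is
    uncorrelated between frames, so it drops by roughly sqrt(N),
    while the actual image content stays put."""
    n = len(frames)
    width = len(frames[0])
    # Per-pixel mean across the stack; noise partially cancels.
    return [sum(f[i] for f in frames) / n for i in range(width)]
```

Real implementations do the alignment and the weighting of the varying exposure lengths, which is where the cleverness lives; the averaging itself is the trivial part.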

A not very clever developer could even set the camera to take the photo the moment before the shutter is pressed. It's laughably simple to implement in any camera that has an electronic first curtain shutter mode, and even simpler in silent shutter mode. All the phone cameras do it.
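For what it's worth, the "photo before the press" trick is essentially a ring buffer of live-view frames. A toy sketch of the idea (class and parameter names are invented for illustration):

```python
from collections import deque

class PreCaptureBuffer:
    """Keep the last N electronic-shutter frames in a ring buffer so the
    'shot' can be pulled from just before the shutter press."""

    def __init__(self, depth=8):
        # Oldest frames fall off automatically once the deque is full.
        self.frames = deque(maxlen=depth)

    def on_frame(self, frame):
        # Called continuously while the camera is live-viewing.
        self.frames.append(frame)

    def on_shutter_press(self, lag_frames=2):
        # Return the frame captured `lag_frames` before the press,
        # compensating for human reaction time.
        if len(self.frames) > lag_frames:
            return self.frames[-1 - lag_frames]
        return self.frames[-1] if self.frames else None
```

With silent or electronic-first-curtain shutter the sensor is already streaming frames, so the only cost is the buffer memory.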

How about the raw information from the phase-detection (autofocus) pixels? You could build a depth map, but Sony is too afraid that someone else will put out a feature on a camera they have already sold. Just stick it in the RAW file and I'd be happy.
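The depth map isn't exotic, either: phase-detect pixels measure a left/right disparity at each point, and depth follows from the stereo relation depth = focal length x baseline / disparity. A toy sketch with made-up numbers (the function name and parameters are hypothetical):

```python
def depth_from_disparity(disparities, focal_px=2000.0, baseline_px=50.0):
    """Convert per-point phase-detect disparities into depths.
    Depth is inversely proportional to disparity; zero disparity
    means the point is effectively at infinity."""
    depth_map = []
    for d in disparities:
        depth_map.append(float('inf') if d == 0 else focal_px * baseline_px / d)
    return depth_map
```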

Or how about an "I want to focus on the closest thing in the frame, no matter where it is, and have x inches of DOF behind it" mode? I'd call it SX-70 mode, after the only autofocus system that worked the way I wanted it to.

A mode where the foreground, someone's head, for instance, has a short exposure and the background has a long exposure, and each has its own color balance. It's a piece of cake with a film camera, a leaf shutter, and a strobe, and not much more difficult in Photoshop.

Oh, and would allowing me to program the strobe to fire on every other exposure be too much to ask? My Vivitar 283 is getting kind of beat up.

A mode where the camera makes a preliminary exposure and then, based on what the user sets as an acceptable number of blown-out pixels, sets the exposure for the highlights and lets the shadows fend for themselves. That would be ridiculously simple to program.
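Ridiculously simple indeed. A rough sketch of that logic (the function and its parameters are hypothetical, and a real meter would work on a histogram rather than a raw pixel list):

```python
def highlight_priority_exposure(pixels, clip=255, allowed_blown=100):
    """From a preliminary exposure, find the largest exposure scale
    that keeps the number of clipped pixels within the user's budget."""
    # Sort brightest-first; after scaling, the (allowed_blown + 1)-th
    # brightest pixel must land at or below the clip point.
    bright = sorted(pixels, reverse=True)
    if allowed_blown >= len(bright):
        return float('inf')  # user allows everything to blow out
    anchor = bright[allowed_blown]  # brightest pixel that must NOT clip
    if anchor <= 0:
        return 1.0  # frame is black; leave exposure alone
    return clip / anchor  # multiply exposure time by this factor
```

Everything below the anchor pixel simply falls where it falls, which is exactly the "shadows fend for themselves" behavior described above.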

Would choosing which color channel to base the exposure on be too much to ask? Yes, it is a limited use case but when you need it you really really need it.

Oh, and Sony in particular: that feature which automatically switches between the eye-level finder and the rear screen? I bet it works for a lot of people, but when it doesn't, do you have any idea what a pain in the ass it is to go two menus deep to move the display when the menus are on the display you can't see? Assigning that to a button would be nice.

BTW, if you really want to hear some colorful language, ask some farmers what they think about the abuse of copyright law by tractor companies. Who knew that John Deere's middle initial was F?

"the exact opposite of what portrait photographers try to do—it's the most un-flattering picture of me ever!"

Arnold Newman begs to differ.

Long ago on this forum I suggested an "open source" camera, with a basic operating system that controls camera functions and lots of memory for developers to create imaging systems. Developers would have documentation for creating apps to run on the camera and a forum to sell them to others. Sound familiar? It's what Apple and Google do with their phone operating systems. I'll bet this would generate some spectacular results. Maybe somebody could convince Oly's new owners to release the OS documentation for one model (E-M10 MkII?) to see what happens.

I also remember having discussions with somebody (Ctein or Thom Hogan, perhaps) about creating a super camera using multiple phone cameras to create an "insect eye" lens that could process all sorts of data - 3D, multi-focus, etc. Those cameras are really cheap, and phones now sport a couple of them to gain focal-length range, but a phone with a dozen lenses would not be mass-market for the usual phone manufacturers.

[I'm afraid "Oly's new owners" is mostly a shibboleth. The purpose of the new ownership is to dismantle the business division in an orderly way according to Japanese law. They might continue producing and selling cameras as a brand label, but it's very unlikely there will be anything like the research and development and the generating of new products that we were used to from Olympus. Disclaimer: I'm just a lowly reporter, not an expert on business. But this is what it looks like to me when I look into it. --Mike]

"Computational photography" mostly strikes me as a way that technology-obsessed people hope to insert creativity into their photos. Doesn't really work that way.

The Sony RX100V has the option to meter for the highlights. I use it and automatically add a third or two-thirds of a stop of exposure compensation. This helps avoid very poor shadow rendering, and for night shooting it maximizes your shutter speed, which combined with the inbuilt stabiliser is a wonder.
Alternatively, bracket the shot with "over exposure."

Regarding the setting of a depth-of-field control: Minolta patented that. I lusted after eye-following focus (which Canon had on the EOS 3) and a facility to define a shallow/large depth of field.

The real power of the phone camera is the internet, not its software tricks. Our major means of sharing photos does not require high quality photos, so a phone photo, a m43 photo, a full frame photo and a medium format photo look the same to most people on Instagram and Facebook. The phone wins three times: by being the only camera most people need and have, by being as good as other cameras given our sharing medium, and by being far and away the most connected to the internet.

I've long wondered why Olympus, a company that seems to have relied upon innovation to achieve whatever market penetration it has, didn't come up with a "computational photography" version of its cameras. They've done some good things with automation but never really seemed committed to it.

In most industries it is quite common to emulate your most serious competitor, trying to outdo what they do best. But in the camera business it is almost as if the smartphone isn't considered a serious competitor. Yet it is clearly gutting the camera industry.

The Alice, mentioned by someone earlier, is intriguing in this regard as it appears to be a serious attempt to marry the advantages of cameras (large sensors and better optics) with the tech of phones. It will be interesting to watch. But why isn't some major camera company doing this?

"But who cares about those cameras now?"
Me. I've a special fondness for the big old D700, and that's despite having newer cameras.

Adobe recently hired Marc Levoy, of Google Pixel fame, and says they are working on a "universal camera app".

I can only imagine this may be something of an ACR-styled iOS/Android app that would connect over the air to our camera of choice, on the fly, and supplement its functions with the joys of computational photography.

Someone on here wondered why Olympus didn't come up with some sort of "computational photography" setting or version? Hell, I have an Olympus Pen-F digital I can't even figure out how to set! The most confusing thing ever! Possibly the perfect example of engineers making something without ever taking a panel of professional photographers' opinions into account, as well as junking it up with dozens of settings (some of which can, confusingly, be set two different ways), just because they can and it costs not much more to do that than to build a streamlined professional software platform!

Having said that, I've been saying for years that my sister's iPhone on "auto-everything" makes much better color, hue, and contrast decisions than the last professional digital Nikon I had did on "auto." I was amazed that her "snaps" walking around India had the look and feel of something I'd really have to work at with post-capture computer massaging (something I hate about digital, since I used to be perfectly happy with the transparencies I shot pre-digital, with nothing else to do)!

