
Tuesday, 05 July 2022

Comments


I have always loved a big piece of light-gathering glass, and a big sensor to go with it, but computational photography is the future. Maybe a mirrorless camera could be made to do something similar, but camera makers can't afford the software and hardware R&D it would take to compete effectively with smartphone makers. There are wonders to behold as any technology becomes more widely available (see either of the David Attenborough night-vision TV shows on Netflix or Apple TV+), but our idea of photography as a "decisive moment" snapshot is fast becoming a historical perspective.

"Does Night Mode Turn the Tables?"

I guess it depends on what's inside the framelines.

Of course the killer feature of smartphone cameras is that they take up no space (assuming you'd be carrying a phone anyway). No dedicated camera can hope to do that. Even as discrete cameras, their portability can't be beat.

Speaking of cars, have you noticed how many cameras new cars have built-in? Many current generation safety features seem to involve cameras and/or radar (and computers): lane-keeping assist, collision warnings, blind-spot monitoring, top-down 360-degree views, attention monitoring, etc. Some of this is even showing up on base model economy cars.

So, how about some car camera reviews? ;)

Seriously, though, I would not be surprised if someone has figured out how to hack these on-board cameras for personal use. At least until the next software update. Oh, yeah, another way cars are starting to resemble smartphones is over-the-air software updates (arguably a safety feature as well).

For quickly shared sunset/sunrise photos, or anywhere HDR is a real advantage, I find my phone often gets the mood across better than the raw out-of-camera photo does. Night mode is another big win, so long as the subject isn't moving.

I would love a standard for cameras similar to Android Auto or CarPlay, where your phone could have access to the camera and replace or augment its processing. It could really be a nifty best of both worlds: a minimal-interface camera for the purists, with a set of apps running on a linked device for folks who want deep control of every nuance.

The difference is that the 'bigger' cameras these days are mostly digital, so they will advance as well. Computational photography is already an important part of the advanced AF systems that can recognize a bird's eye even far away!
So in my opinion dedicated ILCs will always keep some advantages.

My guess is that the bulk of interchangeable-lens cameras will increasingly be marketed toward high-end users, specifically wildlife and sports photographers, where phones might never compete. It seems to be happening already. Those of us who like to wander around with a nice small prime are increasingly outliers, and anachronistic. (I just started using Nikon’s one cheap wide prime in Z-mount, the 28mm f/2.8, styled, perhaps, to look like it’s from the Eighties. It’s actually a lot of fun.) Shooting with a dedicated digital camera instead of a phone will be like listening to vinyl: not something you have to do, but something you want to do.

Too bad for you and others that the Leica M10 Monochrom costs a fortune. It’s a low-light monster, the best of all Leica cameras in that regard. The color part sucks, however, although that’s a plus in my book.

The "nightmodes" multi image computational image technique was actually a Sony first introduced on their Nex cameras in 2010 when in "Hand Held Twilight" mode. At the time I thought this was revolutionary and pointed to the future of photography. I never understood why Sony didn't make much more of this feature or work on developing it further. It was available on Sony cameras for many years and then dropped.

It's just a matter of time before this becomes a standard feature in DSLRs. The question is when.

I think Kirk Tuck was once asked to test drive a Samsung camera with texting capabilities. I don't recall specifics, but if the camera was OK, perhaps it was ahead of its time.

I think you're right, and that we are on the cusp of a new era of computational photography. IMO, the OM System Olympus OM-1 (can that be right?) is the first real shot across the bow of the phone cameras. It will be interesting to see how much computational photography will need to be executed in camera versus how much can, or must, be done in post.

I was quite jealous of Night Mode users when I had an XS, and it was one of the reasons that I upgraded to a 12 Pro after just over two years. Interestingly, I’ve hardly used NM since - just not my thing, it seems. Also pandemic restrictions have meant little or no travel for the last couple of years, so not many nights worth recording!

I have found that I use ProRAW quite a bit, and also Live Photo. It turns out that you can do fun things with Live Photo images in Photos; Long Exposure is especially nice. Going back to raw, the resulting image files are (as you’d expect) much larger than the non-raw files. I’ve found my raw images tend to be around 25MB, as against 2MB to 5MB for non-raw.

I haven’t used Portrait mode very often, but I do use Pano mode quite frequently; after a bit of practice, it’s surprisingly effective.

Do you use iCloud Photos? I do, and I find it invaluable. Saves storage on your local devices.

I have for years been wishing camera manufacturers would take just a bit of inspiration from phone camera tech. My Android night mode (I assume Apple's works similarly) can take a dozen or so photos in a couple of seconds, automatically align them, and spit out a single RAW file for me to play with in Lightroom mobile. If my ILC could do that I would be over the moon; even if it could just take three or four photos in a second and automatically align them into a single RAW file, I'd be over the moon!
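For the curious, the core of that trick is just "register, then average." Below is a minimal sketch of the idea in Python, assuming OpenCV and NumPy are available; the file names and frame count are hypothetical, and a real night mode works on raw sensor data with far more sophisticated tile-based alignment, robust merging, and tone mapping, so treat this strictly as the concept, not how any particular phone does it:

import cv2
import numpy as np

def align_to_reference(ref_gray, frame):
    """Warp `frame` onto the reference view using ORB feature matches."""
    frame_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(1000)
    k_ref, d_ref = orb.detectAndCompute(ref_gray, None)
    k_frm, d_frm = orb.detectAndCompute(frame_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d_frm, d_ref), key=lambda m: m.distance)[:200]
    src = np.float32([k_frm[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = ref_gray.shape
    return cv2.warpPerspective(frame, H, (w, h))

def stack_burst(paths):
    """Align every frame to the first one, then average the aligned stack."""
    frames = [cv2.imread(p) for p in paths]
    ref = frames[0]
    ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
    aligned = [ref.astype(np.float32)]
    for f in frames[1:]:
        aligned.append(align_to_reference(ref_gray, f).astype(np.float32))
    # Averaging N aligned frames cuts random noise by roughly sqrt(N).
    # (Edges where the frames don't overlap will darken a bit in this sketch.)
    return np.mean(aligned, axis=0).astype(np.uint8)

merged = stack_burst([f"burst_{i:02d}.jpg" for i in range(12)])  # hypothetical burst
cv2.imwrite("stacked.jpg", merged)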

I like to take snapshots of the stars. The Sony a6xx cameras can all do it. The iPhone, up to the 11 Max, is really not up to it. I wonder whether the 13 changes that game.

In re Phone as Camera: To summarize, the electronics, software and lenses are up to snuff, and the RAW output opens a new world of post-processing possibilities.

Now they have to do something about the terrible form factor. A machine designed to be thin enough to carry in your back pocket, to be held up against your ear, and to be viewed in your palm, your lap, or on a tabletop is a lousy machine for taking pictures.

Especially so for me, who suffers from essential tremor. The required pose for taking a phone picture is exactly the perfect position to bring out maximum jerkiness. Can't bring the phone closer to my body because of my presbyopia; I'd have to put the phone away, take out my reading glasses, and start again. Having to jab the screen with my fingertip to take the photo also knocks the framing out. Far from ideal.

Alternatively, if the phone was shaped like a box with a handle, held against the front of the skull, with the elbows tucked in, we might have something more tremor-proof. Maybe you could put in some sort of "viewfinder" with built-in corrective lenses for people with weak eyesight.

And hey, while I'm dreaming crazy fantasies, why not make the lenses removable, so you could, I dunno, what's the word ... inter-change them with extreme wide angle, or telephoto, or super-close macro add-ons. Just imagine the pictures you could take then! You could have a whole system of them...

Just spitballin' here. I know it's technologically impossible for such a thing to exist, but a fella can dream, can't he?

“ … will there come a time when "big" ILC's by traditional makers and electronics firms can't match some of the "miracle" features found on smartphones? ”

That time has passed, Mike. While shutterbugs argued over megapixels, RAW files, lens sharpness and speeds, and “low-light” performance, the imaging engineers working on phones learned to process images with military-grade AI recovery algorithms and to leverage communications tech. Your Fuji may be able to record gorgeous images under good conditions, but that iPhone 13 Pro offers much more application versatility.

It’s today’s broader world of photography, Mike.

—-

To John Krill’s recent question regarding max print sizes for iPhone images, the largest I’ve printed them is 13x19. And they’ve held up just fine. I’ve seen much larger phone camera prints but cannot claim they haven’t been at least up-rezzed. Of course printing isn’t a major objective for today’s phone camera technologies … at all.

To your recent wonder if Leica might connect with a phone manufacturer, that moment also has passed…twice, actually. Most recently see the “Leitz Phone 1”, only available in Japan.

I've wondered for a while why camera makers aren't using some of these aids. There are things they do, like focus stacking and exposure stacking, but they seem primitive compared to what phone cameras are doing. Zeiss tried to get in on going straight from the camera to social media with the ZX1, but again in a clunky, primitive way (not to mention its absurd price).
It would be very interesting to see a camera with the low-light and exposure/color capabilities of a phone, combined with the resolution and varied lens character of a camera.

[I heard a rumor a few years ago that Apple has 100 software engineers working on cameras and Nikon has six. That's NOT a fact, just a rumor. But something like it could have something to do with each company's relative strengths, I suppose. --Mike]

@John Holland: You might find it easier to use the iPhone's Volume Up or Volume Down buttons to trip the shutter. When the iPhone is held horizontally, with the volume buttons on the top side, the buttons fall naturally under your right first finger. Simply squeeze down, with your right thumb avoiding the lock button on the lower edge, and the shutter will be tripped.
Alternatively, you can flip the iPhone end-over-end so that your left thumb trips the shutter via the volume buttons.
I hope this is helpful for you.

There's an interesting historical symmetry here. The early twentieth century saw numerous mass producers of radios, instruments and other items start to dabble in small affordable cameras, and thanks to consumer demand become dedicated camera makers or spin off camera businesses. Lately, something of a reverse trend is going on, where casual photography has largely moved to non-dedicated consumer electronics made by non-specialist manufacturers (who sometimes even absorb camera companies) while camera makers and divisions are catering to an increasingly professional/esoteric market.

Yet both tides were similarly driven by accessibility, portability and ease of getting "good enough" results. The other similarity is that "good enough" kept getting better, or in other words became "good enough" for more and more people.

In my hazy haste I neglected to address the topic of “night mode”/low-light photography. My answer to your question: yes, it will “turn your tables”! Over the years you have often posted lovely images from low-light scenes. I think your new iPhone 13 will put you in a fearless heaven. (Don’t forget to dust off your other cameras every once in a while 😂)

A hukilau on Maui: https://www.kentanaka.com/#3
iPhone 12 Pro Max, mostly SOOC. Yes, that’s the moon and stars/planets to the left.

My family and I returned yesterday from a trip to the North Rim of the Grand Canyon and Zion National Park. With my iPhone 13, I was able to take photos of our trip, the jaw-dropping scenery we saw, and photos of us together taken by strangers willing (and competent!) to use my iPhone to do so. Then, even in those remote places, easily send them to family and friends. Nice.

Mike, your "rumor" that only 6 Nikon engineers could be spared for firmware development while software for camera functions at Apple attracted 100 makes a lot of sense to me as a cultural (and business) consequence of the differences between the ever-growing phone business and the fading camera business. First of all, the A13 processor used in the iPhone 11 of 2019 was specced at 160 gigaflops, precisely 1,000 times the speed of the first supercomputer, the Cray-1 of 1975-80. The Cray used data words of 64 bits, while camera data is mostly 16 bit, but a 1,000x improvement over those roughly 43 years works out to a doubling of raw performance about every four years, somewhat slower than Moore's Law promised but still remarkable. And since the iPhone 13 is another two years along, we might expect it, if the trend holds, to be roughly half again as fast as the now outdated iPhone 11.
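For anyone who wants to check that back-of-the-envelope figure, the implied doubling time follows directly from the performance ratio and the span of years. Here is a quick Python check, taking the Cray-1 at roughly 160 MFLOPS (implied by the 1,000x comparison) and the 160 gigaflops quoted above at face value:

import math

cray_1_gflops = 0.16   # ~160 MFLOPS peak, implied by the 1,000x comparison above
a13_gflops = 160.0     # the figure quoted above; treat it as approximate
years = 2019 - 1976    # Cray-1 delivery to iPhone 11 launch, about 43 years

doublings = math.log2(a13_gflops / cray_1_gflops)  # a 1,000x gain is about 10 doublings
print("%.1f years per doubling" % (years / doublings))  # prints roughly 4.3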

My Moto G Android phone, which cost about 1/10 of the latest iPhone, probably has about 1/10 the processing power, so that's the range to expect. It takes OK pictures for instant upload, too, but has limited dynamic range in its tiny pixels, and lacks the oomph to make up for this by combining multiple images.

I can't imagine that camera companies, which have spent the past 70 years (or more) optimizing lenses, autofocus, shutters, and first film and then data movement, are eager to invite this new breed of software programmer into their house. Leica, with its very small market, has learned through collaboration with Asian manufacturers such as Panasonic. Their system-on-a-chip (SoC) image processors apparently have an older-generation ARM microprocessor onboard. And the in-house firmware effort seems to have focused on finding a common user interface for all three current lines of interchangeable-lens live-view cameras.

See the "Alice" camera for a creative take on a soon available blending of the strengths of smart phone and ILC using micro 43 lenses:
https://www.alice.camera

I understand that such technical feats are impressive and open up new opportunities.
But how good are pictures when they're "better" (or rather: "different") than what our own eyes are able to see?

Or when cameras always automatically "beautify" them, in more or less subtle ways?

A few days ago, Jim Grey wrote about his iPhone camera rendering the skies quite different from what he remembers. See https://blog.jimgrey.net/2022/07/05/under-fake-iphone-skies/ .

I've never used an iPhone so I cannot compare it with my own experience, but I do find it disturbing.
I take pictures to remember things I've seen or done, and when the pictures are so different, it feels as if they start to mess with my own memories...

