
Sunday, 03 July 2022



The iPhone 13 Pro is quite a fine camera. You might be interested in this post: https://richardman.photo/2022/07/aquarium-cfv-ii-50c-with-different-lenses-vs-iphone-13/

As you say, 'I would normally "unsharpen" this a bit to get rid of the edge artifacts.' This is my main moan about photos coming off my iPhone 12 Mini: the sharpening is excessive. They look better if sharpening is dialled down a bit using a filter in Lightroom.

Something else that phone can do amazingly well is multiple-second exposures hand held. While you're exposing it overlays a crosshair to help you keep the camera still, then merges the result at the end. It's really impressive.

Also, if you're after a bit more control over the camera while shooting, the Halide app is nice, and has features like an RGB histogram, manual white balance, and a box for helping you hold the camera level.
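The multi-second handheld mode described above amounts to capturing a burst of short frames and merging them. Here is a minimal sketch of the averaging step in Python, as a hypothetical illustration only; Apple's actual pipeline is proprietary and also aligns the frames to compensate for hand shake before merging.

```python
def merge_frames(frames):
    """Average a stack of equally sized grayscale frames.

    Each frame is a list of rows of pixel values. Averaging many short
    exposures simulates one long exposure while smoothing out random
    sensor noise.
    """
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    totals = [[0.0] * width for _ in range(height)]
    for frame in frames:
        for y in range(height):
            for x in range(width):
                totals[y][x] += frame[y][x]
    return [[px / n for px in row] for row in totals]

# Three noisy 2x2 "frames" of the same scene:
frames = [
    [[100, 50], [52, 98]],
    [[104, 46], [50, 102]],
    [[96, 54], [48, 100]],
]
print(merge_frames(frames))  # prints [[100.0, 50.0], [50.0, 100.0]]
```

The payoff is that random noise falls off roughly with the square root of the number of frames averaged, which is why these merged handheld exposures look so clean.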


Look for the Polycam app and try the LiDAR 3D scanner built into the 13 Pro.
Wait until you try that!

Few know the technical reason why late model iPhones take such nice looking images. I’ll share the secret. Magic gremlins in the processor are responsible for the surprising quality. Actually what I like is that the images are delivered via cloud to both my iPad and iMac for larger screen viewing and editing.

Now turn your iPhone into a camera. This ships in about 2 weeks (I have no connection other than ordering one during the Kickstarter). https://shop.fjorden.co

Remarkable. But I'm confused. The "experts," especially the ones on an infamous D reviewing site, claim that any sensor smaller than "full frame" is incapable of serious photography. So your results must be impossible or some sort of trick.

I've downsized to the 13 Pro from an 11 Pro Max, already a very decent phone.
Happy with the size, except the smaller keyboard is challenging and has led to more typos; combine that with autocorrect and some sentences end up being little short of gobbledegook.
Could be my technique (though "technique" might be embellishing it).

The camera is a treat, lots of fun at twilight.

Took a shot at my local servo, lights of the servo, traffic and traffic lights in the background, sun just below the horizon, was very impressed with the clarity and camera's capability to deal with various sources of light and dark.
Completely agree with your statement: "'I've seen better,' but stay in context—this is a phone, taken handheld."

This phone takes better images than some photos that are considered classics. There are other reasons classics are classics, but nevertheless.

This might be throwing a big log onto the fire and creating lots of sparks, but here goes. How much of the image quality (and I admit these are pretty good quality images) comes from computational photography techniques, rather than the optical qualities of the lens and sensor technology? In other words, the camera's computer deciding what 'should' be there, as opposed to the image created by lens and sensor. Then again, there's always some computation in a digital system, so maybe the question doesn't matter any more because it's just more of the same.

I’m very glad you updated your iPhone, Mike. As you can now see first-hand the change in imaging tech since your old phone has been truly remarkable. I encourage you to experiment not only with its enhanced abilities to -describe- under newly expanded boundary conditions, but also to express in new situations. Don’t be a slave to the purism of the holy grail of raw image files! Learn to work -with- the camera system’s formidable built-in AI tech to achieve results that might have otherwise required long post sessions on PS or LR. I really think you’ll enjoy it.

(p.s. Poor Butters! July 4th is the worst holiday for many pets.)

So nice to see your photos. Really like the house in dusk photo.

There's no getting around it, Apple has made amazing strides in upgrading all aspects of iPhone photography. It used to be that a 100% view would just present mush. And looking at the same images anywhere near pixel-level was like looking at a bowl of multicolored Froot Loops.

Other phone/camera manufacturers may have done as well - I don't know, having no experience with them.

I have a lot of nostalgia for the old days, when a photographer was a specialist with equipment and skills far beyond ordinary mortals. But it's far past time for us older types to wave goodbye to history and embrace the new and all it has to offer.

Soon the choice of camera will come down to either a smartphone or a film camera.

I was recently forced to upgrade my phone as well, and the Pixel 6 Pro camera has really freed me to hike with less weight. It's amazing what is possible now.


Got a new toy, eh? That should keep you company for a while.
You mentioned, "Something a little funky is going on at the top edge of the sunlit portion of the solid stem of the lamp..." I would not have noticed if you hadn't brought attention to it.
When it acts funky, it is possible the prototype of the iPhone 13 was a.k.a. the iFone 13.
Share more images. I have a funny feeling that your blog has caught the attention of the bigwigs at Apple.

Dan K

Apple has a crazy good algorithm for sharpening animal fur. I can *always* tell the iPhone images of animals from even the sharpest Leica glass I own, and not in a bad way. Computational photography is here.

Just the guts. I like these sorts of posts. No fluff and on brand.

I'm shocked at how usable those images are. So! So that's why people buy Apple phones. Ah.

Still too expensive on my value metric scale though. I'll drop 5K on a guitar and not blink. 2K on a chainsaw - who cares. But the cost of an iPhone, when it's something you can destroy by dropping it out of your pocket - nope. Bridge too far.

Enjoy! And yes, amazing…..


How large of prints can you make from these images?

[That's a good question. We'll see. --Mike]

Hi Mike,

Are you familiar with the Dall-E-2 AI project? If not, Google it.

AI is now at a stage that the computer can “improvise” and make up the details and contents in photos and we can’t tell the difference. That’s what Super Resolution in Photoshop is doing, what AI noise reduction is doing, at a very primitive level. Dall-E-2 takes it to a whole new level.

I can only speculate, but I think these kinds of technologies are employed in our phones' cameras as well. Some details we see in the final photos are actually not captured through the lens. Just like Steve McCurry crossing the line, this is flirting with the red line that defines what photography is.

Re: Prints ...

Far better than expected.

I have printed 13 inch by 19 inch with surprisingly good results. Available-light images made when there wasn't much light available require a steady hand (better yet, a tiny tripod) and some fiddling in-phone.

Black and white prints can be stunning.
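For context on that 13 x 19 print size, here is a quick back-of-the-envelope check, assuming the 12 MP (4032 x 3024 pixel) output of the iPhone 13 Pro's main camera:

```python
# Back-of-the-envelope print resolution for a 13 x 19 inch print,
# assuming a 12 MP iPhone 13 Pro frame (4032 x 3024 pixels).
long_px, short_px = 4032, 3024   # pixel dimensions of the capture
long_in, short_in = 19, 13       # print dimensions in inches

# The long side is the limiting dimension at this aspect ratio.
ppi = min(long_px / long_in, short_px / short_in)
print(f"{ppi:.0f} ppi")  # prints "212 ppi"
```

Roughly 212 pixels per inch is generally considered plenty for an inkjet print viewed at a normal distance, which squares with the "far better than expected" verdict above.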



