Look what Ned Bunnell found...very interesting.
I won't give away the game because you should look at the first picture and see what you notice. Ned explains everything in the post at Instagram.
After wrestling with film and darkroom work for decades, I always had a strange affection for anomalies. I'd notice things that were often invisible to other people, like signs of bromide drag in exhibited prints. And like Ansel with his "black sun," I actually made a few pictures that featured technical accidents. One, for instance, features a flat-filter reflection that mirrored a full moon in the middle of a lake.
So far I haven't developed any affection for digital glitches and flaws, though. My problem with portrait mode on the iPhone 7 Plus (I can't afford to keep up with the very latest iPhone camera, alas) is that I can't seem to remember to turn it on. :-) But that's clearly my fault.
Mike
(Thanks to Ned)
Original contents copyright 2019 by Michael C. Johnston and/or the bylined author. All Rights Reserved. Links in this post may be to our affiliates; sales through affiliate links may benefit this site.
(To see all the comments, click on the "Comments" link below.)
Featured Comments from:
Scott Abbey: "Ha! Portrait mode appears to be no better at masking than I am!"
Tam: "I have seen this with those favorite iPhone portraiture subjects, cats. Shoot the photo in Portrait Mode at the right angle and with the right background, and the background gets computationally blurred...except a halo between kitty's ears."
hugh crawford: "This is funny.
"I'm surprised that this works so well. Or at all even. First of all it's supposed to somehow make a check that there is a person in the photo, then the depth mask on the X, XS, and XSmax , is one quarter the resolution of the photo i.e. 1/16 the pixels, and the mask is half resolution or 1/4 the pixels. The XR has only one rear facing camera (when did all film cameras become 'rear facing'?"), so it does not do depth maps except with the 'front facing' camera—you know, the one that points behind you. The XR makes do with an algorithm that recognizes a person-shaped thing with face-like features, and then does some really impressive edge detection to make its mask.
"My guess is that the new Sony time-of-flight depth mapping camera will show up in future iPhones to make this work in a less kludgy way.
"The first three minutes or so of this video explains it.
"After swearing off writing code for the last few years (except for a little program to lay out 6,500 photographs in a book) I got sucked into playing around with the Apple camera API this last week. Right now I am playing with a little app that saves the intermediate masks etc. to use in Photoshop.
"I have also been playing around with the new iPhoto application in Mojave, and it lets you muck around with the portrait mode fake DOF on saved photos, and even turn portrait mode off. This leads me to believe that all the masks and meta-data are included with the saved images."
Dori: "It happens with the Pixel 3 too. That's the current state of fake software blur. It might get better but don't expect it to be the same as an optical one. Just know the limitations of the software and keep those original files saved."
Moose: "'...Very interesting.' But not in the least surprising, at least not to me. As someone who regularly creates different bokeh, subject background separation, etc. by blurring backgrounds, I'm deeply aware of how difficult masking subject from background can be. When I read about this capability, I automatically dismissed it as unlikely to be reliably useful. I do notice how it applies a gradient to the amount of blur, which is also something I do, to emulate the differential in blur with depth. It's not badly done, but not ideal, either. Doesn't appear to be simple Gaussian blur, as that would not result in the edginess of the bokeh in the most details areas, the farthest background, top center, for example. On the other hand, I know many people who enthusiastically shoot away with their smartphones and are very pleased with results that I would immediately dump. Am I denying myself simple pleasures? Well, only in favor of the deeper pleasures of an image well captured and turned into a pleasing printed or web image. \;~)> "
adam palmer: "Another phrase for that is bad Photoshop. (I know it never made it to Photoshop but it's the same problem.)"
Unremarkable, really: portrait mode is for portraits of humans; it doesn’t even work for most dogs, I presume because their faces are too long (horizontally, I mean). If it was ‘background blur mode’ it would have been called that, or probably ‘iBokay,’ knowing Apple ;-).
Posted by: Richard Parkin | Monday, 11 February 2019 at 09:27 AM
I can attest to your uncanny ability to see details in pictures that other people cannot. Once you were so kind as to comment on one of my photos, remarking on a small but very important detail...one that I had never noticed myself. I felt awed (really) and humbled (really, too).
Posted by: Rodolfo Canet | Monday, 11 February 2019 at 09:46 AM
It would be an interesting sci-fi plot if the AI photo/video algorithms had some universal bug that distorted reality in a specific but obscure way. Eventually no one would notice it anymore, and people would begin to doubt reality itself precisely because it conflicted with a ubiquitous electronic virtual reality that had become the de facto standard. Nobody would believe what was in front of their eyes anymore.
Posted by: Robert Roaldi | Monday, 11 February 2019 at 09:46 AM
I have never seen anything like that with my Pixel. AI can handle this stuff, if done right.
Posted by: PacNW | Monday, 11 February 2019 at 09:53 AM
Seems his neural engines need a tune up. Maybe his light injectors are clogged?
Posted by: Mike P | Monday, 11 February 2019 at 10:08 AM
The iPhone's portrait mode guesses at what is background and what is not by using an approximate depth map, generated from stereo data from the two cameras (if you have two cameras) or by other means I don't understand (if you don't).
Obviously it's not always right. It's probably more right than wrong on things like human faces, which is where the feature is presumably most targeted. Early versions of the mode had particular trouble with the boundary between glass and air, which made for fun mishaps.
My favorite iPhone camera glitches are
1. In the very early iPhone cameras you could pretty easily get a rolling-shutter-like effect, caused by the fact that the sensor did not read out the data all at once, but rather scan line by scan line... this was fun to play with.
2. In some of the newer cameras if you got unlucky you could get sharp but wavy images if at the time of exposure you happened to jiggle the camera just the right way. Apparently the sensor can move w.r.t. the lens assembly, probably as part of the image stabilization system...
I have one really excellent example of this one:
https://www.flickr.com/photos/79904144@N00/34568226370/in/photostream
Posted by: psu | Monday, 11 February 2019 at 11:09 AM
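[Ed. note: psu's glitch No. 1 is easy to simulate after the fact: build each row of the output from a different frame of a short clip, the way a slow top-to-bottom readout would. A toy sketch only; the frames.npy input and its (frames, height, width, channels) layout are assumptions, not anything psu used. --Ed.]

    import numpy as np
    from PIL import Image

    # Time-ordered frames of a short clip, shape (T, H, W, 3), dtype uint8.
    frames = np.load("frames.npy")
    T, H, W, _ = frames.shape

    # The "shutter" sweeps from top to bottom: row i of the output comes from
    # whichever frame was current when the readout reached that row, so a
    # moving subject comes out skewed or wavy.
    out = np.empty((H, W, 3), dtype=frames.dtype)
    for i in range(H):
        t = round(i / (H - 1) * (T - 1))
        out[i] = frames[t, i]

    Image.fromarray(out).save("rolling_shutter.png")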
My Photographer's Prayer: 'Lord, please let my humble attempts at picture making start with a straight rendering of an image projected by a lens on a light-sensitive surface - and nothing else - before I start working on it. Thank you, Amen.'
Posted by: Hans Muus | Monday, 11 February 2019 at 11:15 AM
Having spent the better part of the past year effectively following your OCOLOY idea, with the OC being my iPhone, I have noticed a number of ComAnoms, including the one mentioned in this TOP entry. I have also noticed an inverse ComAnom when using the Portrait setting/lens to picture flora: if an arrangement of flowers has an isolated stem emerging from the main body of flowers and set against a "busy" background, that stem will more often than not be out of focus along with the background, leaving the blossom at the top of the stem in focus and "floating" above the main body of flowers. Apparently, the AI brain needs to be taught to think a little harder in order to get this situation right. That written, when the Portrait setting/lens is used for the purpose it is most likely intended, picturing people or cute cats and dogs, it works very well. Especially so with the post-picturing variable-DOF slider feature.
Posted by: Mark Hobson | Monday, 11 February 2019 at 11:35 AM
Look closely at my cat's whiskers in this photo for a similar algorithmic boner:
https://flic.kr/p/Rbb5aQ
[Good one. No optical explanation for that. --Mike]
Posted by: Ben Rosengart | Monday, 11 February 2019 at 12:54 PM
The most interesting thing I found out about this post was Ned Bunnell's IG page, which has lots of wonderful photographs. I just followed Ned.
As for "the good old days" of processing film & printing with chemicals on paper....well, those days, thankfully, are gone. I don't know of any working pros that had to process film and print at a "production level" for commercial, portrait, wedding or editorial photography that miss the "good old days".
Good riddance.
Posted by: Stephen Scharf | Monday, 11 February 2019 at 01:26 PM
Solution: use a proper camera and stop playing with these phone-cameras. Leave that to our teenagers. Maybe it will tempt them to use real cameras with real lenses. Eventually.
Posted by: Andrew John | Monday, 11 February 2019 at 01:45 PM
I think the problem is not with the camera but with the plant. It is well known that Apple's portrait mode only works on plants that have eyes. So it works perfectly with potatoes but not worth a damn with tomatoes.
Posted by: Jim Richardson | Monday, 11 February 2019 at 02:03 PM
Is it software, or is it real?
Thanks, Ned & Mike, for an interesting post.
Fixing the perspective errors of camera position in Photoshop is not the same as using tilt, shift, rise, and fall on a view camera, but it can come close, is far easier, and can be done after the fact.
The net effect is that we are seeing fewer direct, lens-drawn images in that genre.
Similarly, some lenses are less corrected for certain aberrations because the manufacturer's software 'corrects' them later. If you use such a lens on a different camera, the results may vary.
It's a new world: what we see is not always what we get.
I was one of those who disagreed with Mike when he suggested a differentiating name for pictures made in the digital domain.
I still don't like the idea, because photography can still be done in a way that is consistent with previous norms, but I guess the point is that we have crossed a Rubicon: it is now impossible to know.
It is nearly impossible even for the photographer to know, because, increasingly, our pictures are pre-processed in various ways and degrees before we ever see them. So something IS different.
The new technology came bearing lots of wonderful gifts, and most of us have been thrilled to accept them. They help us do lots of things more easily and better than we ever could. But we need to be mindful that the gifts weren't free. We gave up a few things in return.
As I write this there is an 11x14 Deardorff with a Goerz Artar standing quietly unused right next to me, posing the question: "Have you deserted me because digital is better, or because you are lazy?"
My honest answer is that digital IS better at many things, but not all, and yes, I've gotten a bit lazy.
Posted by: Michael Perini | Monday, 11 February 2019 at 02:35 PM
I'll stick to bokeh generated by the laws of physics, not binary "logic".
Posted by: Robert Pillow | Monday, 11 February 2019 at 03:28 PM
As one who uses Photoshop and object selections a lot, I anticipated this glitch in smartphone camera software as soon as I saw they were trying to fake bokeh. All but the most trivial Ps selections require finessing, especially around features like those in Ned's pics.
I predict this will never be fixed and will just become another anomaly we live with like the weird kaboing noises that plague cellphone conversations or the bizarre artifacts we're used to seeing in streaming videos.
Posted by: Bruce Walker | Monday, 11 February 2019 at 03:47 PM
In somewhat the same vein of likeable flaws (though this one falls into the category of user error/poor technique), one of my old Holga shots picked up an interesting piece of dust during the scanning process. I had intended to title the photo "Bike Love" anyway, and this just seemed to fit so well that I left it in. Look just left of the leftmost bike's front tire.
Bike Love
Posted by: Adam Lanigan | Monday, 11 February 2019 at 04:29 PM
I'll also add to my previous comment... The Holga is a perfect example of learning to love the anomalies, mixed with likely an equivalent dose of cursing the anomalies when they upend an otherwise good picture.
Posted by: Adam Lanigan | Monday, 11 February 2019 at 04:40 PM
I’ve a new iPhone XR, which is single-lens, so it’s all about the software. Portrait mode, with both the front and back cameras, works quite well mostly (I use the back camera because I still can’t help wanting to maximise quality, regardless of the snapshotty motives). But it has its moments. This selfie didn’t like my headwear much, apparently, though its choice of blurring is not without aesthetic value:
https://www.instagram.com/p/BtK6P4hly8A/?utm_source=ig_share_sheet&igshid=gmkbzlc354mh
Posted by: Marc | Monday, 11 February 2019 at 05:04 PM
Whoa!
Having acquired an XS recently after my 6s started showing signs of mortality, I am going to have to revisit my photos. I haven’t really used the XS camera seriously even though I chose it over the XR for the camera; I tend to favour the X100F or film cameras.
Posted by: Earl Dunbar | Monday, 11 February 2019 at 05:59 PM
I think there's a difference between the kinds of anomalies. In a photochemical print, except in some explicit cases ('solarization'/'Sabatier effect,' say), the photograph is usually some approximation to 'the truth' with some artifacts (I am gleefully ignoring colour changes here). In the case of the iPhone image the whole photograph is, clearly, an artifact, with occasional glimpses of what the lens actually saw.
Neither is better, I suppose, but I know which I prefer.
Posted by: Tim Bradshaw | Monday, 11 February 2019 at 06:13 PM
Interesting. This is a pretty ugly anomaly.
But I am rather fascinated with the anomalies from using the iPhone for panoramas.
Like this one :)
https://static.wixstatic.com/media/49fc4b_27274ab0ae9b406fb09f4199ada70eac~mv2.jpg/v1/fill/w_1920,h_811,al_c,q_90/49fc4b_27274ab0ae9b406fb09f4199ada70eac~mv2.webp
Posted by: Bruce Alan Greene | Monday, 11 February 2019 at 06:24 PM
If you can’t see it, it doesn’t exist 8-)
BTW, a week ago my blood pressure was 49/40. Today I walked 200 yards, rested for 5 minutes, then returned. I don’t allow kvetching to interfere with my health.
Posted by: c.d.embrey | Monday, 11 February 2019 at 08:03 PM
I recently had an experience of this kind when I tried to turn a very ordinary picture of black-eyed Susans into something more interesting. I pushed the shadows to black to obliterate some distracting details and injected a bit of false colour. When I printed the result, however, the black areas that I saw on screen emerged as a gorgeous dark blue, the flower petals were a vivid red, and some ghostly bits of green leaf floated faintly in the background. This was infinitely more interesting than the image I had sent to the printer, but it was clearly the result of a printer malfunction. I traced the problem to a recently replaced black ink cartridge: I had failed to remove the tape covering the air hole, thus preventing the ink from flowing. Great picture, but with zero creative input from me, just serendipitous incompetence.
Posted by: David Francis | Monday, 11 February 2019 at 09:13 PM
I heard about this issue late last year. I was doing a comparison of the newest iPhone and the Google Pixel 3 and came across a review of just this issue. I've spent the last few hours going through every link I have saved (I have a ton of them) and cannot find it.
From what I do remember of the article, this is not just an iPhone problem. Most of the camera upgrades are more software than hardware, and this bug is showing up more often, always in portrait mode.
I thought it was a Digital Photography Review site link but can't find it. But it did describe the exact problem with the portrait mode shown in Ned's Instagram post. As soon as I looked at his photo I spotted the issue.
If I can find the link I'll post it or send it to you.
Posted by: Peggy C. | Tuesday, 12 February 2019 at 03:03 AM
I’ve seen that flaw in my photos. Portrait mode is pretty bad at defining the edges of glassware also.
One commenter on that thread mentions an app called Slør that lets you edit portrait-mode photos in a more robust way, changing the focus point, aperture characteristics, etc. I use a similar app called Focos, which has bokeh models for historic lenses that are fun to try. And I swear, if I go to the trouble of opening that app to take my portrait-mode photo, it works a little better than the built-in app. But like you, Mike, I don’t often bother, since I can access my iPhone camera app via the lock screen.
Posted by: emptyspaces | Tuesday, 12 February 2019 at 12:24 PM
It’s early days for computational photography/digital image making (I like the definition of a photo from ‘An Hour with Sarah Meister,’ 1st Dec ’18). It took Google many years of tweaks to refine its algorithms, including buying out those produced by technology students.
AI could do it, but it needs to learn what humans prefer in their images, which also takes time.
And we get to see the quirks of all this learning along the way.
Posted by: Not THAT Ross Cameron | Tuesday, 12 February 2019 at 05:31 PM
Finally found it!! Need to better organize my bookmarks.
This is the review I mentioned. It's from December 2017, further back than I first thought.
Beginning with photo 25 and several thereafter, it mentions the "depth map artifacts."
https://www.dpreview.com/articles/9861366973/portrait-mode-shootout-iphone-8-plus-vs-google-pixel-2
I found it an interesting review.
Posted by: Peggy C. | Tuesday, 12 February 2019 at 07:32 PM
I wonder if people will look back at optical blur, zoom in on a blurred section of the subject, and say, "Wow, optical blur had a lot of limitations; most people could not match the depth of their subject to the depth of field of their lens... lucky the computers took over."
Posted by: alsyourpal | Wednesday, 13 February 2019 at 02:46 PM
Re David Francis's comment - "serendipitous incompetence" - I nearly snorted my tea over that one. Love it!
Posted by: Not THAT Ross Cameron | Wednesday, 13 February 2019 at 05:12 PM
The hot-rodders used to say, "Ain't no substitute for cubic inches." Perhaps the new version of that is, "Ain't no substitute for real glass."
Posted by: Steve Renwick | Thursday, 14 February 2019 at 08:05 PM
I see the latest Apple ad has made "bokeh" a verb: https://www.youtube.com/watch?v=IKok5dykRBM
Posted by: steven ralser | Friday, 15 February 2019 at 05:48 PM