
Wednesday, 14 November 2018



Seems to me we are in need of a new kind of verifiable secure image format for journalism, a file that cannot be manipulated without it being obvious. Maybe a kind of "bitpeg" like bitcoin, but easier to generate.
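The tamper-evidence idea can be sketched with ordinary cryptographic authentication — a minimal illustration of the principle, not the "bitpeg" scheme the comment imagines; the per-camera key and the raw bytes here are hypothetical.

```python
import hashlib
import hmac

def sign_image(image_bytes: bytes, secret_key: bytes) -> str:
    """Produce a tag that changes if even one byte of the image changes."""
    return hmac.new(secret_key, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, secret_key: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_image(image_bytes, secret_key), tag)

key = b"camera-embedded-secret"          # hypothetical per-camera key
original = b"\x89PNG...raw image bytes"  # stand-in for real image data
tag = sign_image(original, key)

assert verify_image(original, key, tag)              # untouched image verifies
assert not verify_image(original + b"x", key, tag)   # any edit breaks the tag
```

A real journalism-grade scheme would need public-key signatures (so anyone can verify without the secret) and signing inside the camera, but the verify-or-reject logic is the same.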

If the past is any guide we will first see this in porn, and then used for political ads and also for that poetry of the internet, memes.

Still, I think we'll work it out. History is written by the victors, but it is rewritten by survivors and the newly empowered, much to the annoyance of the previously victorious. No matter what you use to record the battle, you have to stay in the fight.

I sort of agree that digital is different from film and deserves a different name. How about Ephoto, for electronic photos? It still records light, but the effect of the light isn't 'written' on the sensor; the signal has to be electronically read off the sensor and saved to separate media.

The Guardian also said: You thought fake news was bad? Deep fakes are where truth goes to die https://www.theguardian.com/technology/2018/nov/12/deep-fakes-fake-news-truth

In May, a video appeared on the internet of Donald Trump offering advice to the people of Belgium on the issue of climate change. “As you know, I had the balls to withdraw from the Paris climate agreement,” he said, looking directly into the camera, “and so should you.”

My favorite philosopher George Carlin said: “Think of how stupid the average person is, and realize half of them are stupider than that.” What we need to worry about in the age of deep-fakes is how many mouthbreathers will be faked out by deep-fakes.

Thanks for the pointer - haven't got there yet - but I wonder if the QUIRK in question is due to search engine optimisation for online articles, and eyecatching titles for printed pages?

( I've been pondering a NYT subscription for a while, but I have enough stuff that I don't do justice to already )

The new medium?


Those who practice it are Pixelographers.

Just to help a little bit: This article is in the current issue dated November 12, 2018.
(the link didn't work for me)

How's this for cranky old lady:

It's only photography, and a photograph is a PRINT. (Thank you, David Remnick.)

The source doesn't matter: film, digital file, scanned film (which is a digital photograph, after all); all that matters is that it's an object.

All over California these last weeks, the one thing that so many families took with them in their desperate evacuations from a hellish fire was their collection of photographs, all of which are physical objects. Photos are those wonderful THINGS that you hold and look at and interact with. Photos are COOL!

Did I do OK with my level of cranky? ;^)

"Afterimage"... An interesting, rather disconcerting article which brings forth the huge grayscale history of photographic "truth" vs. fiction. I entered the reading with a lifetime's worth of creative photo-retouching experience and left with a 15-minute bus ride to the future.

Mike - dead link!

Good article you linked to. In my career as a software developer, some things were possible and others were Hard. With a capital H. As in, can't be done. Removing a person from a photograph to show what was behind was such a problem. Computers can't do such things. Increasingly we're seeing that they can. Or rather, people can. The people who do the research and write the software.
Nowadays, in my part-time role as a museum demonstrator of historic computers I'm often asked by visitors what computers will do in the future. I always tell them - whatever you can imagine, they'll probably do.

Fake smartphone photos-

Back in the 2002/2003 time-frame, I had transitioned to digital because it was so obvious how much more control it gave photographers, both with respect to being able to change ISO from frame to frame and, more importantly, control over parts of the tonal range rather than just the global tonal range (not including burning, dodging, and masking techniques here, which were highly artisanal and highly variable, and therefore irreproducible, statistically).

To say nothing of the fact that, for photojournalism, digital had become a must-have requirement for delivering photographs on press deadlines.

I was taking some photography classes and participating on the Olympus OM mailing list at that time, and there was an incredible amount of snobbery about film vs. digital back in those days. The Olympus shooters were saying that digital was fine for snapshots and tourists, but serious photographers shot film and would always shoot film because "digital will never match film" for image quality. I heard comments from my photo class members that "film is sacred" and was introduced at photography exhibits as, "This is my friend, Stephen. Shoots digital," with an all-too-obvious tone of disparagement. The tone and tenor of the discussions reminded me of the "art is not photography" debates of the early 1900s, which led to the "pictorialist" movement to provide it with credibility.

Well, we all know how that turned out. Kodak and Hasselblad collapsed from corporate complacency and being caught out by the disruptive innovation of digital, as described by HBS professor Clayton Christensen in his seminal book on innovation, The Innovator's Dilemma. The dearly departed Michael Reichmann wrote articulately about this on LuLa over a decade ago.

It all comes down to this, and I used to teach this key principle to my Design for Six Sigma students. People get hung up all the time on current functional embodiments as being the "thing" under discussion, e.g. "that the word photography described what we now clumsily know as analog or optical/chemical photography".

This is quite simply not true. Functionality is functionality, y=f(x).

Functionality is technology and embodiment independent.

There are almost always different ways to produce the desired functional response; oftentimes, many different ways. This is why a vacuum tube and a transistor, which are very different methodological embodiments, provide the exact same functional response: gain in an electrical circuit. Understanding this was also key to the development of Genrich Altshuller's "Teoriya Resheniya Izobretatelskikh Zadach" (Theory of Inventive Problem Solving), aka TRIZ.

You provide an input, modulated by control factors, into a "system," and you get out of that system a functional response you care about.
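The embodiment-independence point can be sketched as a toy — two deliberately different "systems" (both hypothetical stand-ins, with the physics abstracted away) delivering the identical transfer function, gain:

```python
def tube_amplifier(v_in: float, gain: float = 10.0) -> float:
    # One embodiment: imagine a triode stage (internals abstracted away).
    return gain * v_in

def transistor_amplifier(v_in: float, gain: float = 10.0) -> float:
    # A completely different embodiment delivering the same y = f(x).
    return gain * v_in

# Identical functional response from different technologies:
for v_in in (0.1, 0.5, 1.0):
    assert tube_amplifier(v_in) == transistor_amplifier(v_in)
```

The point is that y = f(x) is defined by the response, not by what is inside the box producing it.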

When you look at "photography" from a systems-engineering functional-decomposition perspective, there is no difference between a heliograph, a daguerreotype, a platinum print, a silver-halide paper print, film, a metal plate, or a CMOS sensor.

At the end of the day, these different approaches to photography are all variants of ways to deliver a transfer function producing a functional response.

In this case, capturing light on a sensing medium to render an image.

Well photography’s truthiness has been questionable virtually since its invention. The chemical era was no more honest than the digital era, just a bit harder to manipulate convincingly.

But, of course, this New Yorker article goes far beyond photography and simple digital editing, into digitally synthesized imagery made possible by today's stunningly powerful computing and graphic display technologies. (BTW, many, perhaps most, of the car ads you see are generated from a mix of live footage and 3D model graphics. Car photography is nearly kaput.)

Side rant: I've subscribed to The New Yorker for some years, but I've become increasingly irritated by their general tendency to write for the sake of writing. On and on and on....often pointlessly. Anyone else feel this way?

The link to https://www.newyorker.com/magazine/2018/11/12/in-the-age-of-ai-is-seeing-still-believing is broken.

I think that the 50-or-so-year period when people assumed that photographs were true, that "the camera doesn't lie," is going to be seen as a strange anomaly born of industrial photofinishing for the consumer market, where people knew that the photographic image was immutable.

On the other hand, lest anyone think for a moment that any of this is new, this is what press photography looked like in the 40s and 50s.

Obvious to us, but apparently not then. (Yes, I photographed a print in color that was intended to be photographed by an orthochromatic stat camera for a plate, so it is exaggerated, but that's the point.)
The question, though, is: are these photos less or more true before or after the manual editing?

Dear Teach.......
Very good article, and much appreciated.
Calling interesting things to our attention is one of my favorite aspects of TOP. That includes the fact that the comments often add significantly to the post.
It is only schoolmarm-like if you tell people what they ought to think.
You go out of your way not to do that.

Re Digital Photography being different from the Photography we grew up with: I remember the hubbub well.
Digital Photography is "Photography Extended," and more than that it is "Photography Continually Extended," often in ways that none of us contemplated at the beginning of its digital makeover.
I don't think you are getting a new word for it, though; there are too many issues. One of which is that for some people there is NOT a lot of change; they practice it in much the same way as they always did. Yet for others, you are completely right: it bears little resemblance to what we used to call Photography.

It may be that we will have to wait, and like Pluto be 'defined' differently in retrospect.

What is interesting to me is that, whatever we call it, Photography has become multi-polar, with what we would call 'Traditional Photography' being only one pole. People who have grown up with Facebook et al. see it differently: as a substitute for, or extension of, language, with the same speed of communication.
Other forms, like stochastic photography, or machine vision, or A.I., take 'Photography' off in directions we never dreamed of.

It also plays out on the manufacturing side, with, for example, Apple defining Photography differently than "Traditional Camera companies" do.
As more people define photography in Apple-like terms, fewer of them are likely to buy cameras from manufacturers with narrower definitions. It's already happening: Apple sells more cameras every year, and camera companies sell fewer. Everyone keeps predicting an equilibrium, or a halt in the decline of camera sales, but it hasn't happened, even at a time of increased consumer spending.

You are probably right about needing a new word, but you can't do that without making lots of people feel like Pluto.

clumsily know as analog: You opened a can of worms but since you included 'clumsily' no harm no foul.

For me, Analog is not a term that should be applied to FILM photography. My reason for this is simple: both film and digital cameras have analog collectors of photons. The digital camera's electronic sensor is just a collector of photons. Only when you read that signal off the sensor through analog-to-digital conversion do we have digital information.

There are so many common parts to film and digital cameras that only with the processor in a digital camera can we differentiate the two technologies.
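The analog-to-digital step described above can be sketched as simple quantization — a toy illustration assuming a linear 12-bit ADC (the function name and parameters are hypothetical):

```python
def quantize(analog_value: float, full_scale: float = 1.0, bits: int = 12) -> int:
    """Map a continuous sensor reading in [0, full_scale] to an n-bit code."""
    levels = 2 ** bits
    code = int(analog_value / full_scale * (levels - 1))
    return max(0, min(levels - 1, code))  # clamp to the representable range

# Only after this step is the sensor's analog charge 'digital information':
assert quantize(0.0) == 0
assert quantize(1.0) == 4095   # 12-bit full scale
assert quantize(0.5) == 2047
```

Everything before this function — photons striking the photosite, charge accumulating — is analog in both film and digital cameras, which is the comment's point.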

Photography for AI is simply data for ingestion. The methodologies and heuristics that are programmed into the machines are intended to produce certain goals or outcomes, e.g., deciphering license plate numbers. I think this has little to do with the human use of photography. Regarding authentication of a digital image, if (yes, if) digital generation is indistinguishable from capture from nature, then the problem would be the need for some kind of identification system encoded and encrypted into its digital stream. In this way, I think the distinction between DI and photography has some merit. But that's just a technical issue. All of this misses the forest for the trees, which is about perception and meaning.

Regarding objectivity, what exactly does the author of the New Yorker piece believe this is? Does it even exist? (See Carlo Rovelli and his ideas about perception and time, the key ingredient of a photograph ignored in the New Yorker article.) At best, objectivity is a matter of degree; selection, framing, etc., do not necessarily make a photograph more subjective than objective. To make such a claim is categorically absurd and a good example of faulty first-order thinking.

The irony is that, of course, we live in an age of image saturation and, I would wager, view images as authentic artifacts in a much more naive way than people did 50 years ago. Why? Because so much of our reality is mediated by screens - which we have to accept as authentic in order to make them do what we want. As a result we are more susceptible to view digital images uncritically. Visit any museum and observe people take cell phone shots of art instead of actually perceiving it with their eyes. The object doesn't become "real" until they've "captured" it with their phone. Black cat, meet kettle.

Photography, as practice, is a process of learning to see (which means being aware of one's blind spots so as to see more clearly). Anyone who has taken photos seriously for any length of time becomes aware how their old images ignored areas of the frame that had a greater impact on the overall image than they'd realized. They were blinded by what they wanted to see. This is an analogue for most things in life. AI is about as useful in this endeavor as a plastic spoon is when you need to dig a deep ditch.

I can't get the link to work on the little iPad, but no matter: you have touched upon something very close to my own perspective about digital capture and image production.

I feel it to be so distant from analogue as to be something quite else that also just happens to result in an image.

My initial love for photography was all about the magic of the latent image, and transforming that from just a camera click into something that was solidly there in the form of the negative. (That security has been lost.) The next step was even more breathtaking as that picture slowly became visible beneath the fluid in the dish. It was something that never left me, however familiar the process inevitably became. There was always that sense of achievement, of having developed some special talent/style to and within one's work that other people couldn't ape. With digital, that's pretty much gone because today, all it takes is the patience to sit before the computer for as long as it takes to tweak a little here, tweak a little there, and if it doesn't work, just reverse your steps and have another go, costing nothing but time. If ever a monkey sitting at a typewriter could manage to produce Macbeth, that creature could learn to make digital photographs in ten percent of the time.

Basically, and this isn't the first telling, were my initial photographic experience to have been with a digital camera, I seriously doubt that I would have been inspired to continue learning. I found the adaptation from the one to the other to be rather laborious, and it took quite a lot of friendly Internet help before I picked up enough know-how to do what I want to do, which is simply what I wanted to do with analogue. I found film and wet darkrooms easy to understand, and today, were it not for a lot of darkroom experience behind me, it would be very difficult to realise just how good a black-and-white photograph can look if you know what you are doing. I think this stems from the fact that with so much variation possible, with so many electronic tools to play with, the result is often confusion and the inability to know where to stop.

Less often really is more.

Similarly, this article popped up in the Washington Post.

"Photography has never been just about capturing reality, but the latest phones are increasingly taking photos into uncharted territory."


"any comparison of film vs. digital quickly devolved into a status dispute, and people on Team Digital were immediately and automatically prickly about imagined slights to their standing. They wanted the main word applied to their chosen tech." Hah, so funny! On Dpreview, the "photographers" (if that is what you call them when you are being generous) are still battling that war. In fact, many of them treat film-based imaging as if it is not real photography.

Thanks for the link to 'The New Yorker' article, a most enlightening read.

The digital-is-not-photography argument raised heat because nobody likes what they do to be defined away by those who seem to not like it. That argument resembles strongly the photography-is-not-art wars of the deep and not so deep past.

As soon as the Photoshop operator starts drawing instead of projecting, then the photograph starts to become less like a photograph. But this is, of course, nothing new. What is the dominant image-making mechanism? Projecting the light from a scene onto a sensitized surface? Photography. We don't even have to argue about the meaning of words like indexical.

I'm with you on using italics for book titles. The font issue, too. But typesetting is not typesetting if hot lead isn't involved. We should call it digital lettering. Er...

"At the very least, there's a short list of people I think everyone interested in our subject ought to know"

Ooo... can we see the whole list? As a post? With some short description? Pretty please?

I would go with digital imaging if I never printed but I use chemicals to print now so I'm sticking with photograph. :-)


"I am but an egg," to quote another (fictional) Mike. Always open to learning, even from someone a generation younger 'n me. I appreciate your recommendations (not assignments).

And I'm with you in annoyance at the New Yorker's non-matching titles.

Fascinating article for our visual age. Will be interesting to see how society evolves to cope with this as the tools for video manipulation become increasingly approachable to the masses.

Full disclosure: I stated on this blog some time ago that "there is nothing new to photograph." I stand by that statement. (Not an original thought: the recently deceased Hank Wessel made that statement when I attended San Francisco Art Institute in the mid 1970s. He was trying to get his students' minds out of their eyes, and I believed that claim then, just as I do 40 years later.)

This article in The New Yorker is clarifying. Towards the end, we read: "But, actually, from the very beginning photography was never objective," Efros continued. "... we've been fooling ourselves...it will turn out that there was this weird time when people just assumed that photography... was true."

From the Yale Alumni Magazine: Question: Do you think it is possible for the camera to lie? Answer (by Walker Evans): "It certainly is. It almost always does." (Walker Evans at Work; Harper & Row; 1982)

About a decade ago, I obtained my MA degree in Studio Art at California State University, Sacramento. One of my advisors was the photographer Roger Vail. The other professors in the graduate program were painters and sculptors, fine people. No photographers (one New Media professor, though). "The first thing you will have to do," said Vail, "is to 'unlearn' (the other professors) from everything they think they know about photography." Roger was oh so right.

No medium in the history of human creativity has come so preloaded with expectations and demands as the still photograph. From the outset, due to its literal description of surface, it was imbued with the faith and belief that it would somehow reveal "The Truth." What a fallacy. It does anything but, for which I am eternally grateful.

From the New Yorker article: "Before Photoshop, did everyone believe that images were real?" Zhous asked, in a wondering tone.

"Yes," Ginosar said. "That's how totalitarian regimes and propaganda worked."

I'd say, myself, that if there is in fact a paradigm shift from believing photographs show "The Truth" to one of realizing they do not, and never have, that is a good outcome. One we, as photographers, should hail and celebrate.


According to the OED, the word "photography," from the Greek for "light" (photo) and "writing" or "delineation" (graphy), was coined by John Herschel in 1839 in a paper to the Royal Society. In that sense, I see no difference in applying it to images made from light passing through a lens onto a digital sensor; that is, my linguistic OCD antennae do not tingle. But I can see why applying the word to digital images might be annoying to others (looking at you, Mike), because the OED definition is "The process or art of producing pictures by means of the chemical action on a sensitive film on a basis of paper, glass, metal, etc.; the business of producing and printing such pictures."

The different titles are there to cater to search-engine optimization. A short print title won't work for Google, so you need a long descriptive title. A real pity, because great story titles are a joy to behold.

Aside from the naming semantics:

Sez you: "D.I. is closer to figurative painting than it is to photography."

No. The everyday amateur digital Canikon gets closer to photography than the everyday film Canikon did.

Yes, the digital result can more easily be manipulated into something further from photography, but the average film result from 30 years ago was closer to figurative painting.

(A carefully crafted 8x10" contact print might be a counter-example.)

Re back-formations:

We now have "outdoor cycling", as opposed to the more common pedaling on a stationary machine with a screen in your face to simulate cycling.

And I'm sure you've heard "reverse selfie", which involves aiming a camera AWAY from your face.

What Ken Tanaka said. On both points.

And regarding his side note: I could not agree with you more, Ken.

The New Yorker, much like Apple, seems to have become incredibly self-absorbed and entirely inward-looking, focusing only on how "cool" it is.

Having been interviewed by a number of journalists about my contributions to the invention of PCR (which won the person who conceived it, my colleague Kary Mullis, the Nobel Prize in Chemistry in 1993), I learned very quickly that, rather than being strictly objective gatherers and reporters of facts, many journalists start out with an agenda before they have talked to anyone, and then do only the interviews and research that back up the agenda they decided on at the very beginning.

"Really, D.I. is closer to figurative painting than it is to photography. Sez me."

SERIOUSLY!? They are not anywhere near the same thing. The amount of manual dexterity required alone puts them in totally different universes. Sez me.

Also, "digital imaging" is only slightly better than "digital capture". Both phrases are cold and heartless. "Photography" is timeless.

Have a great Thanksgiving.



