Sometimes I feel like recommending articles is a bit schoolteacherish, like I'm passing out assignments. That is, of course, not my role, as I'm talking to friends here—my equals and betters.
But really, sometimes articles come along that ya just have to read. One such is "In the Age of A.I., Is Seeing Still Believing?" by Joshua Rothman, in The New Yorker. Pretty much a must-read for photography polymaths like you. (It annoys me that The New Yorker has evidently adopted another QUIRK—the "shouting" capital letters are intentional—to go along with its longstanding brain-dead practice of putting book titles in quotation marks instead of italicizing them like everyone else in the literate world. Editor Remnick should have left the quaint old original typeface alone and changed that instead. But at least that one is excused by being a "tradition," going all the way back to Ross or wherever it goes back to. The publication's newest QUIRK, with no tradition whatsoever, is to assign different titles to the same article depending on whether it appears online or in the paper magazine—which is confusing and irritating. Anyway, in the printed magazine the same article is titled "Afterimage," in the Nov. 12th, 2018 issue.*)
Years ago I tried to assert that digital imaging should not be called "photography," that the word photography described what we now clumsily know as analog or optical/chemical photography (I usually dislike back-formations), and that the new medium was sufficiently different that we should know it by a different name. I thought "digital imaging" or D.I. served just fine, since that had currency at the time.
I've never changed that opinion, but I learned to back off on it, because people didn't like it—in the early days of digital, any comparison of film vs. digital quickly devolved into a status dispute, and people on Team Digital were immediately and automatically prickly about imagined slights to their standing. They wanted the main word applied to their chosen tech. So "digital photography" it was. As Mad magazine used to say, Yecch.
This article—whatever you want to call it, "Afterimage" or "In the Age of A.I., Is Seeing Still Believing?"—really gets at what I was talking about back then. We're on different territory now. It's just that it looked like the same territory at first. It isn't. The changes are far more profound than just the means by which we fix a lens image to make a still picture. Really, D.I. is closer to figurative painting than it is to photography. Sez me.
At the very least, there's a short list of people I think everyone interested in our subject ought to know, and Hany Farid is on the list. If you don't know the name yet, the article will be an overdue introduction.
...Reports due Tuesday. Just kidding!
Mike
*I know, I'm getting to be a grumpy old man. (My friend Dan's joke—he's 65: "Every morning I wake up grumpy. I should just let her sleep.")
Original contents copyright 2018 by Michael C. Johnston and/or the bylined author. All Rights Reserved. Links in this post may be to our affiliates; sales through affiliate links may benefit this site.
B&H Photo • Amazon US • Amazon UK
Amazon Germany • Amazon Canada • Adorama
(To see all the comments, click on the "Comments" link below.)
Featured Comments from:
Rob de Loe: "Superb find, Mike. Thank you for sharing. The implications for society are stark and clear in this piece (and they're not good—we have a lot of things to figure out).
"What the article doesn't get to is the implications for 'photography' as an art form. On that front, I'm actually not too worried. Would you expect a painting to be a faithful representation of the real world as seen by the painter? Of course not. That's where photography as an art form is going (if it's not there already). Once people are comfortable with the idea that you can't believe anything you see in a still photo, we can all focus on whether or not the picture gets us to think, engages our emotions, makes us feel something, etc. (which is kind of the point for me). Of course, where the wheels will come off the bus is in situations where the photographer as artist claims that the picture is a faithful representation of the world, but in reality it isn't. We're there already, as everyone hopefully knows based on recent famous 'gotcha' incidents. This will only get worse. Hopefully software will help us sort this out (but as the article points out, that's not guaranteed either)."
robert e: "I agree that different titles for web and print is both annoying and complicating, but I think I understand the reasoning, or at least the motivations to do so. While 'Afterimage' is a perfectly appropriate title in the context of an issue of The New Yorker magazine, it's not very effective in the context-stripping environment of web reading, where attention-deficient people like me peruse long lists of article titles, often from different sources, and often not bothering to click up even the descriptive blurb. On the other hand, 'In the Age of A.I., Is Seeing Still Believing?', while both attention-getting and descriptive, would clash with both the style and layout of the print mag, and not contribute anything to the way most people read it. (Might work as a subtitle.)
"Sure, many people probably do read the web mag as they would the print mag, but the idea, of course, is to try to leverage the openness and reach of the web to expand the steady, paying readership."
rodfrank: "I read an article a couple of years ago about Dr. Farid and his software that will detect an altered photograph. At that time he extended an offer to analyze a couple of photos. My class at the local university sent two photos. One was unaltered and the other had very little alteration that we thought was well hidden. About a week later we received an email telling us not only which photo was altered but exactly where the alteration was done. We were impressed, to say the least."
Seems to me we are in need of a new kind of verifiable, secure image format for journalism: a file that cannot be manipulated without the manipulation being obvious. Maybe a kind of "bitpeg" like bitcoin, but easier to generate.
If the past is any guide we will first see this in porn, and then used for political ads and also for that poetry of the internet, memes.
Still, I think we'll work it out. History is written by the victors, but it is rewritten by survivors and the newly empowered, much to the annoyance of the previously victorious. No matter what you use to record the battle, you have to stay in the fight.
Posted by: John Krumm | Wednesday, 14 November 2018 at 10:14 AM
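John Krumm's idea of a tamper-evident image file can be sketched in miniature: sign a hash of the image bytes, and any later edit breaks the signature. This is only an illustration under invented assumptions (the key and the data here are hypothetical, and a real journalistic system would use public-key signatures so anyone could verify without holding the secret), but it shows the basic mechanics:

```python
# Minimal tamper-evidence sketch: hash the image bytes and sign the
# digest with a keyed HMAC. Any pixel-level edit changes the digest,
# so verification fails.
import hashlib
import hmac

SECRET_KEY = b"news-agency-signing-key"  # hypothetical key, for illustration only

def sign_image(image_bytes: bytes) -> str:
    """Return an HMAC-SHA256 signature over the SHA-256 digest of the image."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return hmac.new(SECRET_KEY, digest.encode(), hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, signature: str) -> bool:
    """True only if the bytes are exactly what was originally signed."""
    return hmac.compare_digest(sign_image(image_bytes), signature)

original = b"\x89PNG...raw image data..."  # stand-in for real file contents
tag = sign_image(original)
assert verify_image(original, tag)                 # untouched image passes
assert not verify_image(original + b"edit", tag)   # any alteration fails
```

The hard part, as the linked article suggests, isn't the cryptography; it's getting the signing to happen at capture time, inside the camera, before anyone has a chance to edit.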
I sort of agree that digital is different from film and deserves a different name. How about Ephoto, for electronic photos? It is still recording light, but the effect of the light isn't 'written' on the sensor; the signal must be captured electronically from the sensor to separate media in order to save it.
Posted by: James Bullard | Wednesday, 14 November 2018 at 10:21 AM
The Guardian also said: You thought fake news was bad? Deep fakes are where truth goes to die https://www.theguardian.com/technology/2018/nov/12/deep-fakes-fake-news-truth
In May, a video appeared on the internet of Donald Trump offering advice to the people of Belgium on the issue of climate change. “As you know, I had the balls to withdraw from the Paris climate agreement,” he said, looking directly into the camera, “and so should you.”
My favorite philosopher George Carlin said: "Think of how stupid the average person is, and realize half of them are stupider than that." What we need to worry about in the age of deep-fakes: how many mouthbreathers will be faked out by deep-fakes?
Posted by: c.d.embrey | Wednesday, 14 November 2018 at 10:49 AM
Thanks for the pointer - haven't got there yet - but I wonder if the QUIRK in question is due to search-engine optimisation for online articles, and eye-catching titles for printed pages?
(I've been pondering a NYT subscription for a while, but I have enough stuff that I don't do justice to already.)
Posted by: Richard Tugwell | Wednesday, 14 November 2018 at 10:55 AM
The new medium?
Pixelography.
Those who practice it are Pixelographers.
Posted by: Daniel | Wednesday, 14 November 2018 at 10:56 AM
Just to help a little bit: This article is in the current issue dated November 12, 2018.
(The link didn't work for me.)
Posted by: Kurt Kramer | Wednesday, 14 November 2018 at 11:13 AM
How's this for cranky old lady:
It's only photography, and a photograph is a PRINT. (Thank you, David Remnick.)
The source doesn't matter—film, digital file, scanned film (which is a digital photograph, after all); all that matters is that it's an object.
All over California these last weeks, the one thing that so many families took with them in their desperate evacuations from a hellish fire was their collection of photographs, all of which are physical objects. Photos are those wonderful THINGS that you hold and look at and interact with. Photos are COOL!
Did I do OK with my level of cranky? ;^)
Posted by: Maggie Osterberg | Wednesday, 14 November 2018 at 11:51 AM
"Afterimage"... An interesting, relatively disconcerting article which brings forth the huge grayscale history of photographic "truth" vs. fiction. I entered the reading with a lifetime's worth of creative photo-retouching experience and left with a 15-minute bus ride to the future.
Posted by: Bob Gary | Wednesday, 14 November 2018 at 12:06 PM
Mike - dead link!
Posted by: simon | Wednesday, 14 November 2018 at 12:07 PM
Good article you linked to. In my career as a software developer, some things were possible and others were Hard. With a capital H. As in, can't be done. Removing a person from a photograph to show what was behind was such a problem. Computers can't do such things. Increasingly we're seeing that they can. Or rather, people can. The people who do the research and write the software.
Nowadays, in my part-time role as a museum demonstrator of historic computers I'm often asked by visitors what computers will do in the future. I always tell them - whatever you can imagine, they'll probably do.
Anthony
Posted by: Anthony Shaughnessy | Wednesday, 14 November 2018 at 12:34 PM
Fake smartphone photos:
https://www.washingtonpost.com/technology/2018/11/14/your-smartphone-photos-are-totally-fake-you-love-it/?utm_term=.aae0b06ff89b
Posted by: Herman | Wednesday, 14 November 2018 at 12:36 PM
Back in the 2002/2003 time-frame, I had transitioned to digital because it was so obvious how much more control it gave photographers, not only with respect to being able to change ISO from frame to frame but, more importantly, control over parts of the tonal range rather than just the global tonal range (not including burning, dodging, and masking techniques here, which were highly artisanal and highly variable, and therefore irreproducible, statistically).
To say nothing of the fact that, for photojournalism, digital had become a must-have requirement for delivering photographs for deadline press.
I was taking some photography classes and participating on the Olympus OM mailing list at that time, and there was an incredible amount of snobbery about film vs. digital back in those days. The Olympus shooters were saying that digital was fine for snapshots and tourists, but serious photographers shot film and would always shoot film because "digital will never match film" for image quality. I heard comments from my photo class members that "film is sacred," and was introduced at photography exhibits as, "This is my friend, Stephen. Shoots digital," with an all-too-obvious tone of disparagement. The tone and tenor of the discussions reminded me of the "photography is not art" debates back in the early 1900s, which led to the "pictorialist" movement to provide it with credibility.
Well, we all know how that turned out. Kodak and Hasselblad collapsed from corporate complacency and being caught out by the disruptive innovation of digital, as described in HBS professor Clayton Christensen's seminal book on innovation, The Innovator's Dilemma. The dearly departed Michael Reichmann wrote articulately about this on LuLa over a decade ago.
It all comes down to this, and I used to teach this key principle to my Design for Six Sigma students. People get hung up all the time on current functional embodiments as being the "thing" under discussion, e.g. "that the word photography described what we now clumsily know as analog or optical/chemical photography".
This is quite simply not true. Functionality is functionality, y=f(x).
Functionality is technology and embodiment independent.
There are almost always different ways to produce the desired functional response. Oftentimes, many different ways. This is why both a vacuum tube and a transistor, which are very different methodological embodiments, provide the exact same functional response: gain in an electrical circuit. Understanding this was also key to the development of Genrich Altshuller's "Teoriya Resheniya Izobretatelskikh Zadach" (Theory of Inventive Problem Solving), aka TRIZ.
You provide an input, modulated by control factors, into a "system," and you get out of that system a functional response you care about.
When you look at "photography" from a systems-engineering functional-decomposition perspective, there is no difference between a heliograph, a daguerreotype, a platinum print, a silver-halide paper print, film, a metal plate, or a CMOS sensor.
At the end of the day, these different approaches to photography are all variants of ways to deliver a transfer function producing a functional response.
In this case, capturing light on a sensing medium to render an image.
Posted by: Stephen Scharf | Wednesday, 14 November 2018 at 01:04 PM
Well photography’s truthiness has been questionable virtually since its invention. The chemical era was no more honest than the digital era, just a bit harder to manipulate convincingly.
But, of course, this New Yorker article goes far beyond photography and simple digital editing, into digitally synthesized imagery made possible by today's stunningly powerful computing and graphic-display technologies. (BTW, many, perhaps most, of the car ads you see are generated from a mix of live footage and 3-D model graphics. Car photography is nearly kaput.)
Side rant: Although I've subscribed to The New Yorker for some years, I've become increasingly irritated by their general tendency to write to write. On and on and on...often pointlessly. Anyone else feel this way?
Posted by: Kenneth Tanaka | Wednesday, 14 November 2018 at 01:26 PM
The link to https://www.newyorker.com/magazine/2018/11/12/in-the-age-of-ai-is-seeing-still-believing is broken.
I think that the 50-or-so-year period when people assumed that photographs were true, that "the camera doesn't lie," is going to be seen as a strange anomaly, born of industrial photofinishing for the consumer market, where people knew that the photographic image was immutable.
On the other hand, lest anyone think for a moment that any of this is new, this is what press photography looked like in the 40s and 50s.
http://hughcrawford.com/memory-palace/weather/
Obvious to us, but apparently not then. (Yes, I photographed a print in color that was intended to be photographed by an orthochromatic stat camera for a plate, so it is exaggerated, but that's the point.)
The question, though, is: are these photos less or more true before or after the manual editing?
Posted by: hugh crawford | Wednesday, 14 November 2018 at 01:47 PM
Dear Teach.......
Very good article, and much appreciated.
Calling interesting things to our attention is one of my favorite aspects of TOP. That includes the fact that the comments often add significantly to the post.
It is only schoolmarmish if you tell people what they ought to think.
You go out of your way not to do that.
Re Digital Photography being different from the Photography we grew up with: I remember the hubbub well.
Digital Photography is "Photography Extended," and more than that it is "Photography Continually Extended," often in ways that none of us contemplated at the beginning of its digital makeover.
I don't think you are getting a new word for it, though; there are too many issues. One is that for some people there is NOT a lot of change; they practice it in much the same way as they always did. Yet for others, you are completely right: it bears little resemblance to what we used to call Photography.
It may be that we will have to wait, and like Pluto be 'defined' differently in retrospect.
What is interesting to me is that, whatever we call it, Photography has become multi-polar, with what we would call "Traditional Photography" being only one pole. People who have grown up with Facebook et al. see it differently: as a substitute for or extension of language, with the same speed of communication.
Other forms, like stochastic photography, machine vision, and A.I., take "Photography" off in directions we never dreamed of.
It also plays out on the manufacturing side, with, for example, Apple defining Photography differently than "Traditional Camera companies" do.
As more people define photography in Apple-like terms, fewer of them are likely to buy cameras from manufacturers with narrower definitions. It's already happening: Apple sells more cameras every year, and camera companies sell fewer. Everyone keeps predicting an equilibrium, or a halt in the decline of camera sales, but it hasn't happened—even at a time of increased consumer spending.
You are probably right about needing a new word, but you can't do that without making lots of people feel like Pluto.
Posted by: Michael Perini | Wednesday, 14 November 2018 at 02:11 PM
"clumsily know as analog": You opened a can of worms, but since you included "clumsily," no harm, no foul.
For me, analog is not a term that should be applied to FILM photography. My reason is simple: both film and digital cameras have analog collectors of photons. The digital camera's electronic sensor is just a collector of photons. Only when the signal is read off the sensor through analog-to-digital conversion do we have digital information.
There are so many common parts to film and digital cameras that only with the processor in a digital camera can we differentiate the two technologies.
Posted by: John Krill | Wednesday, 14 November 2018 at 02:24 PM
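A toy sketch of John's distinction: the sensor's response stays continuous (analog) right up until an analog-to-digital converter quantizes it into discrete codes. The 4-bit resolution and the voltage values below are arbitrary, chosen only for illustration:

```python
# Hypothetical 4-bit ADC: maps a continuous 0..full_scale voltage
# onto one of 16 discrete codes. Until this step, nothing is "digital."
def adc(voltage: float, full_scale: float = 1.0, bits: int = 4) -> int:
    """Quantize an analog voltage to an integer code, clamped to range."""
    levels = 2 ** bits
    code = int(voltage / full_scale * levels)
    return min(max(code, 0), levels - 1)

# Two slightly different analog readings can collapse to the same code:
print(adc(0.50))  # 8
print(adc(0.52))  # 8
print(adc(0.60))  # 9
```

The quantization step, not the sensor itself, is where "digital" begins, which is the crux of John's terminology argument.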
Photography, for AI, is simply data for ingestion. The methodologies and heuristics that are programmed into the machines are intended to produce certain goals or outcomes, e.g., deciphering license-plate numbers. I think this has little to do with the human use of photography. Regarding authentication of a digital image: if (yes, if) digital generation is indistinguishable from capture from nature, then the problem would be the need for some kind of identification system encoded and encrypted into its digital stream. In this way, I think the distinction between D.I. and photography has some merit. But that's just a technical issue. All of this misses the forest for the trees, which is about perception and meaning.
Regarding objectivity, what exactly does the author of the New Yorker piece believe this is? Does it even exist? (See Carlo Rovelli and his ideas about perception and time—the key ingredient of a photograph ignored in the New Yorker article.) At best, objectivity is a matter of degree; selection, framing, etc., do not necessarily make a photograph more subjective than objective. To make such a claim is categorically absurd and a good example of faulty first-order thinking.
The irony is that, of course, we live in an age of image saturation and, I would wager, view images as authentic artifacts in a much more naive way than people did 50 years ago. Why? Because so much of our reality is mediated by screens - which we have to accept as authentic in order to make them do what we want. As a result we are more susceptible to view digital images uncritically. Visit any museum and observe people take cell phone shots of art instead of actually perceiving it with their eyes. The object doesn't become "real" until they've "captured" it with their phone. Black cat, meet kettle.
Photography, as practice, is a process of learning to see (which means being aware of one's blind spots so as to see more clearly). Anyone who has taken photos seriously for any length of time becomes aware of how their old images ignored areas of the frame that had a greater impact on the overall image than they'd realized. They were blinded by what they wanted to see. This is an analogue for most things in life. AI is about as useful in this endeavor as a plastic spoon is when you need to dig a deep ditch.
Posted by: David Comdico | Wednesday, 14 November 2018 at 02:55 PM
I can't get the link to work on the little iPad, but no matter: you have touched upon something very close to my own perspective about digital capture and image production.
I feel it to be so distant from analogue as to be something quite else that also just happens to result in an image.
My initial love for photography was all about the magic of the latent image, and transforming that from just a camera click into something that was solidly there in the form of the negative. (That security has been lost.) The next step was even more breathtaking as that picture slowly became visible beneath the fluid in the dish. It was something that never left me, however familiar the process inevitably became. There was always that sense of achievement, of having developed some special talent/style to and within one's work that other people couldn't ape. With digital, that's pretty much gone because today, all it takes is the patience to sit before the computer for as long as it takes to tweak a little here, tweak a little there, and if it doesn't work, just reverse your steps and have another go, costing nothing but time. If ever a monkey sitting at a typewriter could manage to produce Macbeth, that creature could learn to make digital photographs in ten percent of the time.
Basically, and this isn't the first telling, were my initial photographic experience to have been with a digital camera, I seriously doubt that I would have been inspired to continue learning. I found the adaptation from the one to the other to be rather laborious, and it took quite a lot of friendly Internet help before I picked up enough know-how to do what I want to do, which is simply what I wanted to do with analogue. I found film and wet darkrooms easy to understand, and today, without a lot of darkroom experience behind me, it would be very difficult to realise just how good a black/white photograph can look if you know what you are doing. I think this stems from the fact that with so much variation possible, with so many electronic tools to play with, the result is often confusion and the inability to know where to stop.
Less really is, often, more.
Posted by: Rob Campbell | Wednesday, 14 November 2018 at 02:57 PM
Similarly, this article popped up in the Washington Post.
"Photography has never been just about capturing reality, but the latest phones are increasingly taking photos into uncharted territory."
https://www.washingtonpost.com/technology/2018/11/14/your-smartphone-photos-are-totally-fake-you-love-it/?utm_term=.0dbcb3af7038
Posted by: Dan Arango | Wednesday, 14 November 2018 at 02:57 PM
"any comparison of film vs. digital quickly devolved into a status dispute, and people on Team Digital were immediately and automatically prickly about imagined slights to their standing. They wanted the main word applied to their chosen tech." Hah, so funny! On Dpreview, the "photographers" (if that is what you call them when you are being generous) are still battling that war. In fact, many of them treat film-based imaging as if it is not real photography.
Posted by: Kodachromeguy | Wednesday, 14 November 2018 at 03:28 PM
Thanks for the link to The New Yorker article, a most enlightening read.
Posted by: Mark Cotter | Wednesday, 14 November 2018 at 03:37 PM
The digital-is-not-photography argument raised heat because nobody likes what they do to be defined away by those who seem to not like it. That argument resembles strongly the photography-is-not-art wars of the deep and not so deep past.
As soon as the Photoshop operator starts drawing instead of projecting, then the photograph starts to become less like a photograph. But this is, of course, nothing new. What is the dominant image-making mechanism? Projecting the light from a scene onto a sensitized surface? Photography. We don't even have to argue about the meaning of words like indexical.
I'm with you on using italics for book titles. The font issue, too. But typesetting is not typesetting if hot lead isn't involved. We should call it digital lettering. Er...
Posted by: Rick Denney | Wednesday, 14 November 2018 at 04:57 PM
"At the very least, there's a short list of people I think everyone interested in our subject ought to know"
Ooo... can we see the whole list? As a post? With some short description? Pretty please?
Posted by: Yoshi Carroll | Wednesday, 14 November 2018 at 05:21 PM
I would go with digital imaging if I never printed, but I use chemicals to print now, so I'm sticking with photography. :-)
Sharon
Posted by: Sharon | Wednesday, 14 November 2018 at 05:53 PM
"I am but an egg," to quote another (fictional) Mike. Always open to learning, even from someone a generation younger 'n me. I appreciate your recommendations (not assignments).
And I'm with you in annoyance at the New Yorker's non-matching titles.
Posted by: MikeR | Wednesday, 14 November 2018 at 06:40 PM
Fascinating article for our visual age. Will be interesting to see how society evolves to cope with this as the tools for video manipulation become increasingly approachable to the masses.
Posted by: Bill La Via | Wednesday, 14 November 2018 at 08:43 PM
Full disclosure: I stated on this blog some time ago that "there is nothing new to photograph." I stand by that statement. (Not an original thought: the recently deceased Hank Wessel made that statement when I attended the San Francisco Art Institute in the mid-1970s. He was trying to get his students' minds out of their eyes, and I believed that claim then, just as I do 40 years later.)
This article in The New Yorker is clarifying. Towards the end, we read: "But, actually, from the very beginning photography was never objective," Efros continued. "... we've been fooling ourselves...it will turn out that there was this weird time when people just assumed that photography... was true."
From the Yale Alumni Magazine: Question: Do you think it is possible for the camera to lie? Answer (by Walker Evans): "It certainly is. It almost always does." (Walker Evans at Work; Harper & Row; 1982)
About a decade ago, I obtained my MA degree in Studio Art at California State University, Sacramento. One of my advisors was the photographer Roger Vail. The other professors in the graduate program were painters and sculptors, fine people. No photographers (one New Media professor, though). "The first thing you will have to do," said Vail, "is to 'unlearn' (the other professors) from everything they think they know about photography." Roger was oh so right.
No medium in the history of human creativity has come so preloaded with expectations and demands as the still photograph. From the outset, due to its literal description of surface, it was imbued with the faith and belief that it would somehow reveal "The Truth." What a fallacy. It does anything but, for which I am eternally grateful.
From the New Yorker article: "Before Photoshop, did everyone believe that images were real?" Zhous asked, in a wondering tone.
"Yes," Ginosar said. "That's how totalitarian regimes and propaganda worked."
I'd say, myself, that if there is in fact a paradigm shift from believing photographs show "The Truth" to one of realizing they do not, and never have, that is a good outcome. One we, as photographers, should hail and celebrate.
Posted by: Ernest Zarate | Thursday, 15 November 2018 at 02:12 AM
https://www.nytimes.com/2018/11/08/lens/charles-traub-taradiddle-photographs.html
Posted by: Ernest Zarate | Thursday, 15 November 2018 at 03:39 AM
According to the OED, the word "photography," from the Greek for "light" (photo) and "writing" or "delineation" (graphy), was coined by John Herschel in 1839 in a paper to the Royal Society. In that sense, I see no difference in applying it to images made from light passing through a lens onto a digital sensor; that is, my linguistic OCD antennae do not tingle. But I can see why applying the word to digital images might be annoying to others (looking at you, Mike), because the OED definition is "The process or art of producing pictures by means of the chemical action on a sensitive film on a basis of paper, glass, metal, etc.; the business of producing and printing such pictures."
Posted by: Bear. | Thursday, 15 November 2018 at 04:15 AM
The different titles are there to cater to search-engine optimization. A short print title won't work for Google, so you need a long, descriptive title. A real pity, because great story titles are a joy to behold.
Posted by: John | Thursday, 15 November 2018 at 07:08 AM
Aside from the naming semantics:
Sez you: "D.I. is closer to figurative painting than it is to photography."
No. The everyday amateur digital Canikon gets closer to photography than the everyday film Canikon did.
Yes, the digital result can more easily be manipulated into something further from photography, but the average film result from 30 years ago was closer to figurative painting.
(A carefully crafted 8x10" contact print might be a counter-example.)
Posted by: Luke | Thursday, 15 November 2018 at 09:22 AM
Re back-formations:
We now have "outdoor cycling", as opposed to the more common pedaling on a stationary machine with a screen in your face to simulate cycling.
And I'm sure you've heard "reverse selfie", which involves aiming a camera AWAY from your face.
Posted by: Luke | Thursday, 15 November 2018 at 09:28 AM
What Ken Tanaka said. On both points.
And regarding his side note on The New Yorker: I could not agree with you more, Ken.
The New Yorker, much like Apple, seems to have become incredibly self-absorbed and entirely inward-looking, focusing only on how "cool" it is.
Having been interviewed by a number of journalists about my contributions to the invention of PCR (which won the person who conceived it, my colleague Kary Mullis, the Nobel Prize in Chemistry in 1993), I learned very quickly that, rather than being strictly objective gatherers and reporters of facts, many journalists start out with an agenda before they have talked to anyone, and then do only the interviews and research that back up the agenda they decided on at the very beginning.
Posted by: Stephen Scharf | Thursday, 15 November 2018 at 12:43 PM
"Really, D.I. is closer to figurative painting than it is to photography. Sez me."
SERIOUSLY!? They are not anywhere near the same thing. The amount of manual dexterity required alone puts them in a totally different universe. Sez me.
Also, "digital imaging" is only slightly better than "digital capture". Both phrases are cold and heartless. "Photography" is timeless.
Have a great Thanksgiving.
Posted by: Dennis | Thursday, 15 November 2018 at 05:28 PM