On December 18, 2017, Google announced NIMA, a project the internet giant has been working on for quite some time. NIMA stands for Neural Image Assessment. So what is NIMA? There is a lot of buzz about it, so let me try to explain to the best of my ability.
NIMA, in a nutshell, is an artificial intelligence that purports to look at images and judge the good from the bad. The technology uses convolutional neural networks, which are trained on data that has been processed and labeled by human hands, essentially learning the preferences of the average person. In the case of NIMA, the software has been trained to recognize objects and to look at images, judging the good ones from the bad.
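The training idea described above can be illustrated in miniature. The sketch below is not Google's code: it stands in a toy logistic-regression "judge" for the convolutional network, and the features ("sharpness" and "noise") and labels are invented. The principle, though, is the same one NIMA relies on: the model nudges its weights until its predictions match human-provided labels.

```python
import math

def train(examples, labels, lr=0.5, epochs=500):
    """Fit a tiny logistic-regression 'judge' to human good/bad labels."""
    w = [0.0] * len(examples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            # Predicted probability that this image is "good".
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            # Nudge weights toward the human label.
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(model, x):
    w, b = model
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Toy "features": (sharpness, noise) -- hypothetical stand-ins for what a
# CNN would extract from raw pixels. Labels are human judgments: 1 good, 0 bad.
photos = [(0.9, 0.1), (0.8, 0.2), (0.2, 0.9), (0.3, 0.8)]
judgments = [1, 1, 0, 0]
model = train(photos, judgments)
print(predict(model, (0.85, 0.15)) > 0.5)  # the learned "judge" approves
```

A real CNN works on millions of pixel-level parameters rather than two hand-picked features, but the supervised-learning loop is conceptually identical.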
According to Google, there will eventually be many ways to put NIMA to work. One proposed use is to optimize photo editing — intelligent photo editing, as it is called — wherein the user inputs a photo and NIMA decides what edits should be made. Other proposed optimization tools would reduce or eliminate what NIMA perceives to be errors in images, or upgrade an image’s quality to increase the engagement of the viewers looking at it.
But the main thrust of NIMA seems to be judging images: sorting through and filtering large quantities of them, essentially burying those it deems poor quality while placing those it perceives as top quality near the top of the search results. Some of this is based on technical image quality — things like focus, composition, color, exposure and other important metrics. However, the system even goes so far as to “measure beauty and emotion” within images. Each image is judged on a scale of one to 10, with the top-scoring images being the ones humans are expected to find most beautiful.
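The published NIMA research reportedly has the network predict a probability distribution over the ten score buckets rather than a single number, with the final quality score taken as that distribution's mean. A minimal sketch of that scoring step follows; the two example distributions are invented for illustration.

```python
def mean_score(distribution):
    """Collapse a 10-bucket score distribution into one 1-10 quality score."""
    if len(distribution) != 10:
        raise ValueError("expected probabilities for scores 1 through 10")
    total = sum(distribution)
    # Normalize in case the model's output is not a perfect probability vector.
    probs = [p / total for p in distribution]
    # Expected value: each bucket's score weighted by its probability.
    return sum(score * p for score, p in zip(range(1, 11), probs))

# A distribution weighted toward the high buckets yields a high score.
sharp = [0.0, 0.0, 0.0, 0.0, 0.05, 0.05, 0.1, 0.2, 0.3, 0.3]
blurry = [0.3, 0.3, 0.2, 0.1, 0.05, 0.05, 0.0, 0.0, 0.0, 0.0]
print(round(mean_score(sharp), 2))   # 8.55
print(round(mean_score(blurry), 2))  # 2.45
```

Averaging over a distribution, rather than picking one bucket, is what lets a system like this rank two "good" images against each other rather than merely sorting them into bins.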
Now, it is important to note that NIMA is not yet live. In other words, it isn’t currently used on any devices or on Google’s search engine. However, researchers are hopeful that someday soon, post-processing software, cameras, and even browsers will all have NIMA’s capability to sift through photos and essentially tell the user which images are good and which images are bad.
The Problem with NIMA
Doesn’t that sound handy? A tool, right on your device, that goes through the hundreds of shots you took on a given day and picks out the ones worth another look. Or the ability to do a Google image search for “beautiful landscapes” and have NIMA show you only the most artistic results. On the surface, that does sound nice — a time-saving tool that cuts down on some of the less rewarding tasks involved in creating and viewing art.
Such a thing has the potential to change art as we know it. And not for the better.
Why? Think about this: As artists, we have it drilled into our minds, with good reason, that there is no such thing as “good” art and “bad” art. Art is art, and if you don’t like a particular piece, that only means the piece wasn’t to your taste. Someone else may love it. Someone else may dislike the things you like.
I’ll be quite frank: there are many reasons why this is bad news for the art world. Think about the way artists sometimes use technical flaws for artistic purposes. Perhaps you overexposed to give an image an ethereal quality, or deliberately underexposed to add a sense of mystery or dread. Maybe you added noise to an image for a more old-fashioned look, or purposefully gave it a hazy, soft-focus quality that NIMA would consider blurry. All of these images could be filtered out based on flaws that were never flaws to begin with, because they were part of the artist’s creative vision. Such a system, in essence, could silence artists who use these tactics — artists who don’t always produce the most technically correct image, favoring instead something meaningful, something in line with the vision they had when they set out to create the photo.
And then there is this business of measuring beauty and emotion. I’d like to cite Robert Frost’s poem “The Road Not Taken” as an example of why beauty and emotion cannot be measured. Ask three different people what this poem means and you will receive three different answers. There are interpretations deemed correct, and then there is the most famous interpretation — that it is a poem about individualism — which is often considered wrong. Whether any of these interpretations is correct is beside the point. The point is that people look at this poem, they read it, and each takes something a little different from it.
So how is it possible to look at an image and judge, definitively, whether that image is beautiful and emotional? It isn’t possible. Everyone is going to take something different away from the images that they view. Your standard of beauty and emotion will not be the same as mine.
Google does admit that NIMA’s parameters are based on what a majority of people would enjoy. Doesn’t that sound rather dictatorial? To me, it sounds as though we are forgetting the whole concept of “beauty is in the eye of the beholder” — and that concept is the real essence of art. Art is democratic: each individual gets to decide what he or she likes.
And that brings me to my final point: What happens to the viewers themselves? NIMA, or software like it, takes away the viewer’s ability to pick and choose what he or she likes. NIMA stands to simply tell you what is good and what isn’t. Essentially, it sets forth a definition of art. It tells the viewer what is art and what isn’t — this picture is good and that picture is bad, all because an algorithm determined it to be so. Here we stand, as photographers and artists, having been told our entire lives that art cannot be defined. With the advent of software like NIMA, our work will potentially be shown to an audience that has been handed a definition of what makes art good or bad — an audience possibly led to believe, by such tools, that there is such a thing as good and bad art.
That, to me, is the real danger. With artificial intelligence telling us which images are worthy and which are not, photographers could be pushed into producing cookie-cutter images simply to satisfy search engine algorithms, while audiences are forced to settle for those cookie-cutter images because they are the only ones the algorithms show them. NIMA and tools like it stand to chip away not only at the artist’s creativity but at the viewer’s free thought.