Before I begin, I would like to ask everyone a vaguely related question: How much post-processing is TOO much post-processing? To even attempt an answer for that, let’s look at what may be considered one of the most virginal forms of photography: Film.
While purists and transitionaries will forever clutter internet forums with heated debates over whether photographic film is truly an analog technique, one thing remains certain: the changes recorded by silver halide crystals suspended in photographic emulsion during a split-second exposure create the most tangible rendition of light possible at that moment.
The photographer (in the least involved part of the photographic process) simply holds the camera still while the sun, brandishing a metaphorical paintbrush, makes an imprint of the scene on the film, much as a cat leaves paw prints in wet cement. Far more than a digital rendering of 0’s and 1’s on an LCD screen, the strip of plastic you hold in your hand contains a ‘mold’ of visible reality.
But Does Film Truly Represent Reality?
As you can see from this developed 35mm Fujifilm Superia negative, it doesn’t. The sky isn’t red, the ocean isn’t purple and clouds aren’t black. Before we can ‘cast’ this slice of time into a glossy 6×4 or an 18 megapixel Facebook profile picture, a significant amount of post-processing must first be performed to make the photo resemble what we see with our eyes.
In the case of 35mm negative film, this entails vats of chemical cocktails (such as the C-41 process) and a complete color/tone inversion. For digital photography, it’s a series of electronic demosaicing algorithms, white balance adjustments and brightness/contrast/noise filters.
Luckily for all the aspiring photographers who don’t know what their Bayer array does, this complex process has been streamlined into the very structure of programs like Adobe Photoshop & Lightroom. Automated digital image processing is so fundamental to DSLR operating systems circa ‘14 that most people are not even aware that it happens.
As I stated in my previous article, A Quintessential Guide To The World In Infrared: Color Processing, *technically* there is no such thing as ‘color’ in infrared – only light being reflected and absorbed. ‘Color’ is a perception that results from the way our brain processes wavelengths of visible light. As a result, true infrared can only be measured in terms of tonality (darkness/brightness), which is essentially, but not exactly, black and white.
Why is it then (you may ponder) that my IR shots (as well as so many others) are rendered with such vibrant (albeit slightly abnormal) color?
The Answer Lies In The Near-Infrared Spectrum…
I use a 720nm IR-filter, and since the visible spectrum extends to approximately 700nm (and no filter cutoff is perfectly sharp), a small amount of visible light will inherently ‘leak’ through the filter and into my final images. Because the closest humanly perceptible color to this particular IR-filter is red (≈ 620 – 740nm), the entire photo takes on a deep tint of reddish-burgundy. Properly adjusting the white balance to something ‘neutral’ shifts the red tint into more distinct shades of blue and yellow (this tends to be the default color scheme of digital near-infrared photography). In this sense, as with any color-infrared photography, the resultant colors are false and do not represent anything more than the reflection and absorption of IR light.
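For readers who like to see the arithmetic, that neutralizing step amounts to scaling each channel so that a chosen reference region averages out to gray. A minimal sketch in Python/NumPy, assuming a floating-point RGB array and a hand-picked reference patch (the function name and patch convention are mine, not Photoshop’s, and this is a gray-world style correction rather than Photoshop’s exact algorithm):

```python
import numpy as np

def neutralize_white_balance(img, patch):
    # img: float RGB array with values in [0, 1].
    # patch: (y0, y1, x0, x1) bounds of a region that should render
    # as neutral gray (in IR work, often a patch of foliage).
    y0, y1, x0, x1 = patch
    means = img[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means  # per-channel gains that pull the patch toward gray
    return np.clip(img * gains, 0.0, 1.0)
```

After this correction, the reference patch has equal red, green and blue averages, which is exactly what drags the overall red cast toward neutral.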
FALSE COLOR?! Before anyone accuses me of being a manipulative Photoshop artist rather than photographer of the natural world, let me enlighten you to a few false color applications you’re probably already familiar with…
Thermal imaging uses a psychedelic gamut of false rainbow colors to represent minuscule changes in temperature that would otherwise be seen as near-indistinguishable gray-scale values:
Aerial surveillance film (such as Kodak Aerochrome) and satellite (LANDSAT) imagery employ vibrant but false color schemes to distinguish geological features and assess aspects such as vegetative health:
The vast majority of the universe we know through astrophotography is usually rendered in false color because the vast majority of the universe is optically invisible to us:
Compared to the same nebula as seen in visible light…
False color is simply a means to bring tone, color and subsequently, a discernible form to what we cannot perceive… and digital infrared photography is no exception.
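All of the examples above reduce to the same operation: a single-channel tonal measurement is pushed through a color lookup palette. A hedged sketch, assuming a 2-D array of normalized values and a toy palette of my own invention (real thermal and astronomical pipelines use far more elaborate colormaps):

```python
import numpy as np

def apply_false_color(gray, palette):
    """Map normalized gray values (0-1) onto an RGB palette by index."""
    idx = np.clip((gray * (len(palette) - 1)).round().astype(int),
                  0, len(palette) - 1)
    return palette[idx]

# A toy 'thermal' palette: cold = blue, warm = yellow, hot = red.
palette = np.array([[0, 0, 255], [255, 255, 0], [255, 0, 0]], dtype=np.uint8)
```

Every gray value simply picks a row of the palette, which is why two instruments can render the same invisible data in wildly different colors.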
Looking at this white-balanced 720nm infrared photo I took of a busker with his bass guitar at the Old Fourth Ward Arts Festival in Atlanta, the false color scheme is obvious: the sky (which sends very little IR light toward the camera) is yellow/orange, while vegetation and clothing (known reflectors of IR light) appear blue.
But what if I want my sky to be blue? What if I want to use colors to articulate an atmosphere or idea? What if I want complete control over every color output in the photo?
It brings me great pleasure to divulge some info, currently in circulation on the streets, about a solution to this problem nearly as old as photography itself: Channel Swapping – i.e. mapping what would normally be perceived as red to, say, blue. All false color rendering is built upon the premise of channel swapping.
One of the most basic color remapping maneuvers is loved around the world for its ability to bring a sense of ‘normalcy’ to digital infrared photography (in part, by rendering the sky blue once again). Allow me to demonstrate:
The Red/Blue Channel Swap
Open your white-balanced infrared image in Photoshop then navigate to ‘Image’ > ‘Adjustments’ > ‘Channel Mixer’ (below):
Without making any adjustments yet, we can observe that the red output channel consists of 100% red (below):
…and the blue output channel consists of 100% blue (below):
… as they both should.
The idea is now to ‘swap’ the color source of both corresponding output channels. In the red output channel, set the red source to 0% and the blue source to 100% (below):
You should observe the colors of the image begin to change…
Likewise in the blue output channel, we must set the blue source to 0% and the red source to 100% (below):
Click ‘OK’ and the channel swap is complete.
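Outside of Photoshop, the same red/blue swap is a one-line index shuffle. A minimal sketch, assuming the image is already loaded as an RGB NumPy array (the function name is my own):

```python
import numpy as np

def red_blue_swap(img):
    # Mirrors the Channel Mixer settings above:
    # red output <- 100% blue source, blue output <- 100% red source,
    # green left untouched.
    return img[..., [2, 1, 0]]
```

The fancy index `[2, 1, 0]` returns a copy with the first and third channels exchanged; applied to a white-balanced IR frame, it turns the sepia sky blue just as the dialog does.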
What was once a sepia-toned, apocalyptic-seeming sky has been replaced by a remarkably familiar shade of deep blue… but something is not right. The darkness of the sky resembles late evening, even though this picture was taken at noon. Ah, it’s because the atmosphere scatters very little infrared light toward the camera (scattering weakens sharply at longer wavelengths), causing the sky to appear dark!
The complementary pale yellow/orange vegetation looks more like a layer of snow (thanks to reflected IR light) than the actual surface of the leaves. Either way, it firmly cements this surreal slice of time as a scene from an alien world.
Looking at things from this perspective, digital infrared photography is no less credible as an accurate depiction of reality than any authentic visible light photograph – film or digital. It’s not that an unnaturally large amount of post-processing is required to obtain a ‘normal’ looking IR image; it’s just a process that must be done manually. If anything, this leaves the photographer with virtually unlimited control over the outcome of the final image.
And as long as you keep your artistic license on you at all times, no laws are being broken.