Anyone who's done much image post-processing knows it's true: what you see on your screen doesn't always tell you much about how something's going to look in print. If you've ever fine-tuned a photo in Photoshop only to be disappointed when the printed results didn't even come close to the power and punch of the shot on your screen, the importance of having your monitor's output match your printer's is probably evident. While full-time pros recognize the need to be able to accurately control color from one end of their workflow (the camera) to the other (the print) and through all the stops in between (the computer), this idea has only recently come into its own with hobbyist photographers, fueled to some degree by the availability of lower-cost color management tools.
If you've been scared off of color management in the past by the mystique that surrounds it, fear not: with the current crop of hardware and software color management tools, calibrating and profiling your workflow is not only manageable but downright easy, even for profiling newbies. In this first installment of a two-part DCR Workshop series, we'll move through calibrating and profiling your display – the process of standardizing your monitor's output that forms the foundation of making what you see on screen what you get when you print your images: why is it important, how does it work, and what do you need to make it happen? In an overview that's part tutorial, part product review, we'll also demo the display profiling process using a particular color management tool – in this case, Datacolor's Spyder3 Elite system.
WHY CALIBRATE AND PROFILE?
Admittedly, for casual photographers carefully controlled color may not be the most photographically effective use of time and money. For basic calibration solutions, if you're working on a Macintosh try the built-in calibration utility in OS X's display controller; on a PC, Adobe Gamma performs a similar function, though you have to have Photoshop installed to use it (and don't forget to disable/remove it if you move up to a third-party calibration/profiling system). For many users, these basic correctives may well be enough to get your display close enough to what you're getting from your preferred photo output method – whether it's your home printer or a photo lab.
What happens, though, when "close enough" isn't close enough? Getting the perfect look on your screen after hours of editing only to spend countless more hours trying to get a print that aligns with what you're seeing is one of the most frustrating aspects of digital photography. Often, the differences we're talking about at this level are subtle (they certainly were in our testing), but if they aren't usually the difference between a bad print and a good one, they can easily constitute the minutiae that separate good prints from great ones. Given the wide variance we experienced in comparing color on one monitor to the next, it's probably safe to say that a fair number of DSLR users have spent more time and money on camera-side equipment upgrades that will make a less apparent impact on the final printed image than a calibrated display and profiled print workflow.
While advanced display calibration isn't a "magic bullet" for color and tonal range issues – there's output device profiling to consider, for starters, and even the best calibration systems can't fully account for hardware and image handling differences – it's an important first step in making what you're seeing when you edit photos a reproducible situation across a range of devices. With display calibration and profiling, what you're ultimately doing is bringing what you're seeing on a specific monitor – in terms of color, brightness, and contrast – into line with a widely accepted standard for image reproduction, and in this sense, calibration is an investment in peace of mind. With a calibrated and profiled display, you have some assurance that what you see on-screen is, up to a pretty tightly defined standard, what you should expect to get in print, taking a lot of the guesswork about how something will print up out of the image workflow.
I'll repeat that for novice shooters looking to get decent, consistent snapshot prints, a color management system like the one described in the sections that follow may well be overkill. But if you're running up against colors that just won't cooperate from one side of your workflow to the other, keep reading...
CALIBRATION AND PROFILING: NUTS AND BOLTS
Once you're sold on the idea of display calibration and profiling, it's time to get down to the actual process of calibrating a display. Anyone who has attempted to understand calibration and profiling before is probably aware that this is where things get technical (and, in many cases, more than a little obtuse), and those of us without advanced degrees in physics start getting scared. In calibrating and profiling a display, however, there are only a few basic concepts at work, and before we jump into the calibration process itself, a little demystification is in order.
Calibration and Profiling: What's the Difference?
First, a quick note on the fundamental terms themselves, as the words "calibration" and "profiling" are not interchangeable. At its most basic, "calibration" has to do with setting a monitor's brightness and contrast to accepted standards. A "profile" stores the information needed to bring the display into calibration, along with information used to correct how the monitor displays color. To use the technical terms, calibration (the first step) focuses primarily on adjusting gamma, white point, and luminance, while profiling (which happens after the display is calibrated) stores this calibration information and also adjusts the monitor's gamut. If these terms mean nothing to you, don't worry: we'll move through them step by step.
Correcting a display's "gamma" is probably the most talked about, and least understood, part of the calibration process. In order to ensure that what you see is what you get, it's also potentially the most important. In essence, "gamma" refers to the contrast curve of your monitor; it doesn't affect the maximum values (pure white and pure black), but rather changes the way the tones between these two extremes relate to each other. If you've ever played with the contrast of an image, you can probably visualize this basic concept:
Normal Contrast
Increased Contrast
As I increase the contrast on this image (I've used a black and white to make the tonal changes easier to see), the very lightest and very darkest areas of the image are still just as they were before. What has changed, however, are the gray values in the middle of the range, with the darker grays "moving" closer to pure black, and the lighter grays "moving" closer to pure white. To oversimplify somewhat, your display's gamma refers to how much of this kind of mid-tone shifting contrast it applies to what's being displayed.
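The midtone shift described above is easy to verify numerically. Here's a minimal Python sketch – an illustration with made-up values, not part of any calibration tool – that boosts contrast around middle gray, the same move as the Photoshop example:

```python
# Boost contrast around middle gray (0.5): darker grays move toward
# black, lighter grays move toward white, and the extremes stay put.

def add_contrast(value, amount):
    """Push a normalized gray value (0.0-1.0) away from middle gray."""
    stretched = 0.5 + (value - 0.5) * amount
    return min(1.0, max(0.0, stretched))   # clamp to the displayable range

for gray in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"{gray:.2f} -> {add_contrast(gray, 1.5):.3f}")
# 0.00 -> 0.000, 0.25 -> 0.125, 0.50 -> 0.500, 0.75 -> 0.875, 1.00 -> 1.000
```

Note that black (0.0), white (1.0), and middle gray (0.5) come out unchanged; only the in-between tones shift.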
LUTs and Profiles
The final displayed gamma is the result of both the monitor's contrast tendencies and information supplied to the display by the graphics card through a system known as the look-up table, or LUT. Now that you know this acronym, you can probably let it pass immediately from your mind – unless you have an older graphics card or you get into much more advanced profiling, the user and the LUT have no direct interaction, and each need only trust that the other exists and is doing its job and leave it at that. In the interest of full disclosure, though, the final word on this topic is that in calibrating a monitor, you are essentially supplying data through the means of a utility or piece of software to the LUT, which compensates for the display's contrast tendencies to arrive at the chosen contrast level – the chosen gamma.
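The LUT's job can be sketched in a few lines of Python. This is a simplified illustration with assumed gamma values – not the Spyder's actual math: if the display's native response is roughly `input ** native_gamma`, loading the LUT with `input ** (target / native)` makes the combined response come out to the chosen target gamma.

```python
NATIVE_GAMMA = 2.5   # assumed native response of a hypothetical display
TARGET_GAMMA = 2.2   # the calibration standard discussed in this article

def build_lut(native=NATIVE_GAMMA, target=TARGET_GAMMA, size=256):
    """Fill a 256-entry look-up table that compensates the display's
    native contrast so the final displayed gamma equals the target."""
    lut = []
    for i in range(size):
        x = i / (size - 1)                            # normalized input, 0.0-1.0
        lut.append(round(x ** (target / native) * (size - 1)))
    return lut

lut = build_lut()
# Combined effect: (lut[i] / 255) ** 2.5 is approximately (i / 255) ** 2.2
```

In practice the calibration software measures the display's actual response with the colorimeter rather than assuming a tidy power curve, but the compensate-then-display principle is the same.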
Along with color rendering information, this calibration data forms the "profile" that the computer uses to bring the display into conformity with the standards you select. More on this a little later on.
Gamma and White Point, Mathematically
Where an actual display is concerned, gamma isn't just a general concept but a specific, expressed number value – normally between 1.0 and about 2.5 (though it can theoretically be higher). These numbers come from the mathematical representation of a particular contrast curve – if you've ever used the "Curves" function in Photoshop or similar programs to manipulate image contrast, this idea will be familiar. (For tech heads who insist on knowing more, gamma, or γ, is the exponent in an equation f(x) = x^γ relating display intensity, x, and luminance, f(x). In terms of actually calibrating a monitor, this specific piece of knowledge is completely superfluous as well.)
Obviously, then, a purely linear curve (one that applies, for purposes of the contrast analogy, no contrast adjustment to the display) has a value of 1.0, with values greater than 1.0 increasing the amount of contrast, in effect. With many calibration utilities, you can calibrate your gamma to whatever value you'd like, but these days a gamma of 2.2 is the accepted standard for most uses. Thus, if you're using a color calibration utility that allows you to set gamma – whether a third-party system, an Adobe or Apple utility, or your graphics card's control panel – use 2.2 as a starting point.
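Plugging numbers into that power function makes the effect of the gamma value concrete. A quick Python check of a 50% input at a few common gamma settings:

```python
def luminance(x, gamma):
    """f(x) = x ** gamma: relative light output for normalized input x."""
    return x ** gamma

for g in (1.0, 1.8, 2.2):
    print(f"gamma {g}: f(0.5) = {luminance(0.5, g):.3f}")
# gamma 1.0: f(0.5) = 0.500
# gamma 1.8: f(0.5) = 0.287
# gamma 2.2: f(0.5) = 0.218
```

At gamma 1.0 a 50% input produces 50% luminance; at the standard 2.2, that same midtone lands at roughly 22% – which is why a display calibrated to the wrong gamma makes midtones look washed out or murky.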
Similarly, monitor calibration must also take the device's displayed "white point" into account. Anyone who's familiar with digicam white balance can probably foresee where this related concept is heading: white point is, for all intents and purposes, the color temperature of the image. While the choice of white point (which is, like white balance, defined as a temperature value in degrees Kelvin) for your display is somewhat arbitrary, the accepted standard is 6500K – essentially, the color temperature of mid-day daylight.
Why Does it Matter?
So why these standards for gamma and white point, and why do they matter? Basically, the human eye doesn't see the world in a linear way, and a contrast curve with a gamma of 2.2 provides a nice match for the way our eyes see: calibrating a monitor to a gamma of 2.2 helps make the display's color space better align with the way we see the world. It's all more complicated than that in truth, and there are other accepted gamma standards for different uses, but if you're a Windows user, especially, with a typical monitor working in the sRGB color space – the default setting for most cameras, if they have a setting at all – 2.2 is the most commonly accepted choice.
Likewise, calibrating the white point on your monitor to 6500K gives you, in most basic terms, a picture of what your printed images should look like under daylight-colored lighting (though as we'll see, calibration utilities sometimes recommend a different white point to account for ambient light conditions in your work environment).
Ultimately, a corrected gamma and white point, along with appropriately set brightness (which establishes the pure white and pure black points in which the contrast curve operates), form the basis of monitor calibration, allowing you to distinguish subtle changes in shading.
With a calibrated monitor, you should be able to view a series of stepped "gray percentage" blocks like the ones above and make out the distinctions between each segment. If you can't, your display isn't showing you everything that's potentially going on within your images.
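If you want to generate your own step wedge to eyeball, the gray values are trivial to compute. A short Python sketch (a hypothetical helper, not part of the Spyder software) producing the 8-bit levels for an evenly spaced 11-step 0–100% ramp:

```python
def gray_steps(steps=11):
    """Return 8-bit gray values for an evenly spaced 0%-100% ramp."""
    return [int(i * 255 / (steps - 1) + 0.5) for i in range(steps)]

print(gray_steps())
# [0, 26, 51, 77, 102, 128, 153, 179, 204, 230, 255]
```

Fill a row of blocks with those values in any image editor and you have a quick sanity check: on a well-calibrated display, every adjacent pair should be visibly distinct.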
Gamut and Profiling
As noted, the calibration data used to set your monitor's displayed gamma to 2.2 and white point to 6500K is a large part of the profile that your chosen calibration software or utility adds to your machine. The profile (technically known as an "ICC profile") is nothing more than a file that stores data used by your graphics card to adjust your display output to conform to the standards used in the calibration process. In short, the profile is used by your graphics card to tell the display what to do to achieve correct gamma and white point. It's also crucial to the other piece of the profiling equation: gamut.
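Since an ICC profile is just a file with a standardized layout, you can peek at one yourself. Here's a hedged Python sketch that reads a few well-known fields from the 128-byte ICC header (per the ICC specification, the 'acsp' signature sits at byte 36 and the four-character device class – 'mntr' for monitor profiles – at byte 12):

```python
import struct

def read_icc_header(data: bytes):
    """Parse a few fields from a 128-byte ICC profile header."""
    if len(data) < 128 or data[36:40] != b"acsp":
        raise ValueError("not an ICC profile")
    size, = struct.unpack(">I", data[0:4])             # big-endian uint32
    return {
        "size": size,                                  # total profile size in bytes
        "device_class": data[12:16].decode("ascii"),   # 'mntr' for display profiles
        "color_space": data[16:20].decode("ascii"),    # e.g. 'RGB '
    }
```

On most systems the installed display profiles live in a standard location (for example, C:\Windows\System32\spool\drivers\color on Windows, or /Library/ColorSync/Profiles on a Mac), so it's easy to point a script like this at the file your calibration software created.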
Not to be confused with gamma, "gamut" refers to the range of colors your monitor is capable of displaying. In addition to correcting gamma and white point to specified standards, a color management system also analyzes the monitor's output gamut and compares the colors it displays with known values. If your monitor oversaturates certain greens, for instance, a profile can zero in on these specific color values and apply a corrective via your graphics card.
Building a color profile is where visual calibration becomes too subjective to be of much use: while you can usually get gamma calibrated to an acceptable standard by eye, correcting your display's color reproduction issues really requires a third-party system with a colorimeter.
CHOOSING AND USING COLOR MANAGEMENT TOOLS
Thankfully, understanding the intricacies of what color management tools are doing is much more difficult than actually using one. When it comes to consumer-level hardware and software packages for calibrating and profiling your display, it's really a two-horse race. For this tutorial and test, we went with Datacolor's Spyder3 Elite system, though the somewhat more dominant X-Rite (to use another analogy, the Canon to Datacolor's Nikon) has several interesting tools out there with similar functionality.
For casual users, the Spyder3 Elite system (at around $270) comes off as a bit pricey, though advanced photographers will appreciate the more open feature set it provides over previous and current step-down Spyder models. Given the level of customizability and the decent amount of power it offers, it's a relative bargain compared to a commercial spectro system. Still, for users looking to wade into the monitor calibration pool rather than dive in head-first, both X-Rite and Datacolor offer more basic monitor calibration and profiling systems in the $150 range.
The walk-through that follows, then, deals specifically with the Spyder3 Elite system, though the process is much the same whatever hardware and software package you choose.
The Tools for the Job
Whatever company you go with and whichever system you choose, if you're shopping in the consumer and entry-pro level markets that the devices mentioned previously cover, what you're buying is essentially a two-part apparatus. The primary (really the only) piece of hardware needed is a colorimeter – a small USB device used to measure and evaluate the output of your display and communicate this information back to the computer.
The latest, greatest technology for consumer colorimeters is a built-in ambient light measurement system (that little "eye" on the front of the device), though the business end of the colorimeter is the sensor found on the flipside.
In Datacolor's lineup, at least, it's the software package – the second part of our two-part system – that defines the setup (all of the consumer-level Spyder systems use the same colorimeter). As noted, we gave the Spyder3 Elite system a run, though Datacolor's Spyder3Pro software does much the same thing without a few of the custom functions and bells and whistles.
Installing the Software
The entire process of installing Datacolor's software, connecting the Spyder, answering some questions, and letting the device build a profile took less than ten minutes, start to finish. If you're feeling impatient, a quick video walk-through shows how it's done.
To get things rolling, we dropped the Spyder3 Elite CD into the drive (in the case of the photos and video, on my workhorse Dell business notebook) and clicked "Install" from the pop-up menu.
On a PC, the Spyder3 utility loads a standard installer screen and goes to work. Everything was finished in less than three minutes, and with no restart required I plugged my colorimeter in via its USB connection and was ready to calibrate. (Just for the sake of comparison, we also tried out the installation on four other machines – desktops and notebooks – including an iBook G4, with similar results and no installation glitches all around; if anything, the interface is a little cleaner, and the software runs a little smoother in a Macintosh environment.)
Launching the Spyder3 Elite utility for the first time, the system immediately detects an unprofiled display and automatically loads the "New Monitor" window.
Select the checkbox for "Calibrate this display," click "Next," and you're off and running.
Screen Settings and Options
At this point, the user is called on to provide some information about the display being calibrated/profiled, and to make some baseline hardware adjustments to get the monitor ready for the software's process. The Spyder's first question should be obvious: what kind of display are we dealing with?
In reading down the list of options, you'll note that the Spyder3 Elite system even allows calibration and profiling of projectors, ensuring color control for slideshows and presentations. While most casual users will likely be indifferent to this feature, for pros and other serious shooters looking to show off their work in a larger setting, it's a neat addition to the package. As interested as I was to see how the process for a projector might actually work – this is where the tripod socket on the bottom of the Spyder's stand apparently comes in handy – I selected "Laptop" as my display type (note that it's separate from "LCD," for reasons that will become obvious momentarily).
The next order of business involves specifying the level of hardware-side control your particular display provides.
Depending on the display type selected, the list of options is tailored to cover the common range of adjustments (a second screen covers the RGB and Kelvin sliders found on many newer LCD displays, for instance). In this case, my tester laptop has only backlight adjustment.
Once you've selected the hardware-side adjustments that can be made, the software walks you through what to do with them. In my case, the backlight control was used to visually set the white luminance, based on a quick visual analysis of a series of stepped gray blocks like the ones mentioned earlier.
For displays with more controls, Spyder3 Elite may prompt you to restore some to their default settings, or to use others to optimize your display output. Whatever the specifics, however, the software's instructions are, in most cases, reasonably clear and easy to follow, even for users unfamiliar with the technical aspects of calibration.
At this point, Spyder3 Elite goes ahead and fills in the rest of the details based on its baseline configuration.
Notice that, as expected, the Spyder3 Elite calibrates to a white point of 6500K and a gamma of 2.2 by default. For more advanced users needing to customize their calibration, however, adjustments to all of these values are just a click away under the "Expert Console."
As we'll see shortly, the Spyder3 system also sometimes suggests modifications to these default settings based on hardware limitations or ambient light conditions as measured during its analysis.
Ambient Light Analysis
With most user variables supplied, the process is now largely back in the Spyder3 Elite's automated hands. Though not all calibration tools are so equipped, our test setup is able to make adjustments based on ambient light measurements, and thus before analyzing the display itself, the Spyder prompts the user to set up the colorimeter to take an ambient light measurement.
As before, with a visual guide on the screen showing you exactly what to do, taking the measurement isn't difficult at all: set the colorimeter in its stand near the computer, click "Next," and the device handles the rest.
In my specific case, the ambient light levels during testing were "Very Low" according to the Spyder (an ideal situation, given that too much ambient light can alter how a screen appears to the eye). For this kind of lighting, the Spyder software suggests that a 5000K white point and a lower white luminance level than the default setting might be appropriate.
Because I regularly use my notebook under extremely high ambient light, I'll opt to keep my settings (which are optimized for bright ambient conditions), but in general, if your software and hardware support ambient light analysis and you tend to use your computer under consistent lighting conditions, it's best to accept the suggested settings. Also, whether or not your calibration tools measure ambient light, it's recommended that you do what you can to reduce overly bright external light sources for the reasons stated above. In the case of the Spyder3 Elite, the system will warn you repeatedly if your ambient light levels are too high.
Building a Profile
For the final phase of the calibration and profiling process, the colorimeter must actually come into contact with your display's surface in order to measure the monitor's output and make appropriate adjustments. In our case, the Spyder3 software shows exactly where to position the device.
Attaching the colorimeter to the screen is usually done either via a suction cup (better on CRTs than LCDs) or by hanging the colorimeter with its cable draped over the back of the display (many colorimeters have a sliding counterweight on the cable lead to facilitate this).
Theoretically, you could just as easily hold the device in place, though the process does take several minutes and the colorimeter needs to stay reasonably still during this time.
With the colorimeter in place, click "Continue" and the software begins, placing a series of solid color fields in front of the colorimeter for the device to "read" and evaluate. Roughly four minutes later, our machine wrapped up and prompted me to give the new profile a name.
With a profile created, Spyder3 Elite shows users the before-and-after results of calibration and profiling right in the utility interface. Several test images of different types (saturated, black and white, low-key, high-key, etc.) can be called up on the screen; click "Switch" and the profile is alternately turned on and off, allowing users to see before-and-after images in real time and evaluate the difference for themselves.
A display profile has now been built, and the utility can be closed out. That's really all there is to it.
EVALUATION AND RESULTS
With a display profile built, you're ready to get to work, and in most cases your involvement with the calibration tool will be limited until you need to calibrate again (Datacolor allows you to set a pop-up recalibration reminder and recommends that you recalibrate your display once a month or thereabouts, as display output tends to shift over time). The software automatically stores the profile such that the computer loads it on startup without being prompted.
Glitches and Bugs
At least that's the ideal setup, and in most cases it seemed to work all right for us. This is where making sure that all other calibration/profiling utilities are removed or disabled becomes critical, though: we had one test machine in our group of five (a PC laptop with multiple displays connected) that didn't want to automatically load its profiles on startup, though I'm not entirely convinced that this issue didn't relate to a phantom version of Adobe Gamma that we kept uninstalling but that never actually seemed to go away. Chalk it up to a unique system problem, since it was isolated to one machine of five.
It should also be noted that the software was a little screwy on PCs with multiple displays in particular, only detecting the display the software window was launched onto on the first load. To find the second display, I had to calibrate the first monitor, close the software, and relaunch it on the other display. Strange, but apparently true, as the process proved to be the same on two different machines with this setup. In a similar vein, be aware that a notebook that uses multiple displays only part of the time (as in a docking arrangement, where you use the notebook plus an external display at your desk, but also use the notebook stand-alone at home or on the road) can occasionally give Spyder3 Elite fits, causing it to clear the profile for the notebook display.
The Visual Results
In testing the system on several machines, the results, while not always dramatic, were often revealing. For instance, as I had long suspected based on comparisons with my other machines, the laptop shown in the walk-through tends to push a broad swath of the highlight range, making areas of an image look blown out that, in truth, contain plenty of tonal information. Similarly, the laptop's default profile tends a bit cool, though I honestly hadn't even noticed until I side-by-sided the new version with its original profile.
If you work under bright ambient light (most overhead fluorescents in office environments fall into this category), you may find that calibrated color on your screen looks a bit dull at first, with whites that look closer to gray to unaccustomed eyes. The Spyder3's ambient light measurement function strongly disliked the light in our office, and wasn't able to sufficiently adjust the white luminance to compensate. Similarly, it's widely known that the apparent contrast and brightness of LCDs, and especially laptop displays, shifts dramatically based on viewing angle. Again, ambient light and working conditions can be as important to accurate color reproduction as the display setup itself, and doing what you can to create a consistent work environment for photo editing will make the output results more consistent.
The Spyder's Studio Match function does a decent job of bringing multiple displays to parity, though differences in hardware often make exact matches difficult to achieve – the more different the hardware (trying to match a new LCD and an aging CRT, for instance), the less you should expect the displays to match exactly. In some extreme cases, you'll have to decide what match points are most important to you and calibrate with this in mind; thankfully, the Spyder's Expert Console is generally up to the task of making these kinds of one-off adjustments, and all but the most exacting users will do just fine in trusting the system to make the match as it thinks best. Even without running Studio Match, we found the Spyder's profiles to yield extremely similar results across a range of hardware – producing a slightly warm profile compared to other systems I've used, but showing few variations in color/hue and no apparent discrepancies in tonal range across systems.
Long story short, the calibration and profile tool does exactly what it claims to do, providing consistent color, contrast, and brightness (and, in turn, consistent image appearance) from one computer to another.
Even in limiting the discussion (for now) to display calibration alone, this overview is really just the tip of the iceberg. There are simply too many hardware- and software-specific differences in implementing consistent color management to even begin to touch on in a single review and tutorial. We'll talk a bit more about device-specific ICC profiling in Part II of this series, but if there are specific questions about calibration and profiling as it relates to a certain setup or piece of software, or about controlling workspace in programs like Photoshop to make the most of a profiled display, we're glad to continue the discussion in the forums.
Larger than this, however, I hope that this walk-through gets at the idea that while the concepts may be complex, using color management tools to calibrate and profile your display really isn't hard and shouldn't be intimidating. With a growing number of options on the market and prices for these systems coming down into very reasonable territory, if you've been struggling to get your images under control, it's never been easier or cheaper to manage color in your workflow.
Stay tuned for Color Management, Part II, in which we'll bring things full-circle by profiling our printer output and do some analysis to see how much of a difference it ultimately makes.
Copyright 2000 - 2014, TechTarget