Color in Color Images

To most people, adjusting the color in an image means setting the relative strengths of its red, green, and blue channels so that it "looks right." This definition works reasonably well for familiar daytime objects illuminated by sunlight; everyone knows how everyday things should look. In the world of astronomical imaging, however, where the human eye cannot perceive the colors of faint celestial objects, we must either base our notions of them on other images, or look for objective ways to balance the color channels in images.

21.1.1 White Balance

Because white balance is crucial to making acceptable images, most digital cameras come with built-in routines to handle it. If you set the white-balance control to match the type of scene illumination (direct sunlight, tungsten, fluorescent, cloudy, shady), you'll get images with reasonably accurate color balance. The manufacturer has measured typical scenes under these types of light and stored matrix coefficients in the camera's firmware so that a 100% reflective white object photographed under each illumination comes out white. In sRGB color space, "white" means that the red, green, and blue color channels each equal 255 ADUs.
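The idea can be sketched in a few lines of Python. The sketch below reduces the full matrix of coefficients to simple per-channel gains; the gain values are purely illustrative, not actual firmware coefficients, and the function name is mine.

    import numpy as np

    # Illustrative daylight gains only; a real camera stores measured
    # matrix coefficients for each illuminant in firmware.
    DAYLIGHT_GAINS = np.array([2.0, 1.0, 1.6])   # hypothetical R, G, B multipliers

    def white_balance(raw_rgb, gains=DAYLIGHT_GAINS):
        """Multiply each channel by its gain so a white object reaches R = G = B."""
        balanced = raw_rgb.astype(float) * gains
        return np.clip(balanced, 0, 255).astype(np.uint8)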

Use the Camera White Balance. Your digital camera's "sunlight" or "direct sunlight" setting should yield accurate color balance for almost all astronomical images. The camera will select a set of matrix coefficients that will give a sunlike light source equal values of red, green, and blue.

Figure 21.12 Case study: Luminance-channel processing opens color images to a wide variety of powerful image-processing functions. On the left is the original image of the Orion Nebula; on the right, the same image enhanced with wavelet spatial filtering to sharpen and heighten nebular details.

Use the Peak Values in the Image. A fairly reliable way of getting a good-looking image is to measure the peak pixel value in each color channel, and then scale the color channels so that the peak value in each will be 255 in the finished image. This method works quite well for terrestrial scenes, where it is reasonable to expect that the brightest object should look white. Because the orangish cast of tungsten light or the greenish cast of fluorescent light appears in the lightest-colored object in the scene, scaling that object to the maximum in every channel forces it to appear white.
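A minimal sketch of this peak-value method in Python, assuming an 8-bit RGB array (the function name is mine, not AIP4Win's):

    import numpy as np

    def scale_to_peak(rgb):
        """Scale each channel so its brightest pixel becomes 255."""
        peaks = rgb.reshape(-1, 3).max(axis=0).astype(float)
        scaled = rgb.astype(float) / peaks * 255.0   # per-channel stretch
        return np.clip(scaled, 0, 255).astype(np.uint8)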

Use Histogram Percentiles to Set Black and White. In astronomical images, the peak pixel values are usually due to one or two bright stars, and the lowest pixel values are the colors of the sky background. Rather than setting the white point to the peak values, in astronomical images it is best to assume that the average color of several dozen or several hundred stars is close to white. To accomplish this, create a histogram for each color channel and set the white point to the 99.5th percentile in each channel. Since most people want the sky background to be a very dark shade of gray, sampling a few hundred spots on a deep-sky image and finding their median value is a reliable way to determine the brightness of the sky background. The red, green, and blue color channels can then each be stretched between the black and white points to give a black sky and an average star color of white.
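Here is one way the percentile method might look in Python. For simplicity the sketch uses the median of the whole channel as a stand-in for sampling a few hundred sky-background spots, a reasonable approximation for a deep-sky frame dominated by background; the names and defaults are mine.

    import numpy as np

    def percentile_stretch(rgb, white_pct=99.5):
        """Stretch each channel between a sky-background black point and a
        percentile-based white point."""
        out = np.empty(rgb.shape, dtype=float)
        for c in range(3):
            channel = rgb[..., c].astype(float)
            black = np.median(channel)                 # sky-background estimate
            white = np.percentile(channel, white_pct)  # "average star" white point
            out[..., c] = (channel - black) / max(white - black, 1e-6) * 255.0
        return np.clip(out, 0, 255).astype(np.uint8)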

• Tip: AIP4Win's Color Image Tool includes a tab with functions that can automatically measure the histogram of a color image to find the best value in each color channel for the sky-background black point and the white point.

Figure 21.13 Case study: In HSL color space, you can enhance color images with powerful algorithms. The left image has been sharpened from the original by 20 iterations of the Lucy-Richardson deconvolution algorithm. In the right image, the luminance has been brightness-scaled with the gammalog function.

21.1 Luminance Enhancement of Color Images

Color balance is just the beginning for color image processing. In color images from digital cameras, the luminance channel always has a higher signal-to-noise ratio than the chrominance channels do, so you can apply powerful image-processing routines to the luminance channel. And because changes to the luminance channel don't alter the color balance, you can experiment freely.
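One simple way to exploit this, sketched below, is to derive a luminance channel, enhance it with any function you like, and then rescale red, green, and blue by the ratio of new to old luminance so the color ratios, and hence the color balance, are preserved. The plain channel average used here is a crude luminance estimate; a true HSL conversion would be more faithful.

    import numpy as np

    def apply_to_luminance(rgb, enhance):
        """Apply `enhance` to a luminance estimate, leaving color ratios alone.

        rgb: float array (H, W, 3) scaled 0..1.
        enhance: any function mapping a 2-D luminance array to a 2-D array.
        """
        lum = rgb.mean(axis=2)                       # crude luminance estimate
        ratio = enhance(lum) / np.maximum(lum, 1e-6)
        return np.clip(rgb * ratio[..., None], 0.0, 1.0)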

Brightness Scaling. Important structures of galaxies and nebulae often have pixel values close to that of the sky background. Revealing these structures using simple brightness stretches usually ends up blowing out the bright parts of the galaxy or nebula. Brightness scaling offers options, such as the gammalog function, that simultaneously stretch low pixel values and compress high ones. This enables you to brighten the faint outer structures without losing the bright central arms and nucleus of your favorite galaxy.
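AIP4Win's exact gammalog formula is not reproduced here; the sketch below combines a logarithmic stretch with a gamma curve to show the general behavior of lifting faint values while compressing the bright end. The constants are illustrative.

    import numpy as np

    def gammalog(lum, k=200.0, gamma=0.9):
        """Log stretch followed by a gamma curve; lum is luminance scaled 0..1."""
        stretched = np.log1p(k * lum) / np.log1p(k)   # lifts faint pixels strongly
        return stretched ** gamma                      # shapes the bright end

Applied through a wrapper like apply_to_luminance above, it brightens the faint outer structure without shifting hues.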

Histogram Shaping. Histogram shaping is capable of converting an image with a few bright stars into one filled with swirling wreaths of interstellar gas. It forces the luminance data of an image into a histogram chosen ahead of time for superior image-display properties. One of the most effective histogram shapes is exponential, a form that allows a steadily declining number of pixels at higher and higher pixel values, with the result that details from the very faintest to the very brightest appear in a single image.
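A minimal histogram-specification sketch: rank every luminance pixel to get its empirical cumulative fraction, then push it through the inverse CDF of an exponential distribution truncated to the range 0 to 1. The scale parameter is illustrative.

    import numpy as np

    def shape_to_exponential(lum, scale=0.25):
        """Force luminance (0..1) into an exponential-shaped histogram."""
        flat = lum.ravel()
        ranks = np.argsort(np.argsort(flat)) / (flat.size - 1)   # empirical CDF
        # inverse CDF of an exponential distribution truncated to [0, 1]
        shaped = -scale * np.log(1.0 - ranks * (1.0 - np.exp(-1.0 / scale)))
        return shaped.reshape(lum.shape)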

Unsharp Masking. Unsharp masking is the classic method for revealing details in lunar and planetary images. However, if you split an image into its red, green, and blue channels, apply unsharp masking to each in turn, and then recombine them into a color image, color artifacts often appear around edges and bright features. By enhancing the luminance channel while leaving the underlying chrominance channels untouched, fine details appear clearly without introducing color artifacts.
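A luminance-only unsharp mask might be sketched as follows (SciPy is assumed for the Gaussian blur; radius and amount are illustrative). The hue never changes because red, green, and blue are all rescaled by the same factor.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_luminance(rgb, radius=2.0, amount=1.5):
        """Classic unsharp mask applied to luminance only; rgb is float 0..1."""
        lum = rgb.mean(axis=2)                          # simple luminance estimate
        blurred = gaussian_filter(lum, sigma=radius)
        sharp = lum + amount * (lum - blurred)          # unsharp mask
        ratio = sharp / np.maximum(lum, 1e-6)
        return np.clip(rgb * ratio[..., None], 0.0, 1.0)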

Wavelet Spatial Filtering. Wavelet filtering has been called "unsharp masking on steroids"—but wavelets are legal! Wavelet filtering gives you the ability to selectively enhance the contrast of image details at every spatial scale, from tiny features a single pixel wide to features spanning hundreds of pixels. Figure 21.12 shows wavelet filtering on our standard Orion Nebula test image.
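The à trous (starlet) transform is one common way to implement this kind of multiscale filtering; the sketch below splits a luminance image into detail layers of increasing scale and recombines them with per-scale gains. It is a generic illustration, not AIP4Win's implementation, and the gains are illustrative.

    import numpy as np
    from scipy.ndimage import convolve

    def atrous_layers(lum, n_scales=4):
        """Split an image into wavelet detail layers plus a smooth residual."""
        k = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
        kernel = np.outer(k, k)                     # B3-spline smoothing kernel
        layers, current = [], lum.astype(float)
        for scale in range(n_scales):
            step = 2 ** scale                       # insert holes between taps
            holed = np.zeros((4 * step + 1, 4 * step + 1))
            holed[::step, ::step] = kernel
            smoothed = convolve(current, holed, mode='reflect')
            layers.append(current - smoothed)       # detail at this scale
            current = smoothed
        layers.append(current)                      # final smooth residual
        return layers

    def wavelet_sharpen(lum, gains=(1.8, 1.4, 1.1, 1.0)):
        """Boost small-scale detail layers more than large-scale ones."""
        layers = atrous_layers(lum, n_scales=len(gains))
        return layers[-1] + sum(g * d for g, d in zip(gains, layers[:-1]))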

Deconvolution. Amateur astronomers use deconvolution to restore star images in images shot on nights with poor seeing. Starting with a sampled star image taken from the image itself, the Lucy-Richardson deconvolution algorithm attempts to discover the source image that most probably produced the degraded image captured by the image sensor. For sharpening images with soft star images, deconvolution is hard to beat.
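The core of the Lucy-Richardson iteration is short enough to sketch directly (SciPy is assumed; a production version adds safeguards this bare loop omits). The PSF here would be the sampled star image mentioned above, normalized to unit sum.

    import numpy as np
    from scipy.signal import fftconvolve

    def lucy_richardson(image, psf, iterations=20):
        """Bare-bones Lucy-Richardson deconvolution."""
        psf = psf / psf.sum()
        psf_mirror = psf[::-1, ::-1]
        estimate = np.full(image.shape, float(image.mean()))
        for _ in range(iterations):
            blurred = fftconvolve(estimate, psf, mode='same')
            ratio = image / np.maximum(blurred, 1e-12)
            estimate *= fftconvolve(ratio, psf_mirror, mode='same')
        return estimate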

Gradient Correction. It would be nice if the flat-fields used in calibration corrected all of the sources of nonuniformity in images, such as gradients in sky brightness, hot spots, and scattered light, but they do not. Fortunately, a large class of algorithms exists that can remove these unwanted effects from digital images. The simplest gradient corrections fix images that are brighter on one side than on the other; sophisticated gradient corrections can flatten images with lumpy sky backgrounds.
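The simplest case, a linear gradient, can be corrected by least-squares fitting a plane to the frame and subtracting it, as sketched below. A robust version would fit only to sampled background regions so stars and the galaxy itself don't bias the fit.

    import numpy as np

    def remove_planar_gradient(lum):
        """Fit a plane a*x + b*y + c to the frame and subtract it, keeping the mean."""
        h, w = lum.shape
        y, x = np.mgrid[0:h, 0:w]
        A = np.column_stack([x.ravel(), y.ravel(), np.ones(lum.size)])
        coeffs, *_ = np.linalg.lstsq(A, lum.ravel().astype(float), rcond=None)
        plane = (A @ coeffs).reshape(h, w)
        return lum - plane + plane.mean()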

Noise Removal. Image smoothing, noise filtering, and iterative noise removal are different approaches to making images less noisy. Image smoothing operates by averaging pixels below a threshold value; noise filtering scans an image and suppresses pixels that stand out from their local surroundings; and iterative noise filtering uses wavelet analysis of noise amplitudes to remove statistically insignificant variations in pixel value. Applied to the luminance channel of a color image, these techniques give rough or grainy images a milky smoothness.
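A minimal sketch of the first approach, thresholded smoothing: pixels below the threshold (the noisy sky background) are replaced by a local average, while stars and bright detail above the threshold are left alone. SciPy is assumed, and the threshold is chosen by the user.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def threshold_smooth(lum, threshold):
        """Average only the pixels below `threshold`; leave brighter pixels intact."""
        smoothed = uniform_filter(lum.astype(float), size=3)   # 3x3 local mean
        return np.where(lum < threshold, smoothed, lum)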
