The Deep Red Advantage

At first it might not be clear how the LRGB technique can be applied to curing atmospheric dispersion. Surely the luminance image still captures the full dispersive smearing, albeit as a monochrome smearing? This is true, but

Figure 8.3. Once LRGB processing is checked, the LRGB sliders in Registax allow the luminance signal to be constructed from any mixture of the color channel signals.

amateurs have developed some powerful methods for coping with a planet that is low down. For a start, the luminance signal need not be derived from an unfiltered monochrome image, or even a UV-IR rejected one. A narrow-band filter that captures the maximum contrast on a specific planet can be used to dramatic effect, and if that narrow-band image is taken in the deep red, the results can be very dramatic indeed. To anyone who knows about optics this might seem counterintuitive. After all, blue light has a shorter wavelength than red light, and a shorter wavelength means higher theoretical resolution, right? Wrong: in this case, basic theory and practice do not line up. Atmospheric seeing is far, far better in the deep red even though the wavelength is longer! A near-infrared filter is the closest thing you can get to a seeing filter. If you can take a luminance frame in the near infrared, the resolution will be superb, especially on high-contrast objects like the Moon and Mars. Remarkably, Jupiter also responds well. CCDs are especially sensitive at the red end of the spectrum, and less light is scattered in the red, too. If you do not care about color information, then things are really looking up. Near-infrared monochrome shots of the Moon, Mars, and Jupiter can all look breathtaking, even when the object is at a fairly low altitude.

If you do want a color image, then you can indulge in some pretty clever techniques to exploit the extra resolution gained in the deep red. This is especially powerful on Mars, where the planet is mainly red or pink in appearance anyway, so using red as a luminance image will not introduce any grossly inaccurate color effects. Damian Peach has produced some stunning color images of Jupiter at only 30 degrees altitude by imaging the planet with a monochrome ATiK webcam through deep red (for the sharpest frames) and blue filters. To synthesize green he simply averages the deep red and blue signals. The resulting images look identical to RGB images, except that green was never used! In the case of Jupiter, this saves valuable time, as Jupiter rotates quickly. As far as I am aware, this "synthesis of green by averaging red and blue" technique was pioneered by Antonio Cidadao. I still find it amazing that a full-color image can be obtained from just red and blue! However, there are not many green planetary features, so synthesizing green is probably not a critical test. If, to gain extra sharpness, the luminance signal is biased more toward the red, some color imperfection can be expected, but this is a small price to pay for a sharp image. In practice, Mars' color does not suffer much from a red luminance, Jupiter's does, and the luminance image for Saturn is often chosen as green or green + red, but different imagers have different techniques.
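The green-synthesis arithmetic described above is simple enough to sketch in a few lines. The following is a minimal illustration, assuming NumPy and that the stacked red and blue frames have already been aligned; the array values and function name are hypothetical, not taken from any particular software.

```python
import numpy as np

def synthesize_green(red: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Synthesize a green channel as the pixel-by-pixel mean of red and blue."""
    return (red.astype(np.float64) + blue.astype(np.float64)) / 2.0

# Hypothetical 2x2 stacked frames (intensities on a 0-255 scale)
red = np.array([[200.0, 180.0], [160.0, 140.0]])
blue = np.array([[100.0, 120.0], [80.0, 60.0]])

green = synthesize_green(red, blue)          # e.g. top-left pixel: (200+100)/2 = 150
rgb = np.dstack([red, green, blue])          # a full-color image from only two filters
```

Stacking the three planes then gives an RGB image even though no green-filtered frames were ever captured.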

At this point the reader might become just a bit confused about exactly what color is all about. Surely, there are only red, green, and blue subpixels on a computer screen? Where does the L value come in? Is there an extra L subpixel? No, there is no L subpixel. PC screens are made from an array of red, green, and blue subpixels, and the colors you actually see depend on the relative ratios of the red, green, and blue subpixels. The luminance (brightness) you see, however, is concerned with how brightly those subpixels are glowing. You can have a yellow object (red and green subpixels equal in brightness, blue subpixel turned off), but if the luminance value is low for that pixel, then all the subpixels will be dimmed by the same amount. That luminance (brightness) value can be derived from any narrow-band filter you like. However, despite the fact that luminance determines the perceived brightness/contrast between different pixels and not the color, it can skew the perceived color dramatically. After all, if you imaged a red planet like Mars through a blue filter and used blue for the luminance information, the planet would appear virtually black. The RGB information might say it was still a red planet, but an almost black red planet does not look like the bright red/orange Mars we are used to! Mars, as I have said above, is the best planet to use a deep red filter with, because not only is the planet red anyway, but seeing is always better in the red. An LRGB image of Mars with the L component derived from a deep red filtered image can give a pretty true color rendering.
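The pixel-level arithmetic behind this can be made concrete. The sketch below, assuming NumPy and a hypothetical `lrgb_combine` helper (not any specific software's routine), keeps only the ratios of R:G:B from the color data and lets the luminance frame set the brightness, which is exactly why a blue-filter luminance renders Mars nearly black:

```python
import numpy as np

def lrgb_combine(lum, r, g, b, eps=1e-9):
    """Color each luminance pixel with the RGB ratios from the color frames.

    The color frames supply only hue (the R:G:B ratio at each pixel);
    the luminance frame supplies the perceived brightness.
    """
    rgb = np.dstack([r, g, b]).astype(np.float64)
    brightness = rgb.max(axis=2, keepdims=True)   # per-pixel brightness of the color data
    ratios = rgb / (brightness + eps)             # ratios only, brightest channel -> 1.0
    return ratios * lum[..., None]                # scale the ratios by the luminance

# Hypothetical single "Mars" pixel: strongly red in the RGB data
r = np.array([[200.0]]); g = np.array([[100.0]]); b = np.array([[40.0]])

deep_red_lum = np.array([[200.0]])   # bright in a deep red filter
blue_lum = np.array([[10.0]])        # Mars is dim through a blue filter

bright_mars = lrgb_combine(deep_red_lum, r, g, b)  # ratios (1.0, 0.5, 0.2) at full brightness
dark_mars = lrgb_combine(blue_lum, r, g, b)        # same red ratios, but almost black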

Blue is a real problem area in planetary imaging, and not only because of the poorer seeing and greater atmospheric scattering at the blue end. The techniques used to compress webcam information, on the journey from webcam to PC, also favor the red colors, as does the CCD sensitivity itself. Even when imaging deep sky objects, filtered LRGB exposures often require twice the blue exposure of the red images. In fact, LRGB was originally invented for deep sky and not planetary work. Because filtered deep sky exposures are so noisy with respect to unfiltered ones (so much light is lost with already faint objects), it was realized that to get a deep, high-resolution image of a galaxy or nebula it was always best to take an unfiltered monochrome image: the signal-to-noise was just so much better. However, it was then realized that if you literally color the resulting low-noise monochrome image with the color ratios from the noisy color image, you get the best of both worlds: a clean, deep image with added color. At the pixel level, the ratio of brightness between the red, green, and blue subpixels colors the scene, and the depth of the monochrome image provides the low noise. In fact, the eye-brain perception of color helps here, too, because the eye is very sensitive to luminance resolution and far less sensitive to color resolution. So insensitive, in fact, that the filtered images supplying the deep sky color information can be taken at half the resolution, i.e., the pixels can be binned 2 x 2 so that four pixels contribute to each measured value. Indeed, for deep sky work the color information can even be provided by a smaller telescope.
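The 2 x 2 binning mentioned above is easy to express as array arithmetic. This is a minimal sketch, assuming NumPy and a sensor frame whose dimensions are even; the function name is my own, not a camera-driver API (real cameras usually bin on-chip before readout):

```python
import numpy as np

def bin2x2(img: np.ndarray) -> np.ndarray:
    """Sum each 2x2 block of pixels, so four pixels contribute to one value.

    The result has half the resolution but roughly four times the signal
    per output pixel, which is why binned color frames are less noisy.
    """
    h, w = img.shape
    img = img[: h // 2 * 2, : w // 2 * 2]              # trim any odd row/column
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# A hypothetical 4x4 frame bins down to 2x2
frame = np.arange(16, dtype=np.float64).reshape(4, 4)
binned = bin2x2(frame)   # each output pixel is the sum of a 2x2 block
```

Because the eye tolerates low color resolution, the half-size binned color frames can be upsampled and combined with a full-resolution luminance without the loss being noticeable.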

The LRGB technique is not quite as transforming in this respect for lunar and planetary work because, let's face it, the Moon and planets are very bright objects. But the technique is at its most powerful here in allowing deep red, narrowband, and nonblue filters to be used as the luminance signal, at colors where seeing is better and a planet has more detail. However, the LRGB technique does have a big disadvantage for planetary work. Namely, planets rotate, and, in practice, filters need changing in a short period of time. This can also necessitate some refocusing work, and everything has to be completed in a few minutes. If you can get by with just two filters, i.e., deep red and blue, in a reliable filter wheel, things need not be too fraught, but beginners will certainly prefer a one-shot color webcam approach on objects at a decent altitude, where serious dispersion and seeing problems are not an issue. One technique I have used to good advantage when imaging Mars is the two-webcam approach. Mars is a small planet with a much slower rotation period than Jupiter and Saturn, so there is plenty of time to take both a deep red AVI video with a monochrome, filtered webcam and a color video with a ToUcam Pro webcam. However, using a sensitive monochrome webcam and proper color filters is best. Commercial low-profile filter holders are widely available (Figure 8.4), or you can make your own, as I did (Figure 8.5). When buying a filter set I would advise purchasing green and blue filters with an infrared rejection coating, but a red filter without an infrared rejection coating, as well as

Figure 8.4. A commercial, lightweight, low-profile, manual filter wheel attached to an ATiK webcam. A low-profile system is of particular advantage to the Newtonian user where the light cone is fixed. Image: Jamie Cooper.
Figure 8.5. An ultra-low-weight home-made filter holder built by the author and his father for the author's 250-mm f/6.3 Newtonian. The filters are mounted in 13-mm-thick slabs of Perspex and slide down channels in a groove cut in the side of a 50-mm drawtube. Image: Martin Mobberley.

an infrared (I-band) filter spanning roughly 700 to 900 (or 1000) nanometers. The infrared filter is incredibly useful for low-altitude imaging of the Moon, Mars, and Jupiter, and the non-IR-blocked red filter is great for providing a strong red signal on high-altitude objects. CCDs are very sensitive in the infrared, and it is a shame if the infrared component is not used in filtered work. Conversely, with a color webcam, a UV-IR blocker is recommended for low-altitude work to restrict the effects of dispersion. The effects of atmospheric dispersion on a low-altitude Mars are clearly shown in the ToUcam image in Figure 8.6. A huge improvement is gained by using an infrared luminance filter and realigning the color layers, as seen in Figure 8.7. Saturn seen through red (non-IR-blocked), green (IR-blocked), and blue (IR-blocked) filters is nicely shown in Figure 8.8. Figure 8.9 shows the Maxim DL software tool for creating LRGB images.
