Visible light instruments include normal cameras that gather light and project it on a light-sensitive surface, just like the camera with which you take your holiday pictures or videos.
In early satellites and space probes, pictures were actually made on conventional film. Earth observation satellites could return that film to Earth on board a small re-entry capsule.
As that was much too complicated for interplanetary probes, these robots developed their film on board. This was done by putting a lamp on one side of the film and a light-sensitive sensor on the other side. Dark parts on the negative do not let much light through, while transparent areas do. In this way the film could be scanned by encoding the amount of light the sensor received into a series of 1's and 0's. The resulting numbers were then transmitted by radio to Earth, where they could be translated back into pictures.
Later, improved systems were based on television camera technology. Images were made up by scanning an object line by line, encoding the data and sending it back to Earth.
Modern space cameras work just like digital cameras that, on Earth, are now becoming more popular than film cameras. The captured light of an image falls directly on a sensor and is immediately digitally encoded. There is no longer a need for film or line-by-line scanning.
The sensor that makes this possible is called a CCD, for Charge-Coupled Device. A CCD consists of a vast matrix of detector elements, called pixels. Each pixel converts the energy of the light particles it receives into an electrical charge. The brighter the light that hits a CCD pixel, the higher the resulting electrical charge that accumulates on it.
This electrical charge is subsequently converted into a code of 1's and 0's, so it can be stored in the spacecraft's computer memory. The data is then relayed to Earth by radio, where the signals of all individual pixels are combined according to the matrix layout in the camera. The end result is a complete picture from another world, much like your computer converts coded data in its memory into a picture on its screen.
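This round trip can be sketched in a few lines of code. The 4×4 layout and all pixel values below are invented for illustration; a real camera has millions of pixels and more elaborate encoding.

```python
# Sketch of the CCD round trip: pixel charges are digitized on board,
# sent as a flat stream of numbers, and reassembled into a matrix on Earth.
# The 4x4 layout and all values are invented for illustration.

WIDTH, HEIGHT = 4, 4

# Electrical charge accumulated on each pixel (arbitrary units, 0 to 1).
charges = [0.1, 0.8, 0.9, 0.2,
           0.3, 1.0, 0.7, 0.1,
           0.0, 0.5, 0.6, 0.4,
           0.2, 0.3, 0.1, 0.0]

# On board: encode each charge as an 8-bit number (a code of 1's and 0's).
encoded = [round(c * 255) for c in charges]

# On Earth: rebuild the picture row by row according to the matrix layout.
image = [encoded[row * WIDTH:(row + 1) * WIDTH] for row in range(HEIGHT)]

for row in image:
    print(row)
```

The essential point is that only the flat list of numbers travels by radio; the matrix layout, known on both ends, lets the ground station put every pixel back in its place.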
The individual images are in black and white, because the CCDs only register amounts of light and not its color. However, the cameras can make different images of the same object through different color filters. Using a red filter for instance means that what the CCDs register is the red light from the target; light of other colors cannot pass the filter. Combining the images made through a minimum of three filters (red, green and blue) allows us to create a realistic, full-color image.
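As a rough illustration (with invented 2×2 brightness values), merging the three filtered images amounts to treating each one as a single color channel of every pixel:

```python
# Three black-and-white images of the same scene, taken through
# red, green and blue filters (invented 2x2 brightness values, 0-255).
red   = [[200,  50], [ 30, 120]]
green = [[ 40, 180], [ 30, 110]]
blue  = [[ 10,  20], [200, 100]]

# Combine them: each color pixel is the (R, G, B) triple formed from
# the same position in the three filtered images.
height, width = len(red), len(red[0])
color = [[(red[y][x], green[y][x], blue[y][x]) for x in range(width)]
         for y in range(height)]

print(color[0][0])  # the top-left pixel, dominated by its red value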
Spacecraft observing planets from space often carry more than one camera. Optical instruments with a wide field of view can see a large part of a planet, but not focus on small details. Cameras with a narrower field of view, called narrow-angle, cover less surface but may be able to show details less than a meter in width.
The broad view, or wide-angle, systems are utilized on the Earth-orbiting meteorological satellites that make the images for the weather forecast on television. On interplanetary spacecraft they are used to capture the results of massive geological forces, such as giant volcanoes, extensive canyons and enormous impact craters.
Those that can focus on small objects - the narrow-angle cameras - can be used to study the edge of a crater, or the blocks of stone that were expelled by the impact of a large meteorite.
When talking about space cameras, the word "resolution" is often used. The resolution is the size of the smallest objects that can still be individually distinguished on a picture made by a camera. For example, a system with a resolution of 2 meters can make out a car from space, but not the driver standing next to it. With a resolution of a few centimeters you can read a newspaper's headlines from orbit (apparently modern Earth-orbiting spy satellites are able to do this, but no one working with them is allowed to admit that, of course).
In CCD cameras, the resolution corresponds to the size of the area represented by a single pixel in the digital photo. When you zoom in on such a picture, you'll see that it is built up of individual blocks. Each block represents a pixel and thus objects smaller than such a block cannot be seen as individual items.
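A toy calculation, assuming an idealized pinhole-camera model and invented numbers, shows how the ground size of one pixel (and hence the resolution) follows from orbit altitude, focal length and detector pixel size:

```python
# Ground sample distance (GSD) for an idealized pinhole camera:
# GSD = altitude * pixel_pitch / focal_length.
# All numbers below are invented for illustration.

altitude = 400_000.0   # orbit altitude in meters
focal_length = 2.0     # camera focal length in meters
pixel_pitch = 10e-6    # size of one detector pixel in meters

gsd = altitude * pixel_pitch / focal_length
print(f"one pixel covers {gsd:.1f} m on the ground")

# Roughly speaking, an object needs to span at least two pixels
# to be distinguished as an individual item.
print("4 m car visible:", 4.0 >= 2 * gsd)
print("0.5 m driver visible:", 0.5 >= 2 * gsd)
```

With these made-up numbers each pixel covers 2 meters of ground, matching the car-but-not-driver example above.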
Another important parameter for a camera is its sensitivity - a measure for the weakest light source it is able to detect. It's similar to the ISO number for the film in conventional cameras. The higher the sensitivity, the less light is needed to make an image.
The nice thing about visible light cameras is that they give us a look at a planet as if we were actually there. You don't need to be a trained scientist to recognize a huge canyon on a picture of Mars or a crater on the Moon. Because of this, the pictures in visible light are the ones that end up in newspapers and on the covers of popular magazines. They are turned into posters, downloaded to serve as wallpaper background for computer screens and decorate mouse pads, T-shirts and coffee mugs.
With two cameras looking at the same thing under slightly different angles we can make stereoscopic images. When looking through a viewer that shows one image to your left eye and the other to your right eye, you get a perception of depth in the image. This is a great way to get a better understanding of a planet's landscape in three dimensions. Moreover, by using both images you can calculate the height of mountains and other features in the picture.
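The height calculation can be sketched under a much-simplified vertical-photography model, in which a raised feature shifts by a small "differential parallax" between the two views. All numbers below are invented for illustration:

```python
# Simplified stereo photogrammetry: a feature raised above the surface
# shifts by a differential parallax dp between the two images.
# Approximate height: h = H * dp / (b + dp), where H is the camera
# altitude and b the stereo base (distance between the two viewpoints).
# All values are invented for illustration.

H  = 300_000.0   # altitude above the surface, in meters
b  = 60_000.0    # stereo base, in meters
dp = 1_000.0     # measured differential parallax, in ground units (meters)

h = H * dp / (b + dp)
print(f"estimated feature height: {h:.0f} m")
```

Real photogrammetry corrects for camera tilt, lens distortion and planet curvature, but the principle is the same: the larger the shift between the two views, the taller the feature.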
Stereoscopic images make for great posters and popular books as well; the three-dimensional aspect can be cheaply reproduced by printing the two pictures in different colors and supplying a pair of glasses with one green and one red lens. The green lens allows only green light to pass, so that one eye sees only the green image and, similarly, your other eye sees only the red image. In this way each eye sees a different picture with a slightly different view angle, resulting in a three-dimensional image.
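Building such a two-color (anaglyph) picture can be sketched in code: the left view is placed in one color channel and the right view in the other (the 2×2 pixel values are invented for illustration):

```python
# Build a two-color anaglyph from a stereo pair (invented 2x2 images).
# One color channel carries the left-eye view, the other the
# right-eye view; the colored glasses separate them again.
left  = [[100, 150], [200,  50]]   # left-eye image (grayscale, 0-255)
right = [[110, 140], [190,  60]]   # right-eye image (grayscale, 0-255)

height, width = len(left), len(left[0])
# Each anaglyph pixel: (R, G, B) = (left brightness, right brightness, 0).
anaglyph = [[(left[y][x], right[y][x], 0) for x in range(width)]
            for y in range(height)]

print(anaglyph[0])
```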
NASA's Sojourner Mars rover and its two more recent Mars Exploration Rovers carried stereo-camera systems to obtain 3-D images of the Martian landscape.
ESA's Mars Express orbiter also makes stereoscopic images, but uses only one camera. As it flies over an area, it looks at features on the ground from different angles at different times. Combining the views with a computer on Earth, it is possible to create very rich stereoscopic images with an unprecedented amount of detail. Mars Express is the first spacecraft to apply this novel technology.
Apart from making great pictures, visible light can also be used to make spectroscopic measurements. With a spectroscope you can see the overall spectrum of radiation an object reflects or emits, and the absorption and emission lines that indicate which atoms are present. With this information, you can thus find out what material something is made of, purely from the light you receive from it.
For instance, we know very well what sunlight reflected by water looks like, because we can study that here on Earth. If a satellite in orbit around another planet detected a spectrum identical to that of water on Earth, we would know we had found water on another world.
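A crude sketch of this matching idea: compare an observed spectrum against a reference measured on Earth, and flag a match when the two agree closely. Both spectra and the tolerance below are invented for illustration:

```python
# Compare an observed reflectance spectrum with a laboratory reference
# for water. All sample values are invented for illustration; a real
# instrument records hundreds of wavelength samples.

reference_water = [0.90, 0.70, 0.20, 0.10, 0.05]   # lab spectrum of water
observed        = [0.88, 0.71, 0.19, 0.10, 0.06]   # from the orbiter

# Mean absolute difference between the two spectra.
diff = sum(abs(a - b) for a, b in zip(reference_water, observed)) / len(observed)

THRESHOLD = 0.05  # arbitrary tolerance for declaring a match
print("water detected" if diff < THRESHOLD else "no match")
```

Real spectral matching also accounts for lighting conditions, instrument noise and mixtures of materials, but the core idea is this comparison against known references.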