## Apparent Magnitude

Apparent magnitude is a measure of how bright a star appears (see Section 1.7). The modern magnitude scale defines a first-magnitude star to be exactly 100 times brighter than a sixth-magnitude star.

This ratio agrees with the way our eyes respond to increases in the brightness of stars. What we perceive as equal steps in brightness (a difference of one magnitude) corresponds to equal ratios of measured brightness (each step is the fifth root of 100, or about 2.512 times brighter).

Magnitude differences between stars measure the relative brightness of the stars. Table 3.2 lists approximate brightness ratios corresponding to sample magnitude differences.
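The ratios in Table 3.2 all follow from one rule: each magnitude step multiplies brightness by the fifth root of 100 (about 2.512), so a difference of Δm magnitudes corresponds to a ratio of 100^(Δm/5). A minimal sketch of that rule (the function name is illustrative, not from the text):

```python
# Brightness ratio corresponding to a magnitude difference.
# Each magnitude step is a factor of 100 ** (1/5), roughly 2.512.
def brightness_ratio(delta_m):
    return 100 ** (delta_m / 5)

# Reproduce a few Table 3.2 entries.
for dm in (1, 2, 5, 10, 25):
    print(dm, brightness_ratio(dm))
```

Rounding each result reproduces the table: 1 magnitude gives about 2.5, 5 magnitudes exactly 100, 25 magnitudes 10 billion.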

Remember that the most negative magnitude numbers identify the brightest objects, while the largest positive magnitude numbers identify the faintest objects.

Refer to Tables 3.2 and 3.3. How much brighter does the Sun appear than Sirius? Explain.

Answer: 10 billion times brighter.

Solution: The magnitude difference is (-1.4) - (-26.7) = 25.3, or about 25, corresponding to a brightness ratio of 10,000,000,000:1.
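A quick numerical check of this answer, using the exact magnitudes from Table 3.3 (a sketch, not part of the text):

```python
# Sun vs. Sirius, apparent magnitudes from Table 3.3.
m_sun, m_sirius = -26.7, -1.4

delta_m = m_sirius - m_sun       # 25.3 magnitudes
ratio = 100 ** (delta_m / 5)     # about 1.3e10

print(f"{ratio:.2e}")            # a bit over 10 billion
```

Using the exact difference of 25.3 gives about 1.3 × 10^10; rounding the difference to 25 gives the 10-billion figure listed in Table 3.2.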

TABLE 3.2 Magnitude Differences and Brightness Ratios

| Difference in Magnitude | Brightness Ratio |
|---|---|
| 0 | 1:1 |
| 1 | 2.5:1 |
| 2 | 6.3:1 |
| 3 | 16:1 |
| 4 | 40:1 |
| 5 | 100:1 |
| 6 | 251:1 |
| 10 | 10,000:1 |
| 15 | 1,000,000:1 |
| 20 | 100,000,000:1 |
| 25 | 10,000,000,000:1 |

TABLE 3.3 Sample Magnitude Data

| Subject | Description | Apparent Magnitude | Absolute Magnitude |
|---|---|---|---|
| Sun | | -26.7 | 4.8 |
| 100-watt bulb | At 3 m (10 ft) | -18.7 | 66.3 |
| Moon | Full | -12.5 | 32 |
| Venus | At brightest | -4.7 | 28 |
| Sirius | Brightest star | -1.4 | 1.5 |
| Alpha Centauri | Closest seeable star | 0 | 4.4 |
| Andromeda Galaxy | Farthest seeable object | 3.5 | -21 |

## 3.15 Absolute Magnitude

Absolute magnitude is a measure of luminosity, or how much light a star is actually radiating into space. If you could line up all stars at the same distance from Earth, you could see how they differ in intrinsic, or "true," brightness.

Astronomers define a star's absolute magnitude as the apparent magnitude the star would have if it were located at a standard distance of 10 parsecs from us. With the effects of distance canceled out, they can use absolute magnitude comparisons to determine differences in the actual light output of stars (Figure 3.14). If a star is farther than 10 parsecs from us, its apparent magnitude is numerically bigger than its absolute magnitude. (Large positive magnitude numbers indicate faint objects.) For example, Polaris is 130 pc away. Its apparent magnitude is +2.0, whereas its absolute magnitude is -4.1.

On the other hand, if a star is closer than 10 parsecs, its apparent magnitude is numerically smaller than its absolute magnitude. For example, Sirius is 2.6 pc away. Its apparent magnitude is -1.4, whereas its absolute magnitude is only +1.5.
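The relation behind these comparisons is the standard distance-modulus formula, M = m - 5 log10(d/10), with d in parsecs; the text does not state it explicitly, so this sketch is an assumption about the underlying arithmetic:

```python
import math

# Distance-modulus relation: M = m - 5 * log10(d / 10), d in parsecs.
# A star exactly 10 pc away has M equal to its apparent magnitude m.
def absolute_magnitude(m, d_pc):
    return m - 5 * math.log10(d_pc / 10)

# Sirius: apparent magnitude -1.4 at 2.6 pc.
print(round(absolute_magnitude(-1.4, 2.6), 1))  # +1.5, matching the text
```

For Sirius (closer than 10 pc) the formula gives an absolute magnitude numerically larger than the apparent one, exactly as described above.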

Consider the two bright stars Deneb and Vega. Refer back to Table 1.1 to fill in the chart below. Then tell (a) which looks brighter? (b) Which is really more luminous? (c) What factor makes your answers to (a) and (b) differ?

Two Bright Stars

| Star | Constellation | Apparent Magnitude | Absolute Magnitude |
|---|---|---|---|
| Deneb | | (a) | (b) |
| Vega | | (c) | (d) |

Answer: Chart: (a) 1.25; (b) -7.5; (c) 0.03; (d) 0.6.

(a) Vega (numerically smaller apparent magnitude). (b) Deneb (numerically more negative absolute magnitude). (c) The distance the stars are from us. 
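Combining the chart's absolute magnitudes with the rule from Table 3.2 also shows how much more luminous Deneb really is. A sketch using the answer-chart values (the computed factor is illustrative, not stated in the text):

```python
# Absolute magnitudes from the filled-in chart: Deneb -7.5, Vega 0.6.
M_deneb, M_vega = -7.5, 0.6

# Luminosity ratio from the magnitude difference of 8.1.
ratio = 100 ** ((M_vega - M_deneb) / 5)
print(round(ratio))  # Deneb is roughly 1,700 times more luminous than Vega
```

Even though Vega looks brighter from Earth, at the standard 10-parsec distance Deneb would outshine it by a factor of over a thousand.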