Getting It Right

At the Princeton meeting in the summer of 1996, Saul Perlmutter dropped a bombshell right at the epicenter of cosmology. The supernova evidence accumulated by his group at Lawrence Berkeley Lab favored a universe that was decelerating due to dark matter, with Ωm near one. Our high-z team didn't have much to say because we didn't have any results. We had methods that we thought were pretty good, we had found some supernovae, and we had some data in the pipeline, but we didn't have our own Hubble diagram to compare with Saul's.

At that same meeting, Mike Turner presented work that he and Lawrence Krauss had been developing. What if the total Ω is one, but the dark matter density is just what it appears to be, Ωm = 0.3? These statements could both be true if something besides dark matter contributes to the energy density of the universe. What if the rest of the energy density is made up of smoothly distributed dark energy, so that ΩΛ, the energy density associated with the cosmological constant, is a significant fraction of the universe? A similar set of arguments had been advanced by Paul Steinhardt and Jerry Ostriker in a recent Nature article. When I want to tease Jerry (always), I say that he applied deep ideas, noting that if Ω = Ωm + ΩΛ, and if you just know in your heart that inflation means Ω = 1 and you know from observation that Ωm = 0.3, then using the powerful theoretical method of "subtraction" even I could compute that ΩΛ = 0.7. If the universe isn't made of dark matter, it must be made of dark energy. Theory really isn't so difficult.
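The "powerful theoretical method of subtraction" really is this simple. A minimal sketch, using the values quoted in the text:

```python
# If inflation demands a flat universe (total Omega = 1) and observation
# gives a matter density near 0.3, the cosmological constant must supply
# the rest.
omega_total = 1.0    # flatness, from inflation
omega_m = 0.3        # dark matter density, from observation
omega_lambda = omega_total - omega_m
print(omega_lambda)  # 0.7
```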

Like all effective teasing, this is a little unfair, because there is another cosmological fact that Turner and Ostriker and Steinhardt could match with Λ, but could not match without it. That is the age of the universe. If the Hubble constant was something like 80 kilometers per second per megaparsec, as initial observations with the Hubble Space Telescope then suggested, there was a real problem. If Ωm = 1, then the true age is two-thirds of the apparent age, because of deceleration. The apparent age, the Hubble time, is 12 billion years, and two-thirds of 12 is 8 billion years. This was not in good accord with the best measurements for the ages of the globular cluster stars, which appeared to be older: at that time, the experts put the globular cluster ages around 15 billion years. So, according to the logic of the case, there was a problem with Ωm = 1, and a need for Λ to make up the balance of the mass-energy in the universe.
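That arithmetic is easy to check. A quick sketch of the age calculation just described, assuming H0 = 80 km/s/Mpc:

```python
# The Hubble time is 1/H0; in a matter-dominated (Omega_m = 1) universe
# the true age is two-thirds of that, because of deceleration.
KM_PER_MPC = 3.086e19          # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16         # seconds in a billion years

H0 = 80.0                      # km/s/Mpc, the early HST value in the text
hubble_time_gyr = (KM_PER_MPC / H0) / SEC_PER_GYR
matter_age_gyr = (2.0 / 3.0) * hubble_time_gyr

print(round(hubble_time_gyr, 1))  # ~12.2 Gyr, the "apparent age"
print(round(matter_age_gyr, 1))   # ~8.1 Gyr, younger than the globular clusters
```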

Turner, Ostriker, and Steinhardt are excellent debaters. They make a case like prosecuting attorneys. Listening to the presentation, you are inexorably swept along to the conclusion. Except science is not law. Convincing the jury is not enough. Although you would always like to convince the jury of informed opinion that your view is correct, the data have the final word. And Saul Perlmutter had presented data on supernovae that indicated deceleration and Ωm near one, and that left no room for this reincarnation of the cosmological constant. Theorists are valuable as long as they are stimulating. It is not so important for them to be correct. Observations, on the other hand, are useful only when they are right.

The result that Saul presented at Princeton was published in July 1997 in The Astrophysical Journal. Their best estimate of Ωm was 0.88, and they asserted that the data put the strongest known upper limit on the energy density associated with the cosmological constant: ΩΛ less than 0.1. Of course, this was just a preliminary result, and the SCP promised much more data in the coming year, but they had put their stamp on the field. They took the theoretical argument head on, saying their results were "inconsistent with Λ-dominated cosmologies that have been proposed to reconcile the ages of globular cluster stars with higher Hubble constant values."1

The group at Lawrence Berkeley Lab were cool to the idea of another group working in the same area. But our own high-z team didn't need their permission. We just had to make the case to the people who decide about scarce telescope time that it was worthwhile to have another team at work on this important subject. I thought we had a good case because of the depth of experience of our team in studying supernovae over the past 20 years and our collective mastery of the tricky problem of doing accurate photometry on faint objects. People on our team had built up the entire sample of nearby supernovae that either team would need to compare to distant supernovae. We had invented the techniques for making SN Ia into good standard candles using colors and light curve shapes to compensate for dust and intrinsic variations among SN Ia. Besides, this was an important problem and it would be good to have two teams work on it to see if the answers agreed.

This argument was successful, we got assigned the telescope time, and we began to search for and observe high-redshift supernovae. Brian Schmidt and Nick Suntzeff catalyzed the formation of the high-z supernova team. Brian was on his way from the Harvard-Smithsonian Center for Astrophysics (CfA) to the Australian National University. Our gang at the CfA included Pete Challis, Peter Garnavich, Saurabh Jha, and me. From Cerro Tololo, Nick engaged Mark Phillips, Mario Hamuy, Bob Schommer, and my former Ph.D. student Chris Smith. At Berkeley, there was Alex Filippenko and Adam Riess, who had finished his Ph.D. at Harvard and was now a prestigious Miller Fellow at Berkeley. Alex (who had himself been a Miller Fellow a decade earlier) had been part of the LBL team, but once we got our high-z act together, he chose to work with us.2 We were very glad to have him. Berkeley graduate students Alison Coil and Ryan Chornock pitched in later. We had strong connections with the European Southern Observatory, with Bruno Leibundgut, who had been my postdoc, and Jason Spyromilio, who had done beautiful work on SN 1987A. We enlisted help from the University of Washington, too, with Craig Hogan, Chris Stubbs, and his students Alan Diercks, David Reiss, and Gajus Miknaitis. Alejandro Clocchiatti moved from Texas to Chile, which gave us a shrewd spectroscopist in Santiago to help press the work ahead.

Figure 10.1. The high-z team. A large fraction of the high-z team in a single place for 1/30th of a second in the summer of 2001. Courtesy of Robert Kirshner, Harvard-Smithsonian Center for Astrophysics.

As time went by, some students finished their degrees and left, while new ones joined. And, as the methods and aims of the program evolved, we added Ron Gilliland at the Space Telescope Science Institute, and John Tonry at the University of Hawaii and his student Brian Barris. Our greatest technical achievement was to make up a team cover sheet for talks and proposals, with all the logos of the institutions involved. But it betrayed our prejudice. The cover sheet said, "The High-Z SN Search" and went on to say "Measuring Cosmic Deceleration . . . with Type Ia Supernovae." In the end, we did not measure cosmic deceleration, but something else.

By astronomical standards, where a typical research group has a faculty member, perhaps a postdoc, a student or two, and a pet dog, this was a big group. On the other hand, compared to particle physics research teams of the type they were accustomed to assembling at LBL, this was an intimate club. Our team needed to be big because of the peculiar requirements of a supernova search. New supernovae are like fresh fish. If you don't use them right away, they spoil. So our search and follow-up had to be carefully orchestrated and intense. Just as for Zwicky at Palomar in the 1930s, the Vikings before us, or the Calan/Tololo search, the rhythm of the observing was set by the phase of the moon. First you need a template—the "before" image taken in the dark phase of the moon. You wait a month for the moon to cycle through its phases, then repeat the same field in the next dark run.

Now the clock is running. There may be new supernovae in your data, and you have to find them before they fade into uselessness. Working round the clock, fueled by Chilean pizza, Tucson tacos, or Kona coffee, team members struggle with the software to get all the images aligned, blurred to match, scaled to the same brightness levels, and subtracted. Sometimes it goes smoothly, sometimes not. But always there is a sense of urgency.

The automated software spits out postage-stamp-sized images of possible candidates: places where there is a 5σ something on the second image that wasn't on the first. Not everything that glitters is gold. Somebody has to look at every one of these events to see if the software has done something stupid. There are satellites, asteroids, electronic noise, diffraction spikes, bad subtractions, bad columns, hot spots, cosmic rays. And supernovae. Somebody has to look at the image to tell the difference. It is tedious, hard work done under pressure. The clock is ticking, not just because the supernova might be fading, but because the follow-up observations are already scheduled and people are moving into position to take those data. But they can't take data if we don't find the supernovae.
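The subtraction step itself is simple to illustrate. Here is a toy sketch, not the team's actual pipeline: two already-matched images of the same field, a fake supernova added to the second epoch, and a 5σ cut on the difference. The image size, noise level, and supernova brightness are all invented for the example; a real pipeline must first align the images, blur them to match, and scale them to the same brightness.

```python
import numpy as np

rng = np.random.default_rng(42)

# A "before" template from one dark run and an "after" image from the next,
# with the same sky level and noise (already aligned, blurred, and scaled).
noise_sigma = 1.0
template = rng.normal(100.0, noise_sigma, size=(64, 64))
new_image = rng.normal(100.0, noise_sigma, size=(64, 64))
new_image[20, 33] += 12.0   # the new dot: a fake supernova

# Subtract, then flag anything more than 5 sigma above the difference noise.
diff = new_image - template
diff_sigma = noise_sigma * np.sqrt(2.0)   # noise adds in quadrature
candidates = np.argwhere(diff > 5.0 * diff_sigma)
print(candidates)   # expected to recover the pixel at (20, 33)
```

Every one of these postage stamps then goes in front of a human eye, because the same cut also fires on cosmic rays, satellites, and bad columns.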

For a typical observing run, we take dozens of images with the largest CCD cameras we can get our hands on. The big cameras have 100 million pixels—about 30 times the size of a "high-resolution" digital camera you can buy at Circuit City. The data from a single exposure fill 30 good-sized monitor screens, and a typical night produces 30 images. So that means we need to scan through almost 1000 screens-full. Each image has thousands of galaxies of about the right distance to be interesting sources of supernovae for cosmology. So if there's a supernova every 100 years in a typical

Figure 10.2. Suprime-Cam, a giant CCD camera. The advent of very large electronic cameras is the technical advance that made high-z supernova searches practical. These cameras have close to 100 percent efficiency using silicon charge-coupled devices (CCDs). This one has about 100 million pixels compared to 3 million in a high-end digital camera you can buy today. Courtesy of Subaru Observatory.

galaxy, we should see several in each observing run. If the weather is good. If the software works properly.
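The scale of the job follows from a little arithmetic. A sketch of the numbers in the text; the screens-per-image and images-per-night figures are quoted above, while the galaxy count per image is an assumed round number standing in for "thousands," and the final gloss about detection losses is mine:

```python
# How much data a human has to scan per run.
screens_per_image = 30          # one 100-megapixel exposure fills ~30 screens
images_per_night = 30
screens_to_scan = screens_per_image * images_per_night
print(screens_to_scan)          # 900 -- "almost 1000 screens-full"

# Why a run should net several supernovae.
galaxies_per_image = 2000       # assumed; the text says only "thousands"
sn_per_galaxy_per_year = 1 / 100.0   # one supernova per century per galaxy
baseline_years = 1 / 12.0            # a month between template and search
exploding = images_per_night * galaxies_per_image * sn_per_galaxy_per_year * baseline_years
print(round(exploding))         # ~50 go off in the monitored sample; only the
                                # bright, well-placed ones survive the cuts
```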

While some team members are sifting the data for new stars, others are already on the way to big telescopes to follow up the discoveries. It is the strangest form of observing. Usually, you do meticulous preparation long in advance. You make a list of your targets, make finding charts of their locations so you can identify them at the telescope, and think through just how to use your nights so you don't waste observing time. But for the supernova follow-up, there's no way to do all this in advance. So you travel to Tucson or to Kona or to La Serena with nothing prepared. While you are in the airplane, teammates are, you hope, generating a list of good candidates: new dots on the images that might be supernovae halfway across the universe. It's a heart-wrenching way to observe.

While we try to provide a few days between the search and the follow-up, sometimes that margin gets eaten up by glitches in the data processing. Then the sickening possibility of wasted time on the largest telescope in the world begins to gnaw at the observers. Alex Filippenko could be at the Keck Observatory in Hawaii waiting with the suppressed tension of a drag racer at a red light while Pete Challis is still slaving away in Chile, sorting reality from illusion.

On a calm day, Alex is a bundle of nervous energy. This relentless attention has served him well—Alex has become one of the most productive astronomers in the world. Slender, intense, and focused, he has the fast-twitch muscles of the star tennis player he is and the eating habits of a fast-food junkie. On the afternoon of an observing run, Alex snarfs Cheese Doodles while his bouncing leg communicates his anxiety. Has Pete put the targets at the team website?

"Not yet."3

When twilight begins in Hawaii, Alex walks across the Keck parking lot to the nearby McDonald's and buys a bag of Big Macs. If the targets are still not posted, the tension is contagious. Alex becomes like Sherlock Holmes without a case. In The Adventure of Wisteria Lodge, Sherlock says, "My mind is like a racing engine, tearing itself to pieces because it is not connected up to the work for which it was built." But once Brian Schmidt and Peter Garnavich get the observing list in order, the Keck dome is open, and it's time to get to work, Alex is the best guy to have in the pilot's seat because he focuses all that energy on the task at hand. Paying attention doesn't make the photons come in faster, but it helps you anticipate what to do next, and avoid wasting precious telescope time. Later in the night, Alex refuels with hamburgers, without regard for temperature, freshness, or the texture of the congealed cheese, and washes them down with strawberry soda. While others' attention drifts, Alex never flags, squeezing every minute of data from a night at the mighty Keck.

We get spectra of the supernova candidates at the Keck or the Very Large Telescope that ESO runs in the north of Chile. A new dot might be a supernova, but it might be something else. A spectrum will tell you if you've selected a variable quasar (oops!), a type II supernova (close but no cigar), or the SN Ia we know how to mold into the best of standard candles. The spectrum will also reveal the redshift, so we know where to put the supernova on one axis of the Hubble diagram.

But this is hard work. The supernova light is only about 1% of the light coming into the spectrometer from the sky. So it requires meticulous subtraction to see clearly what you've got. And you need to make decisions rapidly, to work through the list of candidates to find the genuine SN Ia. This combination of careful work and rapid decisions is a volatile mix. It's best to divide the labor, with somebody who is computer-nimble (under 30) doing the data reduction, a skilled operator who knows the telescope and the instruments at the controls, and someone in the role of Mr. Spock to provide logical advice on what to do next. Add in uncertain weather, balky instruments, and jet lag to brew a cauldron of stress.

But the results have been very good. Even with mediocre weather, we usually find several type Ia supernovae per search night in the redshift range from 0.3 to 0.8 where the effect from cosmology is most accessible. For example, in 1999, two nights of searching at the Canada-France-Hawaii Telescope (CFHT) in Hawaii and at the Blanco Telescope at Cerro Tololo provided a list of 20 objects with spectra, 12 of which we were confident were SN Ia, which ranged in redshift from 0.28 to 1.2. This is the deep water where you can learn the history of cosmic expansion.
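Why that redshift range matters can be made quantitative. A sketch under standard assumptions (my illustration, not the team's analysis): compute how much fainter a standard candle at z = 0.5 looks in a flat Λ-dominated universe (Ωm = 0.3, ΩΛ = 0.7) than in a decelerating Ωm = 1 universe with the same Hubble constant.

```python
import numpy as np

def comoving_distance(z, omega_m, omega_lambda, n=10000):
    """Comoving distance in units of the Hubble distance c/H0 (flat universe)."""
    zs = np.linspace(0.0, z, n)
    integrand = 1.0 / np.sqrt(omega_m * (1 + zs) ** 3 + omega_lambda)
    dz = zs[1] - zs[0]
    # trapezoid rule for the integral of dz'/E(z')
    return dz * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

z = 0.5
d_lcdm = (1 + z) * comoving_distance(z, 0.3, 0.7)  # luminosity distance, Lambda
d_eds = (1 + z) * comoving_distance(z, 1.0, 0.0)   # luminosity distance, Omega_m = 1
delta_mu = 5.0 * np.log10(d_lcdm / d_eds)          # magnitudes fainter
print(round(delta_mu, 2))   # ~0.4 mag
```

A few tenths of a magnitude is small, but it is within reach of careful photometry on well-standardized SN Ia, which is why this is the deep water where the history of cosmic expansion can be read.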

Then we measure the light curve. We need to know how bright the supernova was at maximum light. And we also need to measure the shape of the light curve to determine whether we are dealing with a typical SN Ia, one that was a little extra bright, or one that was a bit of a dim bulb. Plus, to measure the effects of dust absorption, we measure the supernovae through more than one filter to get the color. Most of the information about the shape of the light

Figure. Epoch 1, Epoch 2, and the difference image, Epoch 2 - Epoch 1.
