Looking farther afield, the prospects for learning more about the nature of supermassive black holes look very promising indeed - in both the near and distant future. Several major undertakings will improve our imaging and spectroscopic capabilities in both the radio and X-ray portions of the spectrum, and LISA (the Laser Interferometer Space Antenna) will open up a whole new window of opportunity for studying the distortions induced on the fabric of spacetime by violent gravitational interactions. As we have already seen (see Chapter 4), LISA's
³ See Green, Aldcroft, Mathur, et al. (2001).
expected launch in 2010 will herald a bright new age of space exploration, stretching our frontier well beyond what radiation can let us see. By detecting gravitational waves undulating from distant black-hole sources, astronomers will be able to sense the behavior of massive objects in the presence of unimaginably strong fields, testing general relativity, and possibly even uncovering flaws that hint at new, more comprehensive theories of nature.
The windows to be opened by ARISE (Advanced Radio Interferometry between Space and Earth; see Chapter 5) and more elaborate ground-based millimeter arrays will be equally fascinating and conducive to profound change in our communion with nature. Both of these developments - one stretching the baseline of radio interferometry into space, the other creating a worldwide baseline for interferometry at millimeter wavelengths - are geared toward greatly enhancing the resolving power of instruments designed to probe deeper and deeper into the bottomless well of gravity in supermassive objects. Many astrophysicists suspect that an image of the event horizon in a nearby black hole will be feasible within a matter of only years.⁴
Their impressive stature notwithstanding, existing radio telescopes (see Figs. 5.4 and 5.5) are not all usable at the shorter wavelengths because they cannot maintain sufficient structural integrity to provide a pure millimeter or submillimeter signal. So a major problem with conducting worldwide coordinated observations at these wavelengths is simply the paucity of appropriate sites.
The idea for developing a global network of millimeter telescopes, which has come to be known as CMVA - an acronym derived from Coordinated Millimeter VLBI Array - actually goes back to the mid 1990s, when members of the Haystack Observatory in Massachusetts developed plans to create the network for initial observations at 3 millimeters and additional experimental observations at 1.3 millimeters. Since then, the goal of the CMVA has been to
⁴ See, for example, Falcke, Melia, and Agol (2000), Bromley, Melia, and Liu (2001), and Melia (2003).
continually break new technological ground for later exploitation at progressively shorter wavelengths. Thus far, up to 12 stations around the world have been able to participate in global VLBI sessions at 3 millimeters, organized twice a year through the CMVA. Although unfavorable weather and technical problems at individual sites occasionally hamper them, these campaigns are generally successful and provide good observations of compact emitting regions, including the galactic center.
At 1 and 2 millimeters, however, the number of telescopes is much smaller than at 3 millimeters, which greatly reduces the coverage. Thankfully, this situation is rapidly changing. For example, the new Heinrich-Hertz telescope on Mount Graham near Tucson recently participated in a VLBI experiment at 1 millimeter for the first time. Even more exciting is the proposed development of the giant radio telescope known as ALMA, which conveys better than any other project the growing enthusiasm from the world's astronomical community. The Atacama Large Millimeter/Submillimeter Array is conceived as a radio telescope composed of 64 transportable 12-meter-diameter antennas distributed over an area 14 kilometers in extent. In the early part of 2001, representatives from Europe, Japan, and North America met in Tokyo to sign a resolution affirming their mutual intent to construct and operate this facility in cooperation with the Republic of Chile, where the telescope is to be located. ALMA will be built on the Andean plateau at 5000 meters altitude near the Atacama Desert, and is considered to be the first truly global project in the history of fundamental science. The telescope is scheduled to be fully operational in 2010.
X-ray astronomy, on the other hand, must be conducted entirely above Earth's soupy atmosphere. The Chandra satellite - the latest NASA innovation - has merely given astronomers a taste of what X-ray images with exceptional spatial resolution can reveal. Scientists and engineers at the Goddard Space Flight Center in Maryland, at Columbia University in New York, and at CALTECH in Pasadena, among others, are conjuring up one of the most ambitious advances in the history of high-energy astronomy. Taking as their cue the lessons learned from the evolution of ground-based optical telescopes, in which many smaller units working in unison are in the end more powerful and easier to build than one single cumbersome device, these investigators are designing and building the Constellation-X Observatory (see Fig. 6.3). Four individual X-ray telescopes working together will have a combined sensitivity 100 times greater than any past or present X-ray mission.
More imaginative still is a NASA mission now being planned that aims to achieve nothing short of actually photographing the event horizon of several nearby supermassive black holes in X-ray light. A pair of powerful new telescopes, with costs estimated in the billions of dollars, is being developed collaboratively by NASA and the University of Colorado at Boulder, and is proposed for flight before 2020. These telescopes are part of the Microarcsecond X-ray Imaging Mission, or MAXIM for short. The main mission would consist of a fleet of 33 spacecraft, each containing a relatively small telescope. But by combining the data gathered by so many separate instruments distributed over an extraordinarily large baseline in space, one may achieve a resolution of the sky about one million times better than what is currently attainable. A ground-based optical telescope with this same capability would enable us to read a newspaper on the lunar surface!
To put this achievement in context, note that at a distance of 60 million light-years, the event horizon of the 3-billion-solar-mass black hole in the nucleus of M87 (see Figs. 5.9 and 5.10) projects a diameter of 5 microarcseconds. MAXIM's intended resolution - the angular separation of features that it can identify - is about one microarcsecond, so future X-ray astronomers will be able to see the dark depression shimmering at the center of this giant elliptical galaxy. But with a projected width of over 30 microarcseconds, the easiest dark pit of all to photograph with MAXIM will be the one cast by Sagittarius A* at the heart of the Milky Way.
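These angular sizes follow directly from the Schwarzschild radius and the small-angle formula. A minimal Python sketch of the check, assuming as a simplification that the horizon's apparent angular diameter is just 2rₛ/d (gravitational lensing actually makes the dark "shadow" appear several times larger, which is why Sagittarius A* projects tens of microarcseconds):

```python
import math

# Rough check of the angular sizes quoted in the text.
# Schwarzschild radius r_s = 2GM/c^2; apparent angular diameter ~ 2*r_s/d.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg
LY = 9.461e15       # metres per light-year
UAS = 4.848e-12     # radians per microarcsecond

def horizon_angular_diameter_uas(mass_in_suns, distance_in_ly):
    """Angular diameter of the event horizon, in microarcseconds."""
    r_s = 2 * G * mass_in_suns * M_SUN / c**2
    return (2 * r_s / (distance_in_ly * LY)) / UAS

# M87's 3-billion-solar-mass hole at 60 million light-years:
print(horizon_angular_diameter_uas(3e9, 60e6))   # a few microarcseconds
```

The result comes out at roughly five to seven microarcseconds, in line with the figure quoted above.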
This technology has its own problems to contend with.⁵ The wavelength of an X-ray is about 1000 times smaller than that of visible light, making X-ray telescopes very difficult to build. Surface irregularities that are too small to affect visible light can easily scatter X-rays. In addition, to obtain a true focus, X-ray photons must reflect twice from very carefully figured hyperbolic and parabolic surfaces, nested concentrically in very precise formation. MAXIM will instead utilize a method similar to VLBI, in which two or more telescopes are coupled in order to synthetically build an aperture equal to the separation of the individual instruments. Rather than precisely focusing X-rays with expensive mirrors onto a detector, the MAXIM team will use readily made flat mirrors to mix the photons and produce an even sharper image, much as sound waves can be combined either to cancel each other out (resulting in silence) or to amplify the sound when one crest adds to another.
The concept calls for the fleet of smaller telescopes to be spaced evenly in orbit around the perimeter of a circle, the diameter of which will vary from 1 to 10 kilometers, and for the whole assembly to be orbiting about the Sun. From there they would collect X-ray beams and funnel them to a larger telescope stationed at the hub, which could then relay the accumulated data back to Earth, several million miles away.
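The payoff of such baselines follows from the diffraction limit: an interferometer's resolution scales as the observing wavelength divided by the baseline, θ ≈ λ/D. A short illustrative sketch (the 1-nanometer wavelength is an assumption for a representative soft X-ray, not a figure from the mission specification):

```python
# Diffraction-limited resolution of an interferometer: theta ~ lambda / D.
UAS = 4.848e-12   # radians per microarcsecond

def resolution_uas(wavelength_m, baseline_m):
    """Angular resolution in microarcseconds for a given baseline."""
    return (wavelength_m / baseline_m) / UAS

# A 1-kilometre baseline at ~1 nm (soft X-rays) versus visible light at 500 nm:
print(resolution_uas(1e-9, 1000.0))   # X-rays: a fraction of a microarcsecond
print(resolution_uas(5e-7, 1000.0))   # optical: about a hundred microarcseconds
```

The same 1-kilometer baseline that yields sub-microarcsecond imaging in X-rays would resolve only about a hundred microarcseconds in visible light, which is why the short X-ray wavelength is the key to photographing an event horizon.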
6.5 is the universe itself a big black hole?
The field of black-hole research is clearly in a period of renaissance, with wave upon wave of breathtaking discoveries making headlines on a regular basis, and with future missions promising to take us to the edge of validity of current physical laws. Supermassive black holes are no longer the oddity of decades past, but rather a necessity in any comprehensive description of structure in the universe.
⁵ This work has been spearheaded by Webster Cash and his group at the University of Colorado, in collaboration with NASA's Marshall Space Flight Center in Huntsville, Alabama. They announced their design in the 14 September 2000 issue of Nature.
Some astronomers are taking this essential role to a rather daring conclusion, wondering, in fact, if we ourselves may be living inside the biggest black hole of all - the universe itself. Well, this question is not really well posed, as we shall soon see, but it does make for some intriguing reflection on cause and effect, and on the origin of all things.
A black hole is a parcel of closed spacetime embedded within a larger space (and time) that may contain matter, radiation, and probably other black holes as well. On the other hand, the universe as we know it is all encompassing, so for us to view it as a black hole, it would be necessary to hypothesize the existence of an undetected - and probably forever undetectable - hyperspace within which it is ensconced.
The major difficulty in maintaining a scientific posture with this discourse is that physicists do not yet have a complete theory unifying all the fundamental forces of nature at the instant of the Big Bang. They can say with some precision what transpired 10⁻⁴³ second later, and any time thereafter, but that first fleeting moment borders on philosophy and aesthetics, not the rigor of verifiable, hard-core science. For example, there is no possibility of linking current theories to experimentation with the early universe - that is, we cannot simply "build" another cosmos - so our theorizing must be accepted or rejected primarily on the basis of pure reasoning, and perhaps the power of prediction at later times.
The most unsettling, yet the most engaging, aspect of the Big Bang is the problem of beginning - the apparent singularity from which the expansion started. An initial state of arbitrarily high density seems to be inescapable, just as catastrophic gravitational collapse evidently squeezes matter falling into a bottomless well of gravity down to zero volume. In principle, understanding the process of gravitational contraction may resolve the mystery of our distant past, perhaps revealing new laws of physics along the way.
Still, certain issues pertaining to the question of the universe as a black hole may already be addressable within the current framework. Questions such as "Does the universe lie within its own gravitational radius, i.e., within its own event horizon?" and "What happens toward zero time in the current universe should we reverse the clock?" can at least be broached with the language of scientific principles already recognized and tested.
It may seem surprising to hear that the average density of matter within a black hole need not be extraordinarily large. Its value depends critically on how big the object is. The problem is simply to get enough material within a given radius to produce an event horizon at that radius. From Chapter 3, we recall that the Schwarzschild radius is 2GM/c², so in effect the black hole's size scales directly with its total mass M. But for a given value of M, its density drops off inversely as the enclosed volume, which is proportional to the radius cubed. Thus, ponderous black holes actually have a significantly lower density than their lighter brethren. For example, if a 100-kilogram person were to suddenly shrink to black-hole proportions, he would need to have a radius no bigger than about 10⁻²³ centimeter, but his density would then rise to the extraordinary value of 10⁷³ grams per cubic centimeter. The Sun, squeezed into a black hole, would have a 3-kilometer radius, but its density would be only 10¹⁶ grams per cubic centimeter.
Now consider what happens as we increase the mass further, to a value not unlike that of a typical supermassive black hole in the nucleus of an active galaxy. For a 100-million-solar-mass object, the Schwarzschild radius grows to about 300 million kilometers - roughly the size of Mars's orbit about the Sun. But its average density is, incredibly, only about 1 gram per cubic centimeter - the density of water!
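This scaling is easy to verify. Since the radius grows in proportion to the mass, the mean density inside the horizon falls as 1/M². A minimal Python sketch reproducing the three examples just given (person, Sun, and a 10⁸-solar-mass black hole):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def schwarzschild_radius_m(mass_kg):
    """r_s = 2GM/c^2, in metres."""
    return 2 * G * mass_kg / c**2

def mean_density_gcc(mass_kg):
    """Mass divided by the volume of the horizon sphere, in g/cm^3."""
    r = schwarzschild_radius_m(mass_kg)
    rho_si = mass_kg / ((4.0 / 3.0) * math.pi * r**3)   # kg/m^3
    return rho_si * 1e-3                                # convert to g/cm^3

for label, m in [("100-kg person", 100.0),
                 ("Sun", M_SUN),
                 ("10^8 solar masses", 1e8 * M_SUN)]:
    print(label, schwarzschild_radius_m(m), mean_density_gcc(m))
```

The printed densities land near 10⁷³, 10¹⁶, and about 1 gram per cubic centimeter, respectively, matching the figures in the text.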
An extremely large region of space, such as the universe, does not have to be very densely filled with matter in order to create curved light paths or even to entomb spacetime itself by forming an event horizon. Given that we see the universe from "inside," how does one then go about determining whether it is above its black-hole density or not? Part of the answer actually goes back to the work of Sir Isaac Newton who, in order to describe the moon's motion around the Earth, used the newly invented calculus to prove a very important theorem
for his universal law of gravitation. He showed that the gravitational field outside a spherically symmetric body behaves as if the whole mass were concentrated at its center. In other words, the moon feels exactly the same gravitational influence from the Earth as it would from an object with the same mass, though only the size of an apple, situated at the center of where Earth now stands.
In 1923, not long after general relativity was established, George Birkhoff (1884-1944) made the surprising discovery that Newton's theorem was valid even for this more comprehensive description of gravity, though with some appropriate corrections. He demonstrated that even if a spherically symmetric body were collapsing or expanding radially, the Schwarzschild metric describing its gravitational field in empty space would not change in time. In other words, the effect of gravity outside a spherically symmetric body does not depend on how big that object is - it is based solely on how much mass is enclosed within its surface.
The Birkhoff theorem seemed peculiar because in general relativity a nonstatic body generally radiates gravitational waves. We now know that in fact no gravitational radiation can escape into empty space from an object that looks the same from all directions, unlike the pair of black holes orbiting about each other in Fig. 4.5. His result may be applied with equal validity inside an empty spherical cavity at the center of a spherically symmetric (though not necessarily static) body. Here, however, there is no enclosed mass at any point within the cavity so, according to his theorem, there is no gravitational field anywhere inside it.
The value in Birkhoff's work is that, under the assumption of uniformity, we can calculate the gravitational field anywhere in the universe relative to another point a distance d away, by simply estimating how much mass is enclosed within a spherically symmetric volume of radius d centered on that other point. For the sake of specificity, let us just put ourselves in the middle and see how far out we need to go before we hit the universe's event horizon.
According to Hubble's discovery of an expanding universe back in the 1920s and 1930s, distant objects are receding from us with a velocity proportional to their distance. It turns out that this rate of recession approaches the speed of light for matter 12 billion light-years away, and this must therefore be the radius of that part of the universe with which we have interacted via influences that travel at the speed of light. (Two specific examples are electromagnetic and gravitational waves.) It is what astronomers call the size of the visible universe.
Birkhoff's theorem tells us that the average internal density required to produce an event horizon at 12 billion light-years is about 5 × 10⁻³⁰ grams per cubic centimeter - an incredibly small number, the equivalent of only six hydrogen atoms per cubic meter. Even so, it exceeds the best current estimates astronomers have made by a factor of roughly three to five, depending on which newspaper vendor you talk to. Could the dark energy invoked to explain the universe's acceleration make up the difference (see Chapter 3)? Without it, the visible universe could not be a black hole in the strictest sense of the term, though it would come alarmingly close. Let us think about this for a moment. Of all the possible average densities that the universe could have had, why is it that the one with which it is apparently endowed is so strikingly close to the value needed to create an event horizon at the edge of what is visible?
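The horizon density quoted here follows from setting the radius R of the Birkhoff sphere equal to the Schwarzschild radius of the mass it encloses: R = 2GM/c² with M = (4/3)πR³ρ gives ρ = 3c²/(8πGR²). A quick numerical sketch (the result agrees with the figure in the text to within a factor of a few; the precise value depends on the adopted radius):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
LY = 9.461e15     # metres per light-year
M_H = 1.67e-27    # mass of a hydrogen atom, kg

# Density that places an event horizon at radius R: rho = 3c^2 / (8 pi G R^2)
R = 12e9 * LY                                   # 12 billion light-years
rho_si = 3 * c**2 / (8 * math.pi * G * R**2)    # kg/m^3
print(rho_si * 1e-3, "g/cm^3")                  # of order 10^-29
print(rho_si / M_H, "H atoms per cubic metre")  # a handful per cubic metre
```

Note the striking scale: a universe-sized black hole needs only a few hydrogen atoms per cubic meter to close itself off.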
Perhaps the answer lies in another important consideration we have so far ignored in this discussion. According to current cosmological models, the expansion of the universe is driven not by matter moving through space, but rather by the stretching of space itself. This is more than just an idle concept, since the very idea of inflation depends critically on the validity of an expanding space, and without inflation (see Chapter 3), many problems with the basic Big Bang model would go unsolved. The expansion of space, however, can proceed faster than the speed of light. The postulates of special relativity do not apply to this phenomenon, since they only specify the maximum speed of transmission through space, and that is the speed of light. So although we may not be able to see the "rest" of the cosmos beyond the visible limit at 12 billion light-years, it may nonetheless be there, expanding in concert with our own visible universe.
Can we therefore extend the radius of our Birkhoff sphere and intersect an event horizon by going beyond the "visible" limit? Well, no. For one thing, if this region is beyond the visible edge of the universe, then it is forever inaccessible to us, and we to it. The influence of gravity cannot travel faster than light either, so whatever mass is present there would never have communicated with the universe we can see, and they could never conspire to pool their influence and produce a common event horizon.
Nonetheless, the answer to the question "Is the universe itself a big black hole?" is a qualified "yes" because of several truly amazing observations completed by an international team of astronomers using the BOOMERanG experiment in 2000. We already touched on the significance of their findings in Chapter 4, but let us now revisit this discovery in the context of the present topic.
Designed to study the cosmic microwave background radiation with unprecedented accuracy, BOOMERanG surveyed 2.5 percent of the sky with an angular resolution of 0.25 degrees during a ten-day balloon flight over Antarctica. This microwave telescope was built to measure fluctuations in the background radiation (see Fig. 4.1) driven by pressure variations propagating throughout the nascent universe. A peak in the frequency of these variations was expected to occur 300 000 years after the Big Bang, when matter and radiation ceased to interact via photon scattering. Earlier calculations had shown that a universe with a current average density of 5 × 10⁻³⁰ grams per cubic centimeter would have produced fluctuations with a characteristic angular separation of about 0.75 degrees, well within BOOMERanG's resolving capability.
The team of astronomers who conducted this investigation, led by Paolo de Bernardis of the University of Rome and Andrew Lange of CALTECH, reported that BOOMERanG not only confirmed a primordial origin for the fluctuations, but also clearly identified a peak precisely where these predictions had placed it. The location of the peak means that the density of matter in the universe is within a statistically determined error of only 10 percent of its critical value.
Physicists already know that the combined density of visible and dark matter, and radiation, amounts to only about one-third of the required 5 × 10⁻³⁰ grams per cubic centimeter. So the rest of it must be the "dark energy" inferred from the accelerated expansion of the universe. Although the evidence for this phenomenon is still rather tentative,⁶ cosmologists find it very gratifying that, together with the completely independent determination rendered by BOOMERanG, these measurements now paint a self-consistent picture. The cosmos is evidently dominated by dark energy, but in such a way that its overall equivalent mass density is precisely 5 × 10⁻³⁰ grams per cubic centimeter. The universe, it seems, has an event horizon with a radius of 12 billion light-years, right at the edge of what we can see before the velocity of expansion exceeds the speed of light.
This universe, however, has no apparent singularity right now - its mass is spread out everywhere. Could it be that the Big Bang was nothing more than the initial collapse of the universe to something approaching a point, followed by a bounce? Yes, it's possible, but we may never know for sure because the first 10⁻⁴³ second of the expansion is completely unresolvable with current scientific methods. Let us reverse the clock, and see how far back our present knowledge can take us toward the beginning, and why this interval of 10⁻⁴³ second, known as the Planck time, appears to be impenetrable.
The shortest interval of time that can be probed with current physical laws pushes their applicability to the limits set by three so-called fundamental constants of nature. These are the measured values of quantities that characterize the strength of gravity, the speed of light, and the fuzziness of quantum mechanics. Physicists assume
⁶ See Brian P. Schmidt et al. (1998) and Saul Perlmutter et al. (1999).
that these quantities are constants in time, in the absence of any evidence to the contrary.
Quantum mechanics argues that we can never be entirely sure of a particle's position or its energy, because in order for us to even know of its existence we must disturb it to sense its presence. Thus, there should always be some positional uncertainty, or an imprecision in energy and time, and any description of the particle's physical behavior must therefore acquire some minimal level of "fuzziness." In our everyday lives, we develop the illusion of precision only because the fuzziness induced by these uncertainties is very small, and our mind clings to the apparent clarity of the outside world as a convenient simplification of the way things really are. Certainly, on a macroscopic scale, this fuzziness does not manifest itself readily, and our description of nature using exact positions and times is quite adequate for our need to interpret much of the activity in our environment. But on a microscopic scale, this fuzziness is paramount, and nothing can happen without the consequences of the implied imprecision.
The uncertainty in the particle's position is characterized by Planck's constant, h. The Planck length - the shortest distance we can probe - depends on how strong the effect of gravity is on such scales. This in turn is specified by the gravitational constant, G, in Newton's universal law of gravitation. The bigger this coupling constant is, the stronger is the attraction between two given masses. The Planck time is then the interval of time required to communicate information across this distance, given that the apparent maximum rate of transmission is the speed of light, c. Together, these constants yield the shortest physical time, √(Gh/c⁵) (which is approximately 10⁻⁴³ second), that anyone (or anything) can sample.
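Combining the three constants as described gives the Planck scales explicitly. A short sketch (using ħ = h/2π, the conventional choice; using h itself shifts the numbers by only a factor of about 2.5):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s

l_planck = math.sqrt(G * hbar / c**3)   # Planck length, ~1.6e-35 m (~1e-33 cm)
t_planck = math.sqrt(G * hbar / c**5)   # Planck time, ~5.4e-44 s (~1e-43 s)
print(l_planck, t_planck)
```

Dividing the Planck length by c reproduces the Planck time, which is just the statement in the text that the Planck time is the light-crossing time of the Planck length.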
However, cosmologists do have some confidence in beginning to describe the expansion of the universe from 10⁻⁴³ second onwards. This is where our quantum physics has meaning, because on this level the Schwarzschild radius from general relativity first becomes equal to the smallest scale permitted by the quantum fuzziness, roughly
10⁻³³ centimeter, which is still much smaller than the nucleus of an atom. But there is still some remaining uncertainty, because physicists diverge in their views of how best to describe the universe at this point. They still do not know if extra dimensions exist (see Chapter 3), or if string theory is correct. One view has it that during the Planck era (when the universe was about 10⁻⁴³ second old), the cosmos is best described as a quantum "foam" of ten spatial dimensions containing Planck-length-size black holes, continuously being created and annihilated, with no cause or effect. The reason for the latter is that, on quantum scales, particles can be created without the conservation of energy, as long as they exist only fleetingly, so that the violation falls within the uncertainty prescribed by Planck's constant.
One of the reasons our physics is incomplete near the Planck era is related to the hierarchy problem we discussed in Chapter 3. Science does not yet provide a description of how the forces of nature unify during this time. At the excruciatingly high energies and temperatures prevalent then, the forces of nature would have become symmetric, meaning that they would have resembled each other and would have acquired a similar strength - they would have unified into a single entity. Physicists are actively pursuing the grail of grand unification of all four forces, and have already achieved some notable success in this pursuit. Toward the end of the twentieth century, the interactions due to the weak and electromagnetic forces were framed into a single phenomenon known as the electroweak force by Sheldon Glashow, Steven Weinberg, and Abdus Salam, who were awarded the Nobel Prize in Physics for this effort in 1979.
The weak force, which is mediated by very heavy particles known as W and Z, is responsible for the transformation of a neutron into a proton within the nucleus of an atom, whereas the electromagnetic force provides an interaction between charged particles, such as electrons and protons. At the time of their discovery in 1983, the W and Z particles were the most massive known - each weighing in at almost 90 times the mass of the proton - whereas the photon, the carrier of the electromagnetic force, is massless. The unification of these two forces occurs when the energy available for the process is so high that even this enormous mass difference between the two sets of carriers becomes inconsequential. In the early universe, this would have been the situation until the ambient temperature dropped below about 10¹⁵ Kelvin, after which the mass difference would have split the rates at which these particles could interact, thereby creating the appearance of two independent forces.
Attempts are now underway to unify the strong and electroweak forces, a process known as Grand Unification, but this is proving to be much more challenging, in part because what is required is the conversion of certain particles, such as electrons, into completely different types of entities, known as quarks. This unification, if possible, would result in a split of the rates of interaction when the temperature in the early universe dropped below about 10²⁷ Kelvin, much closer to the Planck era.
The final unification, between the electroweak, strong and gravitational forces, is well beyond the realm of study with earthbound experiments, because the energies and temperatures required to approach the necessary scale of interaction are simply unreachable. It may seem peculiar, but learning more about the early universe may actually be necessary for this branch of particle physics to make progress of its own toward a "complete" understanding of what governs the substance and behavior of particles.
These unknowns weigh progressively more heavily on the cosmologists' view as they labor closer and closer to the Planck scale. The exploration terminates - indefinitely, it would seem - at 10⁻⁴³ second. Only the development of a completely new, overarching description of nature that obviates the fuzziness of quantum mechanics could change this situation. Still, physicists are a clever lot, so there is always hope. Is the universe itself a big black hole? It now seems that the answer is yes, but how and why it got that way persist as the most profound mysteries in nature.
6.6 ultimate fate
In contrast to the uncertainty of what transpired at the very beginning of the Big Bang, the question of how the universe will play itself out may be easier to address, though, as always, the story unfolds through the prism of human perception and interpretation. It would be utterly presumptuous and self-debilitating for us to view this prognostication as absolute and fully written. On the contrary, it is an evolving narrative, likely to be swayed by many future developments and discoveries in particle physics and astronomy.
For now, the three leading characters in this play are the total mass enclosed within the visible universe, the Grand Unified Theory (GUT) that will ultimately account for the unification of all known forces, and Hawking radiation. Up until the era when the reservoir of primordial matter - primarily hydrogen and other light elements - is fully exhausted, stars will continue to form and galaxies will collide and grow. Looking into the future, however, matter will ultimately partition itself into several quasi-terminal states, among them dying stellar embers, white dwarfs and neutron stars, asteroids and planets, and tenuous gas dispersed throughout the cosmos. But regardless of what the eventual configuration will be, life as we know it will not be viable forever. Without the energy released from nuclear burning, life-sustaining environments will become untenable. In the meantime, supermassive black holes will continue to grow as clump after clump of gas succumbs to the inexorable inward pull of gravity, adding to the total mass entombed below the growing number of event horizons.
Life will undoubtedly evolve considerably and survive much longer than we could now imagine. In the absence of nucleosynthesis, our descendants may even find a way of using the energy liberated by accretion onto black holes in order to power their survival. But certain processes predicted by the GUT will change the universe dramatically and irreparably, making any such attempts futile in the long run. In these theories, all sorts of particles can (and must) mutate into other entities, a process that may be induced by either collisions or spontaneous self-decay. A proton, for example, will eventually split into a positron (the electron's antiparticle) and a pion, the particle that helps to mediate the nuclear force. Neutrons are already known to be unstable; in a matter of only minutes, they decay into protons (a process induced by the weak force), so they too must eventually split into subcomponents. By permitting this conversion - nay, requiring it - the GUT will guarantee that the two most significant constituents of atomic nuclei will be removed permanently from the composition table. Diamonds are not forever!
Physicists still do not know the mass of several particles that mediate the unified force, so the time required for protons and neutrons to decay is uncertain. The best current estimates endow the proton with an expected lifetime somewhere between 10³² and 10⁴¹ years.⁷
By this time, galaxy collisions (see Chapter 4) will have been relegated to ancient history by the expansion of the universe, which would have continued to drive the participants apart. Supermassive black holes will therefore stop growing some day because they will have absorbed all the limited supply of matter in their environment. Estimates place the terminal mass of these objects somewhere between 1 billion and 10 billion Suns.
The universe in this era will be completely unrecognizable to sentient beings living now, since it will have mutated to the point where life itself is impossible. As best as physicists can tell, the cosmos will be an extremely thin dark veil of fundamental particles, such as electrons, positrons, neutrinos, and highly redshifted photons. Very few atoms will be left, and these too will eventually vanish as their constituent protons and neutrons disintegrate. And floating aimlessly through this enormous sea of virtually nothing will be the ensemble of billion-solar-mass black holes roaming freely for a near eternity, sucking up whatever scant morsels they encounter.
7 A full discussion of the relevant parameters and other considerations may be found in Adams and Laughlin (1997).
Evidently, supermassive black holes appeared early in the history of the universe and will stay late - very late. After 10^32 to 10^41 years, they will be the only structures of any significance left in the cosmos. But in what appears to be the final act of fair play, even they will not exist forever. Once black holes stop growing, they slowly begin to shrivel via a loophole created by the application of quantum mechanics, a theory that is known to be correct, if not complete. General relativity is a classical theory, operating on the basis of precise measurements of physical quantities, such as distance and time. The very notion of defining an event horizon makes sense as long as we can precisely place this surface and particles around it at perfectly known locations. But quantum mechanical fuzziness requires some positional uncertainty, or an imprecision in energy and time. Physicists are therefore uncomfortable with the idea of a perfectly localized and sealed event horizon, since these notions completely ignore the quantum mechanical uncertainty on the smallest scales.
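The fuzziness invoked here is the energy-time uncertainty relation of quantum mechanics (stated schematically; ħ is Planck's constant divided by 2π):

```latex
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}
```

A fluctuation of energy ΔE can therefore persist only for a time of order ħ/ΔE before it must subside.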
A phenomenon discovered in 1974 by Stephen Hawking may be the first step in the eventual resolution of this problem.8 The name itself, quantum mechanics, reveals the essence of the physical description on a microscopic scale. It tells us that at this level all measurable entities are to be thought of as comprising tiny bundles (or quanta) of "something," which in the case of light are known as photons. In the appropriate terminology, one says that fluctuations in a field, say the gravitational field, are associated with the manifestation of these quanta, which can appear or vanish as the fluctuations grow or subside. The connection between these bundles and the fuzziness is that their size, energy, and lifetime are directly related to the scale of the imprecision, that is, how fuzzy the measurements of position or energy turn out to be.
Quanta such as photons bubble up spontaneously out of vacuum if an adequate source of energy lies nearby. But a crucial fact that we
8 Readers who would like to learn more about the technical aspects of this phenomenon, and the evaporation of black holes in general, will find the discussion in Thorne, Price, and Macdonald (1986) very helpful. See also Wald (1984).
have gathered from the observed behavior of these fields is that when the bundles materialize spontaneously, they always do so in pairs, as if something must be split in order to create the fluctuation. So a quantum, or particle, with negative charge can only materialize if at the same time its counterpart, with positive charge, also comes into being. Given that every characteristic we can assign to this bundle must be matched by the opposite attributes of its partner particle, it makes sense then to talk of these as particles and antiparticles, or matter and antimatter.
The phenomenon discovered by Hawking9 is directly associated with this creation of quantum particles in vacuum due to fluctuations in the gravitational field of the black hole. Particles created in this way live fleetingly and then annihilate with their partners to re-establish the vacuum once the fluctuation has subsided. We note, however, that fluctuations in the gravitational field of the black hole have a wavelength commensurate with its size. So when these fluctuations manifest themselves as photons, or any other type of particle whose rest mass is small compared to the amplitude of the fluctuation, their wavelength, too, corresponds to the size of the black hole. The fleeting quanta produced just outside the event horizon of very massive black holes are therefore much redder, and hence of lower energy, than those associated with their smaller brethren.
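This inverse relationship between a black hole's mass and the energy of its Hawking quanta can be made quantitative (a standard result, quoted here without derivation). The characteristic wavelength is set by the Schwarzschild radius, and the corresponding temperature falls as the mass grows:

```latex
\lambda \;\sim\; r_{s} \;=\; \frac{2GM}{c^{2}}, \qquad
T_{\mathrm{H}} \;=\; \frac{\hbar c^{3}}{8\pi G M k_{B}}
\;\approx\; 6\times 10^{-8}\,\mathrm{K}\,\left(\frac{M_{\odot}}{M}\right)
```

A billion-solar-mass black hole thus radiates at a temperature of only about 6 × 10^-17 kelvins, which is why the Hawking emission from supermassive black holes is so extraordinarily feeble.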
The paired quanta produced in this fashion annihilate outside the event horizon very quickly (in about one-millionth of a millionth of a millionth of a second). But some pairs, argued Hawking, will have a member that dips below the membrane of no return, abandoning its partner to the whim of the outside universe. Without a partner to annihilate with, the detached particle flees the black hole's sphere of influence and merges into the flux of escaping radiation headed for infinity. To an observer on Earth, it looks as though the black hole is actually radiating, though the mechanism is clearly indirect. Nevertheless, the
9 Some of Hawking's early discussion on this topic appeared in a paper published by Nature in 1974.
source of energy for these fleeing particles is ultimately the black hole itself, and although we cannot claim that the radiation originated from within the event horizon, its energy surely did, and the dark object pays the price with a consequent decrease in its mass. If this simple application of quantum mechanics survives the test of time, it appears that all black holes must evaporate eventually.
The Hawking radiation from a black hole with barely the mass of 30 Suns has such a long wavelength, and is therefore so feeble, that it would take such an object 10^61 times the current age of the universe to evaporate completely. But after 10^98 years, even the 100-billion-solar-mass behemoths will be gone, completely and forever - the final act of fair play. And thus will end the saga of the most powerful objects in the universe, facing eternity as ghosts in a lifeless darkness.
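The timescales quoted above follow from the standard evaporation formula (a sketch; the numerical coefficient assumes the hole radiates only photons):

```latex
t_{\mathrm{evap}} \;\approx\; \frac{5120\,\pi\, G^{2} M^{3}}{\hbar\, c^{4}}
\;\approx\; 2\times 10^{67}\,\mathrm{yr}\,\left(\frac{M}{M_{\odot}}\right)^{3}
```

Because the lifetime grows as the cube of the mass, a 30-solar-mass hole survives for roughly 10^71 to 10^72 years - about 10^61 times the current ~10^10-year age of the universe - while the most massive black holes endure vastly longer still.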