It is evident that early 21st century technology is inadequate for either the relativistic rocket or the starsail. The starsail is closer to current capabilities in principle, yet it will require spaceflight capabilities and infrastructure far greater than what currently exists. The relativistic rocket does not require massive space infrastructure, but it appeals to exotic physics and engineering that do not exist. In either case the challenges are daunting. What follows is a brief overview of some of the issues that would have to be resolved and some potential ways they might be handled.
For a photon sail approach the Fresnel lens, the sailcraft and a space-based or lunar-based power station must be constructed or assembled in space. This will require many thousands of tons of material for the lens, power station and sailcraft. This is certainly possible in principle, but beyond current abilities. The space shuttle has lofted large payloads into orbit, such as the Hubble Space Telescope and the modules for the International Space Station (ISS). The photon sail and its supporting space systems represent an increase of three or more orders of magnitude in mass scale. Further, this has to be accomplished at Lagrange points or in the cis-lunar environment, so the space construction problems are much larger than anything accomplished with the space shuttle. A power station must also be constructed that is capable of generating up to about ten times all the power currently generated on Earth, and this power is to be fed to a huge laser. If this is based on the moon it implies lunar activities on a large scale, yet no human has stepped on the moon since December 1972.
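To give a sense of scale, a short sketch can estimate the thrust such a laser could exert on a perfectly reflecting sail, using F = 2P/c. The figures for current world power consumption and sailcraft mass are illustrative assumptions, not values from the text:

```python
# Rough photon-sail thrust estimate. Assumed figures (not from the text):
# world power consumption ~1.5e13 W; the chapter's "ten times" factor sets
# the laser power; a perfect reflector receives momentum flux 2P/c.
c = 2.998e8               # speed of light, m/s
P_world = 1.5e13          # W, assumed current world power consumption
P_laser = 10 * P_world    # the chapter's factor of ten

F = 2 * P_laser / c       # thrust on a perfectly reflecting sail, newtons
m_craft = 1.0e6           # kg, assumed 1000-metric-ton sailcraft
a = F / m_craft           # resulting acceleration, m/s^2
print(f"thrust = {F:.2e} N, acceleration = {a:.2f} m/s^2")
```

Even with a laser drawing ten times the world's power, the acceleration of a 1000-ton craft is only of order 1 m/s², which is why the beam must act over a very long time and distance.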
Back in the halcyon days of the space program such infrastructure appeared likely. The space shuttle was advanced as the workhorse that would facilitate such space activities. Yet the history of the shuttle has fallen far short of these early expectations. It was advanced as a platform for launching satellites, but was not used nearly as extensively as envisioned: the last applied satellite system was deployed in 1993, and the Chandra satellite was the last astronomical system deployed in orbit. Since then the shuttle has not lofted a satellite. Its four servicing missions for the Hubble Space Telescope appear to have been worth the shuttle flights. While the construction of the ISS has been a success, it appears to be a success without a real purpose or mission. The loss of the Columbia in 2003 has effectively stranded the ISS. There are current plans for a lunar base, but so far this plan appears malformed, with no real purpose. Such a lunar base, if it is established, will literally be a pup tent compared to what would be required to furnish a power station and laser capable of pushing a sailcraft to the stars.
It might be that the recent new space initiative [9.1] will propel the developments required to eventually generate the infrastructure for a starsail. First, it requires that any manned spaceflight and basing of personnel on the moon have some credible scientific purpose. This would mean lunar missions that facilitate astronomical facilities, such as gravity wave detectors and optical interferometers, where astronauts deploy and maintain equipment in ways that robots or telepresence from Earth cannot. This implies at first intermittent missions to the moon rather than a permanent human presence there. Whether this will lead to a permanent human presence on the moon is uncertain. The problem is that robotic and telepresence capabilities are likely to far outpace any advancement in manned spaceflight. This might mean an expanded robotic presence on the moon, where a lunar power station might be constructed and maintained robotically.
The rocket equation is a major impediment to expanded space flight. The fuel and energy requirements are very large. Further, in the early 21st century energy depletion issues are starting to make their presence known. A highly speculative way to solve this problem has been suggested: the space elevator. This is a very massive construction consisting of a cable attached to some point on the Earth's surface that rises up past the geosynchronous orbital radius of about 42,000 km (an altitude of roughly 36,000 km). Beyond this point is a large counterweight mass. Because this mass is forced to revolve with the Earth faster than its natural orbital speed, by Newton's laws it induces a large tension on the whole tether or cable. The tether expands in size and mass as it approaches the massive counterweight beyond geosynchronous orbit. The tether may start out with a diameter of a few centimeters at the Earth's surface and become hundreds of kilometers in diameter at geosynchronous orbit. Along this tether an elevator could lift people and equipment to geosynchronous orbit. It is proposed that high-strength nanotube fibers may be used to weave the tether, but so far nanotubes of this sort have not been developed. All of this requires that millions of tons of material be lofted into space. Of course this has an obviously "pie in the sky" sound to it. The scale is colossal, and its construction would require the use of many rockets, which the space elevator is meant to replace. So it is hard at this time to seriously contemplate the space elevator as a realistic prospect.
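The burden imposed by the rocket equation can be illustrated with a short sketch. The exhaust velocity and delta-v figures are illustrative assumptions (roughly those for a hydrogen-oxygen rocket reaching low Earth orbit, including losses), not values from the text:

```python
import math

# Tsiolkovsky rocket equation: delta-v = v_e * ln(m0/m1), so the
# initial-to-final mass ratio grows exponentially with delta-v.
v_e = 4400.0   # m/s, assumed LOX/LH2 exhaust velocity
dv = 9400.0    # m/s, assumed delta-v to low Earth orbit with losses

mass_ratio = math.exp(dv / v_e)   # required m0/m1
print(f"mass ratio m0/m1 = {mass_ratio:.1f}")
```

A mass ratio of roughly 8 to 9 for merely reaching orbit shows why lifting millions of tons of elevator material by rocket is so forbidding.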
This is the challenge for the starsail: it requires a growing capability in space, and the recent history of space flight is not a very good precedent. The ISS and space shuttles are slated for retirement by 2010. NASA's plans for a return to the moon and missions to Mars face serious budgetary and political problems; it is at best a 50-50 chance whether this program will take shape. Current public interest in such space programs is nowhere near what it was in the 1960s during the ramp-up to the Apollo lunar missions. However, interest in the unmanned programs remains fairly solid, which so far means some future for space science. It has to be admitted that these have had a far greater scientific payoff than manned spaceflight.
The principal difficulty with the relativistic rocket is antimatter. How does the rocket store antimatter, and where is it acquired to begin with? A great deal has to be generated. For a spacecraft that starts out at 200 metric tons and burns up half its mass, 50 metric tons of antimatter are required. This is an energy equivalent of 4.5 × 10²¹ joules, as much energy as is currently generated over a 100-year period. This of course ignores thermodynamic losses in generating the antimatter. This is obviously a huge problem that would need to be solved.
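The energy figure follows directly from E = mc² applied to the 50 metric tons of antimatter; a minimal check:

```python
# Rest-mass energy of the antimatter load for the relativistic rocket,
# using the chapter's figure of 50 metric tons.
c = 2.998e8          # speed of light, m/s
m_antimatter = 50e3  # 50 metric tons in kg

E = m_antimatter * c**2   # energy equivalent, joules
print(f"E = {E:.2e} J")   # ~4.5e21 J, matching the text
```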
The problem could be solved only by some exotic physics, in particular if baryon number can be violated. Such a violation would be p → e⁺ + γ. If this could be done then there would be no need to generate large amounts of antimatter. The problem is that producing such a violation is not easy, and doing so requires study into deeper foundations of physics. Baryons, such as the proton, are composed of quarks. These particles have been confirmed in experiments over the past 30 years. Curiously they are bound in such a way that they can't be released from a proton, or any bound state of quarks [9.2]. Quarks occur in what are called doublets. The first of these is the (u, d) doublet of the up quark and the down quark. The proton consists of two up quarks and a down quark. The up quark has an electrical charge of +⅔e and the down quark has a charge of −⅓e. These quarks are bound to each other by the intermediary bosons of the Quantum ChromoDynamic (QCD) gauge field. The analogue of charge for this force is called color. The QCD force is likely unified with the electromagnetic field and the weak interaction field, responsible for β decay in nuclei, so that at high energy these three forces become embedded in a single gauge field. At high energy a quark and a lepton, which carries a unit charge for the weak interaction, can transform into each other. This is similar to the transformations or symmetries discussed with Newtonian mechanics and relativity. This transformation can convert an up quark and a down quark into a positron. Hence a proton may be converted into pure energy by a process such as p → e⁺ + u + ū, where ū is the anti-up quark, with a related channel producing ν̄ₑ, an anti-electron-neutrino. The bound state between the up quark and the anti-up quark is a neutral meson (the π⁰), which is unstable and decays away, essentially into photons. The intermediary X boson of this process exists in the grand unified field theory.
The problem is that this process only occurs commonly at interaction energies a trillion times larger than what current high-energy machines can reach. Still, the X boson will occur in quantum fluctuations and should give a decay rate for the proton, implying a proton lifetime of about 10²⁹ yr, which in principle should be detectable. So far attempts to find the neutrino signature of this decay have been null.
The above describes the SU(5) Grand Unified Theory (GUT), where the designation SU(5) refers to the particular algebraic structure of the above transformations. The failure to detect neutrinos from proton decay indicates that this particular GUT is probably wrong. However, there are other GUT models. It is likely that on some fundamental level quarks and leptons are interchangeable, and their respective gauge theories interchangeable within some unified gauge field. So there must be, on a fundamental level, a way that the proton can be annihilated so as to violate baryon number. However, this may not be practically done with particle physics. If there is an intermediary X boson, as indicated above, it must in some fashion be produced at energies far lower than 10¹⁵ GeV, where current particle accelerators are at the 10³ GeV scale. Quantum field theory is structured around the renormalization group, which is a way of computing parameters on various scales. The renormalization group has its origins in the regularization process in quantum field theory that eliminates infinities. This is a technical issue that we will not explore, for this book is dedicated largely to classical physics. However, it might be the case that subtleties exist here that permit physics at a very high energy scale to be rescaled to lower energy by some process. Of course this is pure speculation, and in fact is most likely wrong.
Another way that protons can be destroyed is with black holes. It might sound strange to suggest black holes here, but tiny quantum black holes might be produced in high-energy experiments in the near future. If so, it may then be possible to convert protons directly into energy. A black hole is known as the collapsed remnant of a star that imploded under its own gravity after its nuclear fuel was exhausted. Such black holes have been found in relative abundance, and at the centers of galaxies monstrous black holes have also been found. These large astrophysical black holes are obviously outside our ability to produce. Yet tiny black holes can in principle exist as well, and these might be used to generate energy through the direct conversion of matter to energy.
Karl Schwarzschild derived the first solution to Einstein's general relativity in 1916, while serving as an officer in the German army. His solution was for a static (nonrotating) spherical mass, and it was later used to derive the perihelion advance in the orbit of Mercury and the lensing effect of the sun. The latter was confirmed in observations of a solar eclipse from Brazil after the end of World War I. His solution contains the factor a = 1 − 2GM/rc², which enters into the metric line element [9.3]
ds² = −a dt² + a⁻¹dr² + r²(dθ² + sin²θ dφ²).    (9.2)
For the sun the radius is large, so that r ≫ 2GM/c² and the curvature near the sun is relatively small. However, in principle this solution exists for r ≤ 2GM/c². At equality this line element "explodes," but this turns out to be a problem with coordinate choice and may be removed by using a different set of coordinates. For r < 2GM/c² the time and radial parts of the metric change sign. The inward radial direction assumes the role of time. A particle in this region falls inward in much the same way that we progress forward inexorably into the future. Hence escape from inside this region of the black hole is impossible. The line element also diverges to "infinity" as r → 0. This singularity turns out to be real, and not an illusion of coordinate choice.
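The Schwarzschild radius r = 2GM/c² in equation (9.2) can be made concrete by evaluating it for the sun, which shows why the exterior solution is everywhere far outside this radius:

```python
# Schwarzschild radius r_s = 2GM/c^2 for the sun. The solar radius
# (~7e8 m) vastly exceeds r_s, so curvature at the surface is small.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

r_s = 2 * G * M_sun / c**2
print(f"r_s = {r_s/1e3:.2f} km")   # about 2.95 km
```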
Schwarzschild died of an illness on the Russian front in 1916, and so never saw his solution employed to confirm a prediction of general relativity.
This solution aroused some controversy, but it was generally thought that the curious properties at the Schwarzschild radius r = 2GM/c² were a mathematical defect and not physically real. Such mathematical defects occur in physics, and physicists most often ignore them. However, in 1939 Robert Oppenheimer found that this solution did predict that a star which collapses to within the Schwarzschild radius will cease to be observable, and that its evolution beyond this point is unpredictable. He also found that the singularity at the center is physically real, or at least real within classical physics. This was demonstrated by showing that any material that could resist further collapse in the region r < 2GM/c² would have to propagate pressure waves at a speed greater than light. As r → 0 the curvature of spacetime diverges, as does the tidal force on any extended body; anything that enters the black hole is completely destroyed. As a rule physicists do not like such divergences. Yet here the divergence is surrounded by the Schwarzschild radial limit of observability, called an event horizon, and so it is not observable.
Further mathematical work in general relativity demonstrated that a black hole is characterized by only three physical quantities: its mass, angular momentum and charge. All other physical characteristics of the matter and energy that composed the black hole are irretrievably lost. There is a measure of controversy over this, but for our purposes we will say these properties are either destroyed or so completely scrambled as to be erased. This is the "no hair" result [9.3]: no properties of matter that constituted the black hole are observable. Of course if one were to observe something falling towards the black hole it would appear to slow down, and any clock on it would be slowed as it progresses towards the event horizon. By corollary, light emitted by any object is also redshifted, and becomes so redshifted as the source approaches the horizon that the object disappears from view. So the black hole is a gravity pit: things that enter it leave the observable universe, as experienced by any observer falling in, while to an outside observer things falling into the black hole are redshifted out of view. This also means that a black hole will only acquire mass and not lose it. Hence the black hole is a self-growing eater of matter, destroying anything that enters it.
Black holes have been identified in the universe. Though a black hole is invisible, its gravity field exerts effects on its environment. Most often a black hole results from the implosion of a stellar core in a star that explodes off its outer layers in a supernova event. Often these black holes orbit another star, and if this orbit is tight enough, material from the star is gravitationally pulled into the black hole. This material becomes ever more heated and energetic as it spirals towards the black hole in an accretion disk, emitting X-ray radiation that is a signature of the black hole. A number of these have been identified. Mounting evidence indicates that the centers of galaxies contain black holes with masses of millions to several billion solar masses.
There is a bit of a problem with the black hole. A black hole that absorbs a piece of matter with a temperature appears to bury it away, along with the temperature of the mass. This is a violation of the laws of thermodynamics. A black hole permits nothing to escape from the interior shrouded by its event horizon, which suggests that the temperature of a black hole should be zero. The disappearance of a chunk of mass, with some temperature and entropy, is a violation of the second law of thermodynamics; and if the black hole has a zero temperature, this is a violation of the third law of thermodynamics. So something is amiss. Jacob Bekenstein treated the black hole as something composed of an indistinguishable set of particles in order to compute its entropy and temperature. His result is rather stunning, for it implies that this gravity sink into oblivion must emit radiation in order to have a temperature. The resolution of one problem raises another question.
Stephen Hawking demonstrated theoretically how the temperature of a black hole is due to quantum radiance. A black hole emits quanta of radiation spontaneously in much the same way that a nucleus may spontaneously decay. The quanta of radiation emitted by a black hole are photons that quantum mechanically tunnel out of the hole. This tunnelling is an aspect of the Heisenberg uncertainty principle: a spread of energy multiplied by a spread in time, usually the time interval of a measurement, is on the order of the unit of action ℏ = h/2π. Similarly, a particle localized in a volume may spontaneously appear elsewhere with some spread of momentum. Thus a tiny unit of mass in a black hole may appear away from the black hole and escape as radiation. This mechanism implies that a black hole will over time quantum mechanically decay away. The lifetime for a black hole is found to be T ≈ G²m³/ℏc⁴, which for a solar mass black hole is T ~ 6.6 × 10⁷⁷ sec, or 2.1 × 10⁷⁰ years. This of course assumes that the black hole is not absorbing matter. A black hole of a billion grams will decay away in about 0.1 sec. This continues down to T ~ 10⁻⁴³ sec for a Planck mass black hole. A Planck mass black hole is one whose Compton wavelength is equal to its Schwarzschild radius. For λ = ℏ/mc and r = λ = 2Gm/c² it may be demonstrated that a Planck scale black hole has a radius Lₚ = √(ℏG/c³) = 1.6 × 10⁻³³ cm and a mass of 2.2 × 10⁻⁵ g. The energy required to probe this region is 10¹⁹ GeV, which is 16 orders of magnitude greater than what high-energy machines are capable of.
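The Planck length and mass quoted above follow from combining ℏ, G and c; a quick check (ignoring the factor of 2 in the Schwarzschild radius, as the quoted values do):

```python
import math

# Planck length L_p = sqrt(hbar*G/c^3) and Planck mass m_p = sqrt(hbar*c/G).
hbar = 1.055e-34   # reduced Planck constant, J s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s

L_p = math.sqrt(hbar * G / c**3)   # Planck length, m
m_p = math.sqrt(hbar * c / G)      # Planck mass, kg
print(f"L_p = {L_p*100:.2e} cm")   # ~1.6e-33 cm
print(f"m_p = {m_p*1000:.2e} g")   # ~2.2e-5 g
```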
The quantum radiance of black holes has raised a number of questions central to the nature of quantum mechanics. To understand this on a deeper level, various theories within the field of string theory have been advanced to account for the proper statistics of black hole decay. One development which has emerged is that the black hole might appear at energies much lower than expected. Some theoretical work suggests that so-called soft black holes might emerge at energies accessible to the LHC accelerator to be started at Geneva in 2007. I have worked out some theory which suggests that the Planck scale might be renormalized to larger scales so that soft black holes might also emerge [9.4][9.5]. Further, this might be associated with the Higgs field. If these or related theories should turn out to be realistic then a proton could in principle be absorbed by such a black hole and converted into a positron and other particle-antiparticle pairs or photons.
It is then possible that a matter-to-energy conversion machine might be developed. This is required to make the relativistic rocket possible. Such a rocket would have to carry such a converter to generate the large amounts of energy required to achieve γ = 12. Of course this assumes that nature cooperates with our theories, and even if so, that this could be made into a working device. If such technology is developed in the later 21st century or in the 22nd it will be the ultimate source of energy, surpassing anything that nuclear energy could muster. If such a device could be made compact enough then a modest sized relativistic rocket could be constructed to study another star system.
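The scale of the energy demanded by γ = 12 can be estimated from the relativistic kinetic energy E = (γ − 1)mc²; this sketch computes the requirement per kilogram of final ship mass:

```python
# Relativistic kinetic energy E_k = (gamma - 1) m c^2 at gamma = 12,
# evaluated per kilogram of final ship mass.
c = 2.998e8     # speed of light, m/s
gamma = 12.0
m = 1.0         # kg of final ship mass

E_k = (gamma - 1) * m * c**2
print(f"E_k = {E_k:.2e} J per kg")   # ~9.9e17 J per kg
```

Roughly 10¹⁸ joules per kilogram, about the yield of hundreds of megaton-class fusion weapons, makes plain why only direct matter-to-energy conversion could plausibly power such a craft.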