John F. Donoghue

Department of Physics, University of Massachusetts

Each field has a set of questions which are universally viewed as important, and these questions motivate much of the work in the field. In particle physics, several of these questions are directly related to experimental problems. Examples include questions such as: Does the Higgs boson exist and, if so, what is its mass? What is the nature of the dark matter seen in the Universe? What is the mechanism that generated the net number of baryons in the Universe? For these topics, there is a well posed problem related to experimental findings or theoretical predictions. These are problems that must be solved if we are to achieve a complete understanding of the fundamental theory.

There also exists a different set of questions which have a more aesthetic character. In these cases, it is not as clear that a resolution is required, yet the problems motivate a search for certain classes of theories. Examples of these are the three 'naturalness' or 'fine-tuning' problems of the Standard Model; these are associated with the cosmological constant Λ, the energy scale of electroweak symmetry-breaking v and the strong CP-violating angle θ. As will be explained more fully below, these are free parameters in the Standard Model that seem to have values 10 to 120 orders of magnitude smaller than their natural values and smaller than the magnitude of their quantum corrections. Thus their 'bare' values plus their quantum corrections need to be highly fine-tuned in order to obtain the observed values. Because of the magnitude of this fine-tuning, one suspects that there is a dynamical mechanism at work that makes the fine-tuning natural. This motivates many of the theories of new physics beyond the Standard Model. A second set of aesthetic problems concern the parameters of the Standard Model, i.e. the coupling constants and masses of the theory. While the Standard Model is constructed simply using gauge symmetry, the parameters themselves seem not to be organized in any symmetric fashion. We would love to uncover the principle that organizes the quark and lepton masses (sometimes referred to as the 'flavour problem'), for example, but attempts to do so with symmetries or a dynamical mechanism have been unsuccessful.

Universe or Multiverse?, ed. Bernard Carr. Published by Cambridge University Press. © Cambridge University Press 2007.

These aesthetic questions are very powerful motivations for new physics. For example, the case for low energy supersymmetry, or other TeV scale dynamics to be uncovered at the Large Hadron Collider (LHC), is based almost entirely on the fine-tuning problem for the scale of electroweak symmetry-breaking. If there is new physics at the TeV scale, then there need not be any fine-tuning at all and the electroweak scale is natural. We are all greatly looking forward to the results of the LHC, which will tell us if there is in fact new physics at the TeV scale. However, the aesthetic questions are of a different character from direct experimental ones concerning the existence and mass of the Higgs boson. There does not have to be a resolution to the aesthetic questions - if there is no dynamical solution to the fine-tuning of the electroweak scale, it would puzzle us, but would not upset anything within the fundamental theory. We would just have to live with the existence of fine-tuning. However, if the Higgs boson is not found within a given mass range, it would falsify the Standard Model.

The idea of a multiverse will be seen to change drastically the way in which we perceive the aesthetic problems of fine-tuning and flavour. In a multiverse, the parameters of the theory vary from one domain to another. This naturally leads to the existence of anthropic constraints - only some of these domains will have parameters that reasonably allow the existence of life. We can only find ourselves in a domain which satisfies these anthropic constraints. Remarkably, the anthropic constraints provide plausible 'solutions' to two of the most severe fine-tuning problems: those of the cosmological constant and the electroweak scale. Multiverse theories also drastically reformulate some of the other problems - such as the flavour problem. However, at the same time, these theories raise a new set of issues for new physics. My purpose in this chapter is to discuss how the idea of the multiverse reformulates the problems of particle physics.

It should be noted up front that the Anthropic Principle [1-3] has had a largely negative reputation in the particle physics community. At some level this is surprising - a community devoted to uncovering the underlying fundamental theory might be expected to be interested in exploring a suggestion as fundamental as the Anthropic Principle. I believe that the problem really lies in the word 'Principle' more than in the word 'Anthropic'.

The connotation of 'Principle' is that of an underlying theory. This leads to debates over whether such a principle is scientific, i.e. whether it can be tested. However, 'anthropics' is not itself a theory, nor even a principle. Rather, the word applies to constraints that naturally occur within the full form of certain physical theories. However, it is the theory itself that needs to be tested, and to do this one needs to understand the full theory and pull out its predictions. For theories that lead to a multiverse, anthropic constraints are unavoidable. As we understand better what types of theory have this multiverse property, the word anthropic is finding more positive applications in the particle physics community. This article also tries to describe some of the ways that anthropic arguments can be used to positive effect in particle physics.

The Lagrangian of the Standard Model (plus General Relativity) encodes our present understanding of all observed physics except for dark matter [4]. The only unobserved ingredient of the theory is the Higgs boson. The Standard Model is built on the principle of gauge symmetry - that the Lagrangian has an SU(3) ⊗ SU(2)_L ⊗ U(1) symmetry at each point of spacetime. This, plus renormalizability, is a very powerful constraint and uniquely defines the structure of the Standard Model up to a small number of choices, such as the number of generations of fermions. General Relativity is also defined by a gauge symmetry - local coordinate invariance. The resulting Lagrangian can be written in compact notation:

L = -(1/4) F_μν F^μν + ψ̄ iγ^μ D_μ ψ + ψ̄ Γ ψ φ + |D_μ φ|² - V(φ) + (1/16πG) R.    (15.1)

Experts recognize the various terms here as indications of the equations governing the photon, gluons and W-bosons (the F² terms), quarks and leptons (the ψ terms), the Higgs field (φ) and gravity (R), along with a set of interactions constrained by the gauge symmetry. Of course, such a simple form belies a very complex theory, and tremendous work is required to understand the predictions of the Standard Model. But the greatest lesson of particle physics of the past generation is that nature organizes the Universe through a simple set of gauge symmetries.

However, the story is not complete. The simple-looking Lagrangian given by Eq. (15.1), and the story of its symmetry-based origin, also hide a far less beautiful fact. To really specify the theory, we need not only the Lagrangian, but also a set of twenty-eight numbers which are the parameters of the theory. These are largely hidden underneath the compact notation of the Lagrangian. Examples include the masses of all the quarks and leptons (including neutrinos), the strengths of the three gauge interactions, the weak mixing angles describing the charged current interactions of quarks and leptons, the overall scale of the weak interaction, the cosmological constant and Newton's gravitational constant. None of these parameters is predicted by the theory. The values that have been uncovered experimentally do not obey any known symmetry pattern, and the Standard Model provides no principle by which to organize them. After the beauty of the Standard Model Lagrangian, these seemingly random parameters reinforce the feeling that there is more to be understood.

Three of the twenty-eight parameters are especially puzzling, because their values appear to be unnaturally small. Naturalness and fine-tuning have very specific technical meanings in particle physics. These meanings are related to, but not identical to, the common usage in non-technical settings. The technical version is tied to the magnitude of quantum corrections. When one calculates the properties of any theory using perturbation theory, quantum mechanical effects give additive corrections to all its parameters. Perturbation theory describes the various quantities of a theory as a power series in the coupling constants. The calculation involves summing over the effects of all virtual states that are possible in the theory, including those at high energy. The quantum correction refers to the terms in the series that depend on the coupling constants. The 'bare' value is the term independent of the coupling constants. The physical measured value is the sum of the bare value and the quantum corrections.

The concept of naturalness is tied to the magnitude of the quantum corrections. If the quantum correction is of the same order as (or smaller than) the measured value, the result is said to be natural. If, on the contrary, the measured value is much smaller than the quantum correction, then the result is unnatural because the bare value and the quantum correction appear to have an unexpected cancellation to give a result that is much smaller than either component. This is an unnatural fine-tuning.

In fact, the quantum correction is often not precisely defined. The ambiguity can arise due to possible uncertainties of the theory at high energy. Since physics is an experimental science, and we are only gradually uncovering the details of the theory as we probe higher energies, we do not know the high energy limits of our present theory. We expect new particles and interactions to be uncovered as we study higher energies. Since the quantum correction includes effects from high energy, there is an uncertainty about their extent and validity. We understand the theory up to some energy - let us call this Emax - but beyond this new physics may enter. The quantum corrections will typically depend on the scale Emax. We will see below that, in some cases, the theory may be said to be natural if one employs low values of Emax but becomes unnatural for high values.

The Higgs field in the Standard Model takes a constant value everywhere in spacetime. This is called its 'vacuum expectation value', abbreviated as vev, which has the magnitude v = 246 GeV. This is the only dimensionful constant in the electroweak interactions and hence sets the scale for all dimensionful parameters of the electroweak theory. For example, all of the quark and lepton masses are given by dimensionless numbers r_i (the Yukawa couplings) times the Higgs vev, m_i = r_i v/√2. However, the Higgs vev is one of the parameters which has a problem with naturalness. While it depends on many parameters, the problem is well illustrated by its dependence on the Higgs coupling to the top quark. In this case, the quantum correction grows quadratically with E_max. One finds

v² = v₀² + c (r_t²/λ) E_max²,    (15.2)

where r_t is the Yukawa coupling for the top quark, v₀ is the bare value, λ is the self-coupling of the Higgs, c is a coefficient of order unity and the second term is the quantum correction. Since v = 246 GeV and r_t ~ λ ~ 1, this would be considered natural if E_max ~ 10^3 GeV, but it would be unnatural by twenty-six orders of magnitude if E_max ~ 10^16 GeV (characteristic of the Grand Unified Theories which unite the electroweak and strong interactions) or thirty-two orders of magnitude if E_max ~ 10^19 GeV (characteristic of the Planck mass, which sets the scale for quantum gravity).
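To get a feel for these numbers, here is a small numerical sketch. The loop factor 3/(8π²) is an assumed, typical one-loop coefficient standing in for the order-one constant c, not the exact Standard Model value; the qualitative conclusion does not depend on it.

```python
import math

v = 246.0            # Higgs vev in GeV
r_t = 1.0            # top-quark Yukawa coupling, roughly 1
lam = 1.0            # Higgs self-coupling, taken ~1 for illustration
loop = 3.0 / (8.0 * math.pi ** 2)   # assumed order-one loop factor

# Compare the quadratic quantum correction to v^2 for three cut-offs:
# the TeV scale, the GUT scale and the Planck scale.
for e_max in (1e3, 1e16, 1e19):
    correction = loop * (r_t ** 2 / lam) * e_max ** 2
    ratio = correction / v ** 2
    print(f"E_max = {e_max:.0e} GeV: correction / v^2 = {ratio:.1e}")
```

For E_max ~ 10^3 GeV the correction is comparable to v² itself (natural), while for the GUT and Planck scales it exceeds v² by roughly twenty-six and thirty-two orders of magnitude, requiring a correspondingly delicate cancellation against the bare value.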

If we philosophically reject fine-tuning and require that the Standard Model be technically natural, this requires that Emax should be around 1 TeV. For this to be true, we need a new theory to enter at this scale that removes the quadratic dependence on Emax in Eq. (15.2). Such theories do exist - supersymmetry is a favourite example. Thus the argument against fine-tuning becomes a powerful motivator for new physics at the scale of 1 TeV. The LHC has been designed to find this new physics.

An even more extreme violation of naturalness involves the cosmological constant Λ. Experimentally, this dimensionful quantity is of order Λ ~ (10^-3 eV)^4. However, the quantum corrections to it grow as the fourth power of the scale E_max:

Λ = Λ₀ + c E_max⁴,    (15.3)

with the constant c being of order unity. This quantity is unnatural for all particle physics scales - by a factor of 10^48 for E_max ~ 10^3 GeV, up to 10^124 for E_max ~ 10^19 GeV.
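The 10^124 figure at the Planck scale is simple dimensional arithmetic, as the following sketch shows:

```python
import math

lam_quarter = 1e-12   # observed Lambda^(1/4), of order 1e-3 eV = 1e-12 GeV
e_max = 1e19          # Planck scale in GeV

# Ratio of the E_max^4 quantum correction to the observed value of Lambda.
mismatch = (e_max / lam_quarter) ** 4
print(f"correction / observed ~ 10^{math.log10(mismatch):.0f}")
```

The ratio (10^19 GeV / 10^-12 GeV)^4 = (10^31)^4 indeed comes out as 10^124.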

It is unlikely that there is a technically natural resolution to the cosmological constant's fine-tuning problem - this would require new physics at 10^-3 eV. A valiant attempt at such a theory is being made by Sundrum [5], but it is highly contrived to have new dynamics at this extremely low scale which modifies only gravity and not the other interactions.

Finally, there is a third classic naturalness problem in the Standard Model - that of the strong CP-violating parameter θ. It was realized that QCD can violate CP invariance, with a free parameter θ which can, in principle, range from zero up to 2π. An experimental manifestation of this CP-violating effect would be the existence of a non-zero electric dipole moment for the neutron. The experimental bound on this quantity requires θ < 10^-10. The quantum corrections to θ are technically infinite in the Standard Model if we take the cut-off scale E_max to infinity. For this reason, we would expect that θ is a free parameter in the model of order unity, to be renormalized in the usual way. However, there is a notable difference from the two other problems above in that, if the scale E_max is taken to be very large, the quantum corrections are still quite small. This is because they arise only at a very high order in perturbation theory. So, in this case, the quantum corrections do not point to a particular scale at which we expect to find a dynamical solution to the problem.
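The origin of the 10^-10 bound can be sketched with rough order-of-magnitude numbers. Both inputs below - the commonly quoted QCD estimate that the neutron electric dipole moment is of order θ × 10^-16 e·cm, and an experimental bound of about 10^-26 e·cm - are approximate values assumed here for illustration:

```python
# Rough order-of-magnitude sketch; both numbers below are approximate,
# illustrative values rather than precise experimental inputs.
dn_per_theta = 1e-16   # neutron EDM per unit theta, in e.cm (QCD estimate)
dn_bound = 1e-26       # approximate experimental bound on the EDM, in e.cm

theta_bound = dn_bound / dn_per_theta
print(f"theta < ~{theta_bound:.0e}")
```

Dividing the bound by the estimate gives θ below roughly 10^-10, the value quoted in the text.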

The standard response to the fine-tuning problems described above is to search for dynamical mechanisms that explain the existence of the fine-tuning. For example, many theories for physics beyond the Standard Model (such as supersymmetry, technicolour, large extra dimensions, etc.) are motivated by the desire to solve the fine-tuning of the Higgs vev. These are plausible, but as yet have no experimental verification. The fine-tuning problem for the cosmological constant has been approached less successfully; there are few good suggestions here. The strong CP problem has motivated the theory of axions, in which an extra symmetry removes the strong CP violation, but requires a very light pseudo-scalar boson - the axion - which has not yet been found.

However, theories of the multiverse provide a very different resolution of the two greatest fine-tuning problems, that of the Higgs vev and the cosmological constant. This is due to the existence of anthropic constraints on these parameters. Suppose for the moment that life can only arise for a small range of values of these parameters, as will be described below. In a multiverse, the different domains will have different values of these parameters. In some domains, these parameters will fall in the range that allows life. In others, they will fall outside this range. It is then an obvious constraint that we can only observe those values that fall within the viable range. For the cosmological constant and the Higgs vev, we can argue that the anthropic constraints only allow parameters in a very narrow window, all of which appears to be fine-tuned by the criteria of Section 15.3. Thus the observed fine-tuning can be thought to be required by anthropic constraints in multiverse theories.

The first application of anthropic constraints to explain the fine-tuning of the cosmological constant - even before this parameter was known to be non-zero - was due to Linde [6] and Weinberg [7]; see also refs. [8-10]. In particular, Weinberg gave a physical condition - noting that, if the cosmological constant was much different from what it is observed to be, galaxies could not have formed. The cosmological constant is one of the ingredients that governs the expansion of the Universe. If it had been of its natural scale of (10^3 GeV)^4, the Universe would have collapsed or been blown apart (depending on the sign) in a fraction of a second. For the Universe to expand slowly enough that galaxies can form, Λ must lie within roughly an order of magnitude of its observed value. Thus the fine-tuning by up to 124 orders of magnitude is spurious; we would only find ourselves in one of the rare domains with a tiny value of the cosmological constant.

Other anthropic constraints can be used to explain the fine-tuning of the Higgs vev. In this case, the physical constraint has to do with the existence of atoms other than hydrogen. Life requires the complexity that comes from having many different atoms available to build viable organisms. It is remarkable that these atoms do not exist for most values of the Higgs vev, as has been shown by my collaborators and myself [11,12]. Suppose for the moment that all the parameters of the Standard Model are held fixed, except for v which is allowed to vary. As v increases, all of the quark masses grow, and hence the neutron and proton masses also increase. Likewise, the neutron-proton mass-splitting increases in a calculable fashion. The most model-independent constraint on v then comes from the value when the neutron-proton mass-splitting becomes larger than the 10 MeV per nucleon that binds the nucleons into nuclei; this occurs when v is about five times the observed value. When this happens, all bound neutrons will decay to protons [11,12]. However, a nucleus of only protons is unstable and will fall apart into hydrogen. Thus complex nuclei will no longer exist.
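The 'about five times' figure can be reproduced with a back-of-the-envelope sketch. The numbers below - a quark-mass contribution of 2.1 MeV to the neutron-proton splitting, scaling linearly with v, and an electromagnetic contribution of -0.8 MeV that does not - are illustrative values assumed here, not the detailed calculation of refs. [11, 12]:

```python
def np_splitting(v_ratio):
    """Neutron-proton mass splitting (MeV) when the vev grows by v_ratio."""
    quark_part = 2.1 * v_ratio   # scales with v (quark mass difference)
    em_part = -0.8               # roughly independent of v
    return quark_part + em_part

BINDING = 10.0   # MeV per nucleon, typical nuclear binding energy

# Find roughly where the splitting overwhelms the binding energy.
v_ratio = 1.0
while np_splitting(v_ratio) < BINDING:
    v_ratio += 0.1
print(f"neutrons in nuclei become unstable at v ~ {v_ratio:.1f} x observed")
```

With these inputs the threshold comes out close to five times the observed vev, in line with the constraint quoted above.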

A tighter constraint takes into account the calculation of the nuclear binding energy, which decreases as v increases. This is because the nuclear force, especially the central isoscalar force, is highly dependent on pion exchange and, as v increases, the pion mass also increases, making the force of shorter range and weaker. In this case, the criteria for the existence of heavy atoms require v to be less than a few times its observed value. Finally, a third constraint - of comparable strength - comes from the need to have deuterium stable, because deuterium was involved in the formation of the elements in primordial and stellar nucleosynthesis [11,12]. In general, even if the other parameters of the Standard Model are not held fixed, the condition is that the weak and strong interactions must overlap. The masses of quarks and leptons arise in the weak interactions. In order to have complex elements, some of these masses must be lighter than the scale of the strong interactions and some heavier. This is a strong and general constraint on the electroweak scale. All of these constraints tell us that the viable range for the Higgs vev is not the thirty or so orders of magnitude described above, but only the tiny range allowed by anthropic constraints.

While anthropic constraints have the potential to solve the two greatest fine-tuning problems of the Standard Model, similar ideas very clearly fail to explain the naturalness problem of the strong CP-violating parameter θ [4]. For any possible value of θ in the allowed range from 0 to 2π, there would be little influence on life. The electric dipole moments that would be generated could produce small shifts in atomic energy levels but would not destabilize any elements. Even if a mild restriction could be found, there would be no logical reason why θ should be as small as 10^-10. Therefore the idea of a multiverse does nothing to solve this fine-tuning problem.

The lack of an anthropic solution to this problem is a very strong constraint on multiverse theories. It means that, in a multiverse ground state that satisfies the other anthropic constraints, the strong CP problem must generically be solved by other means. Perhaps the axion option, which appears to us to be an optional addition to the Standard Model, is in fact required to be present for some reason - maybe in order to generate dark matter in the Universe. Or perhaps there is a symmetry that initially sets θ to zero, in which case the quantum corrections shift it only by a small amount. This can be called the 'small infinity' solution, because - while the quantum correction is formally infinite - it is small when any reasonable cut-off is used. Thus the main problem in this solution is to find a reason why the bare value of θ is zero rather than some number of order unity. In any case, in multiverse theories the strong CP problem appears more serious than the other fine-tuning problems and requires a dynamical solution.1

The above discussion can be viewed as a motivation for multiverse theories. Such theories would provide an explanation of two of the greatest puzzles of particle physics. However, this shifts the focus to the actual construction of such physical theories. So far we have just presented a 'story' about a multiverse. It is a very different matter to construct a real physical theory that realizes this story.

The reason that it is difficult to construct a multiverse theory is that most theories have a single ground state, or at most a small number of ground states. It is the ground state properties that determine the parameters of the theory. For example, the Standard Model has a unique ground state, and the value of the Higgs vev in that state determines the overall scale for the quark masses etc. Sometimes theories with symmetries will have a set of discretely different ground states, but generally just a few. The utility of the multiverse to solve the fine-tuning problems requires that there be very many possible ground states. For example, if the cosmological constant has a fine-tuning problem of a factor of 1050, one would expect that one needs of order 1050 different ground states with different values of the cosmological constant in order to have the likelihood that at least one of these would fall in the anthropically allowed window.

In fact, such theories do exist, although they are not the norm. There are two possibilities: one where the parameters vary continuously and one where they vary in discrete steps. In the former case, the variation of the parameters in space and time must be described by a field. Normally such a field would settle into the lowest energy state possible, but there is a mechanism whereby the expansion of the Universe 'freezes' the value of the field and does not let it relax to its minimum [14-16]. However, since the present expansion of the Universe is very small, the forces acting on this field must be exceptionally tiny. There is a variant of such a theory which has been applied to the fine-tuning of the cosmological constant. However, it has proven difficult to extend this theory to the variation of other parameters.

1 Chapter 3 of this volume, by Wilczek [13], suggests a possible anthropic explanation in the context of inflationary models for why θ should be very small.

A more promising type of multiverse theory appears to be emerging from string theory. This originates as a 10- or 11-dimensional theory, although in the end all but four of the spacetime dimensions must be rendered unobservable to us, for example by being of very tiny finite size. Most commonly, the extra dimensions are 'compact', which means that they are of finite extent but without an endpoint, in the sense that a circle is compact. However, solutions to string theory seem to indicate that there are very many low energy solutions which have different parameters, depending on the size and shape of the many compact dimensions [17-21]. In fact, there are so many that one estimate puts the number of solutions that have the properties of our world - within the experimental error bars for all measured parameters - at of order 10^100. There would then be many more solutions with parameters outside the observed range. In this case, there are astonishingly many possible sets of parameters for solutions to string theory. This feature of having fantastically many solutions to string theory, in which the parameters vary as one moves through the space of solutions, is colloquially called the 'landscape'.

There are two key properties of these solutions. The first is that they are discretely different and not continuous [22]. The different states are described by different field values in the compact dimensions. These field values are quantized, because they need to return to the same value as one goes around the compact dimension. With enough fields and enough dimensions, the number of solutions rapidly becomes extremely large.
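The combinatorics behind such enormous counts is elementary: if each of K quantized fields can independently take one of n discrete values, the number of distinct solutions is n^K. The values n = 10 and K = 100 below are purely hypothetical, chosen only to show how quickly the count explodes:

```python
n_values = 10    # hypothetical number of quantized values per field
n_fields = 100   # hypothetical number of fields / compact cycles

n_states = n_values ** n_fields    # exact integer arithmetic in Python
print(f"number of solutions ~ 10^{len(str(n_states)) - 1}")
```

With these inputs the count is 10^100; modestly larger n or K pushes it far beyond.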

The second key property is that transitions between the different solutions are known [23-25]. This can occur when some of the fields change their values. From our 4-dimensional point of view, what occurs is that a bubble nucleates, in which the interior is one solution and the exterior is another one. The rate for such nucleations can be calculated in terms of string theory parameters. In particular, it apparently always occurs during inflation or at finite temperature. Nucleation of bubbles commonly leads to large jumps in the parameters, such as the cosmological constant, and the steps do not always go in the same direction.

These two properties imply that a multiverse is formed in string theory if inflation occurs. There are multiple states with different parameters, and transitions between these occur during inflation. The outcome is a universe in which the different regions - the interior of the bubble nucleation regions -have the full range of possible parameters.

String theorists long had the hope that there would be a unique ground state of the theory. It would indeed be wonderful if one could prove that there is only one true ground state and that this state leads to the Standard Model, with exactly the parameters seen in nature. It would be hard to imagine how a theory with such a high initial symmetry could lead only to a world with parameters showing as little symmetry as those of the Standard Model, such as m_u = 4 MeV, m_d = 7 MeV, etc. But if this were in fact shown, it would certainly prove the validity of string theory. Against this hope, the existence of a landscape and a multiverse seems perhaps disappointing. Without a unique ground state, we cannot use the prediction of the parameters as a proof of string theory.

However, there is another sense in which the string theory landscape is a positive development. Some of us who are working 'from the bottom up' have been led by the observed fine-tuning (in both senses of the word) to desire the existence of a multiverse with exactly the properties of the string theory landscape. From this perspective, the existence of the landscape is a strong motivation in favour of string theory, more immediate and pressing even than the desire to understand quantum gravity.

Inflation also seems to be a necessary ingredient for a multiverse [26-28]. This is because we need to push the boundaries between the domains far outside our observable horizon. Inflation neatly explains why we see a mostly uniform universe, even if the greater multiverse has multiple different domains. The exponential growth of the scale factor during inflation makes it reasonable that we see a uniform domain. However, today inflation is the 'simple' ingredient that we expect really does occur, based on the evidence of the flatness of the universe and the power spectrum of the cosmic microwave background temperature fluctuations. It is the other ingredient of the multiverse proposal - having very many ground states - that is much more difficult.

Let us be philosophical for a moment. Anthropic arguments and invocations of the multiverse can sometimes border on being non-scientific. You cannot test for the existence of other domains in the Universe outside the one visible to us - nor can you find a direct test of the Anthropic Principle. This leads some physicists to reject anthropic and multiverse ideas as being outside of the body of scientific thought. This appears to me to be unfair. Anthropic consequences appear naturally in some physical theories. However, there are nevertheless non-trivial limitations on what can be said in a scientific manner in such theories.

The resolution comes from the realization that neither the anthropic nor the multiverse proposal constitutes a concrete theory. Instead there are real theories, such as string theory, which have a multiverse property and lead to our domain automatically satisfying anthropic constraints. These are not vague abstractions, but real physical consequences of real physical theories. In this case, the anthropic and multiverse proposals are not themselves a full theory but rather the output of such a theory. Our duty as scientists is not to give up because of this but to find other ways to test the original theory. Experiments are reasonably local and we need to find some reasonably local tests that probe the original full theory.

However, it has to be admitted that theories with a multiverse property, such as perhaps the string landscape - where apparently 'almost anything goes' - make it difficult to be confident of finding local tests. Perhaps there are some consequences which always emerge from string theory for all states in the landscape. For example, one might hope that the bare strong CP-violating θ angle is always zero in string theory and that it receives only a small finite renormalization. However, other consequences would certainly be of a statistical nature that we are not used to. An example is the present debate as to whether supersymmetry is broken at low energy or high energy in string theory. It is likely that both possibilities are present, but the number of states of one type is likely to be very different (by factors of perhaps 10^100) from the number of states of the other type - although it is not presently clear which is favoured. If this is solved, it will be a good statistical prediction of string theory. If we can put together a few such statistical predictions, we can provide an effective test of the theory.

Of the parameters of the Standard Model, none are as confusing as the masses of the quarks and leptons. From the history of the periodic table and atomic/nuclear spectroscopy, we would expect that the masses would show some pattern that reveals the underlying physics. However, no such pattern has ever been found. In this section, I will describe a statistical pattern, namely that the masses appear randomly distributed with respect to a scale-invariant weight, and I will discuss how this can be the probe of a multiverse theory.

Fig. 15.1. The quark and lepton masses on a log scale. The result appears to be qualitatively consistent with a random distribution in ln m, and quantitative analysis bears this out.

In a multiverse or in the string theory landscape, one would not expect the quark and lepton masses to exhibit any pattern. Rather, they would be representative of one of the many possible states available to the theory. Consider the ensemble of ground states which have the other parameters of the Standard Model held fixed. In this ensemble, the quark and lepton masses are not necessarily uniformly distributed. Rather we could describe their distribution by some weight [29,30]. For example, perhaps this weight favours quarks and leptons with small masses, as is in fact seen experimentally. We would then expect that the quark masses seen in our domain are not particularly special but are typical of a random distribution with respect to this weight.

The quark masses appear mostly at low energy, yet extend to high energy. Pulling out the range of weights that could lead to this distribution requires a detailed study of their statistical properties. Yet it is remarkably easy to see that they are consistent with being scale-invariant. A scale-invariant weight means that the probability of finding the masses in an interval dm at any mass m scales as dm/m. This in turn means that the masses should be randomly distributed when plotted as a function of ln m. It is easy to see visually that this is the case; Fig. 15.1 shows the quark and lepton masses plotted on a logarithmic scale. One can readily see that this is consistent with being a random distribution. The case for a scale-invariant distribution can be quantified by studying the statistics of six or nine masses distributed with various weights [30]. When considering power-law weights of the form dm/m^δ, one can constrain the exponent δ to be greater than 0.8. The scale-invariant weight (δ = 1) is an excellent fit. One may also discuss the effects of anthropic constraints on the weights [30].
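As a sketch of what 'distributed with respect to a weight' means in practice, the power-law weight dm/m^δ can be sampled by inverting its cumulative distribution, and for δ = 1 the resulting values of ln m come out uniform. The mass window and sample size below are arbitrary illustrative choices of mine, not taken from the analysis of ref. [30]:

```python
import math
import random

def sample_mass(delta, m_min, m_max, rng):
    """Draw one mass from the power-law weight dm/m**delta on [m_min, m_max]
    by inverting the cumulative distribution."""
    u = rng.random()
    if abs(delta - 1.0) < 1e-12:
        # Scale-invariant case: ln m is uniform, so m = m_min * (m_max/m_min)**u.
        return m_min * (m_max / m_min) ** u
    e = 1.0 - delta
    return (m_min ** e + u * (m_max ** e - m_min ** e)) ** (1.0 / e)

rng = random.Random(0)
m_min, m_max = 1e-3, 1e3          # illustrative mass window (arbitrary units)

# For delta = 1, ln m should be uniform on [ln m_min, ln m_max]:
logs = [math.log(sample_mass(1.0, m_min, m_max, rng)) for _ in range(200000)]
mean = sum(logs) / len(logs)
var = sum((x - mean) ** 2 for x in logs) / len(logs)

mid = 0.5 * (math.log(m_min) + math.log(m_max))   # expected mean of ln m
width = math.log(m_max) - math.log(m_min)         # expected variance: width**2/12
print(mean, mid)            # sample mean of ln m vs. uniform prediction
print(var, width ** 2 / 12)  # sample variance of ln m vs. uniform prediction
```

The same inversion with δ ≠ 1 gives the non-scale-invariant weights whose statistics are compared against the observed masses in ref. [30].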

What should we make of this statistical pattern? In a multiverse theory, this pattern is the visible remnant of the underlying ensemble of ground states of different masses. An example of how this distribution could appear from a more fundamental theory is given by the Intersecting Brane Worlds solutions of string theory [31,32]. In these solutions, our 4-dimensional world appears as the intersection of solutions (branes) of higher dimension, much as a 1-dimensional line can be described as the intersection of two 2-dimensional surfaces. In these theories, the quark and lepton masses are determined by the area between three intersections of these surfaces. In particular, the masses are proportional to the exponential of (minus) this area, m ~ e^(-A). In a string landscape there might not be a unique area, but rather a distribution of areas. The mathematical connection is that, if these areas are distributed uniformly (i.e. with a constant weight), then the masses are distributed with a scale-invariant weight. In principle, the distribution of areas is a calculation that could be performed when we understand string theory better. Thus, we could relate solutions of string theory to the observed distribution of masses in the real world. This illustrates how we can test the predictions of a multiverse theory without a unique ground state.
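The change of variables behind this connection is easy to check numerically: if the area A is uniform on some interval and m = e^(-A), then p(m) = p(A) |dA/dm| ∝ 1/m, so ln m is uniform. The following sketch, with an arbitrary illustrative cutoff A_max of my own choosing (not taken from refs. [31,32]), makes the point:

```python
import math
import random

rng = random.Random(1)
A_max = 10.0  # illustrative upper bound on the brane-intersection area

# Areas distributed uniformly on [0, A_max]; masses m = exp(-A).
masses = [math.exp(-rng.uniform(0.0, A_max)) for _ in range(200000)]

# Change of variables: p(m) = p(A) |dA/dm| = (1/A_max) * (1/m), i.e. dm/m.
# Equivalently, ln m should be uniform on [-A_max, 0].  A coarse histogram
# of ln m should therefore be flat:
bins = [0] * 10
for m in masses:
    k = min(int((math.log(m) + A_max) / A_max * 10), 9)
    bins[k] += 1
print(bins)  # each bin should hold roughly one tenth of the samples
```

A flat histogram in ln m is exactly the scale-invariant weight dm/m discussed above, so a uniform distribution of areas in the landscape would reproduce the observed statistical pattern of the masses.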

The idea of a multiverse can make positive contributions to particle physics. In a multiverse, some of our main puzzles disappear, but they are replaced by new questions.

We have seen how the multiverse can provide a physical reason for some of the fine-tuning that seems to be found in nature. We have also stressed that two distinct meanings of the phrase 'fine-tuning' are used in different parts of the scientific literature. One meaning, often encountered in discussions of anthropic considerations, relates to the observation that the measured parameters seem to be highly tuned to the narrow window that allows life to exist. The other meaning is the particle physics usage described above, which concerns the relative size of the quantum corrections compared with the measured value. The latter usage has no a priori connection to the former. However, the idea of the multiverse unites the two uses - the requirement of life limits the possible range of the particle physics parameters and can explain why the measured values are necessarily so small compared with the quantum effects.

However, in other cases, the multiverse makes the problems harder. The strong CP problem is not explained by the multiverse. It is a clue that a dynamical solution to this problem has to be a generic feature of the underlying full theory.

The flavour problem of trying to understand the properties of the quarks and leptons also becomes reformulated. I have described how the masses appear to be distributed in a scale-invariant fashion. In a multiverse theory, it is possible that this is a reflection of the dynamics of the underlying theory and that this feature may someday be used as a test of the full theory.

We clearly have more to discover in particle physics. In answering the pressing experimental questions on the existence of the Higgs boson, the nature of dark matter and so on, we will undoubtedly learn more about the underlying theory. We also hope that the new physics that emerges will shed light on the aesthetic questions concerning the Standard Model. The idea of the multiverse is a possible physical consequence of some theories of physics beyond the Standard Model. It has not been heavily explored in particle physics, yet it presents further challenges and opportunities. We clearly have more work to do before we can assess how fruitful this idea will be for the theory of the fundamental interactions.

I am pleased to thank my collaborators on these topics, Steve Barr, Dave Seckel, Thibault Damour, Andreas Ross and Koushik Dutta, as well as my long-term collaborator on more sensible topics, Gene Golowich, for discussions that have helped shape my ideas on this topic. My work has been supported in part by the US National Science Foundation and by the John Templeton Foundation.

References

[1] J. Barrow and F. Tipler. The Anthropic Cosmological Principle (Oxford: Clarendon Press, 1986).

[2] C.J. Hogan. Why the universe is just so. Rev. Mod. Phys. 72 (2000), 1149 [astro-ph/9909295].

[3] R. N. Cahn. The eighteen arbitrary parameters of the standard model in your everyday life. Rev. Mod. Phys. 68 (1996), 951.

[4] J. F. Donoghue, E. Golowich and B. R. Holstein. Dynamics of the Standard Model (Cambridge: Cambridge University Press, 1992).

[5] R. Sundrum. Towards an effective particle-string resolution of the cosmological constant problem. JHEP, 9907 (1999), 001 [hep-ph/9708329].

[6] A. Linde. Inflation and quantum cosmology. In Results and Perspectives in Particle Physics, ed. M. Greco (Gif-sur-Yvette, France: Editions Frontieres, 1989), p. 11.

[7] S. Weinberg. Theories of the cosmological constant. In Critical Dialogues in Cosmology, ed. N. Turok (Singapore: World Scientific, 1997).

[8] H. Martel, P. R. Shapiro and S. Weinberg. Likely values of the cosmological constant. Astrophys. J., 492 (1998), 29 [astro-ph/9701099].

[9] T. Banks, M. Dine and L. Motl. On anthropic solutions of the cosmological constant problem. JHEP, 0101 (2001), 031 [hep-th/0007206].

[10] J. D. Bjorken. Standard model parameters and the cosmological constant. Phys. Rev. D 64 (2001), 085008 [hep-ph/0103349].

[11] V. Agrawal, S. M. Barr, J. F. Donoghue and D. Seckel. Anthropic considerations in multiple-domain theories and the scale of electroweak symmetry breaking. Phys. Rev. Lett. 80 (1998), 1822 [hep-ph/9801253].

[12] V. Agrawal, S. M. Barr, J. F. Donoghue and D. Seckel. The anthropic principle and the mass scale of the standard model. Phys. Rev. D 57 (1998), 5480 [hep-ph/9707380].

[13] F. Wilczek. This volume (2007).

[14] J. Garriga and A. Vilenkin. On likely values of the cosmological constant. Phys. Rev. D 61 (2000), 083502 [astro-ph/9908115].

[15] J. Garriga and A. Vilenkin. Solutions to the cosmological constant problems. Phys. Rev. D 64 (2001), 023517 [hep-th/0011262].

[16] J. F. Donoghue. Random values of the cosmological constant. JHEP, 0008 (2000), 022 [hep-ph/0006088].

[17] M. R. Douglas. The statistics of string/M theory vacua. JHEP, 0305 (2003), 046 [hep-th/0303194].

[18] S. Ashok and M. R. Douglas. Counting flux vacua. JHEP, 0401 (2004), 060 [hep-th/0307049].

[19] L. Susskind. This volume (2007) [hep-th/0302219].

[20] T. Banks, M. Dine and E. Gorbatov. Is there a string theory landscape?

[21] R. Kallosh and A. Linde. M-theory, cosmological constant and anthropic principle. Phys. Rev. D 67 (2003), 023510 [hep-th/0208157].

[22] R. Bousso and J. Polchinski. Quantization of four-form fluxes and dynamical neutralization of the cosmological constant. JHEP, 0006 (2000), 006 [hep-th/0004134].

[23] J. D. Brown and C. Teitelboim. Neutralization of the cosmological constant by membrane creation. Nucl. Phys. B 297 (1988), 787.

[24] J. D. Brown and C. Teitelboim. Dynamical neutralization of the cosmological constant. Phys. Lett. B 195 (1987), 177.

[25] J. F. Donoghue. Dynamics of M-theory vacua. Phys. Rev. D 69 (2004), 106012; erratum 129901 [hep-th/0310203].

[26] A. D. Linde. Eternally existing self-reproducing chaotic inflationary universe. Phys. Lett. B 175 (1986), 395.

[27] A. D. Linde. Eternal chaotic inflation. Mod. Phys. Lett. A 1 (1986), 81.

[28] A. H. Guth. Inflation and eternal inflation. Phys. Rep. 333 (2000), 555 [astro-ph/0002156].

[29] J. F. Donoghue. The weight for random quark masses. Phys. Rev. D 57 (1998), 5499 [hep-ph/9712333].

[30] J. F. Donoghue, K. Dutta and A. Ross. Quark and lepton masses in the landscape. Phys. Rev. D 73 (2006), 113002.

[31] D. Cremades, L. E. Ibanez and F. Marchesano. Towards a theory of quark masses, mixings and CP-violation (2002) [hep-ph/0212064].

[32] D. Cremades, L. E. Ibanez and F. Marchesano. Yukawa couplings in intersecting D-brane models. JHEP, 0307 (2003), 038 [hep-th/0302105].
