Article

Naturalness and Emergence

Authors:
David Wallace

Abstract

I develop an account of naturalness (that is, approximately: lack of extreme fine-tuning) in physics which demonstrates that naturalness assumptions are not restricted to narrow cases in high-energy physics but are a ubiquitous part of how inter-level relations are derived in physics. After exploring how and to what extent we might justify such assumptions on methodological grounds or through appeal to speculative future physics, I consider the apparent failure of naturalness in cosmology and in the Standard Model. I argue that any such naturalness failure threatens to undermine the entire structure of our understanding of inter-theoretic reduction, and so risks a much larger crisis in physics than is sometimes suggested; I briefly review some currently-popular strategies that might avoid that crisis.


... Recently, however, the effective field theory (EFT) framework in quantum field theory (QFT) has provided resources for articulating a wide range of intertheoretic relations in physics, and a conceptual template for how to think about intertheoretic relations in science more generally. The renormalization group provides a mechanism for relating distinct EFTs, and provides a compelling formalism for weak emergence that is compatible with a precise, quantitative account of reduction (Wallace 2019; Knox and Wallace 2023). Further, one can use the renormalization group scaling behavior, a central ingredient in the EFT framework, to make estimates about where our current best theories will break down. ...
... For these reasons, there has been growing philosophical interest in the EFT perspective on QFTs. Philosophers have recently argued that the EFT perspective forces us to reconsider our traditional understanding of interpreting physical theories, including consequences for scientific realism, theory semantics, and the nature of intertheoretic relations (Dougherty 2023; Franklin 2020; Koberinski and Fraser 2023; Miller 2021; Rivat and Grinbaum 2020; Rivat 2021; Wallace 2019; Williams 2019). These are often far-reaching claims made about understanding scientific theories as a whole, though the focus is usually on particle physics and the QFTs used there. ...
... Giudice (2008), the head of the theoretical physics division at CERN and a former proponent of naturalness, has since argued that particle physics has entered the post-naturalness era, and that this is something of a Kuhnian crisis period for the discipline. Wallace (2019) has similarly argued that giving up on naturalness principles generally would lead to problems for thinking about how physical theories relate to each other even outside the scope of particle physics. Indeed, naturalness principles might seem indispensable to the success of reasoning in physics. ...
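The breakdown estimates mentioned in the first excerpt above come from renormalization group scaling. As a purely textbook-style illustration (the one-loop form, the coefficient b, and the coupling g are generic choices of mine, not taken from the cited papers), a coupling with beta function β(g) = b g³/16π² runs according to

\[
\frac{1}{g^{2}(\Lambda)} \;=\; \frac{1}{g^{2}(\mu)} \;-\; \frac{b}{8\pi^{2}}\,\ln\frac{\Lambda}{\mu},
\]

so that, for b > 0, the EFT signals its own breakdown (a Landau pole) at roughly \(\Lambda \sim \mu\, e^{\,8\pi^{2}/(b\,g^{2}(\mu))}\). Estimates of this kind are what license claims about where a given EFT must be replaced by a successor.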
Article
Full-text available
Effective field theory (EFT) is a computationally powerful theoretical framework, finding application in many areas of physics. The framework, applied to the Standard Model of particle physics, is even more empirically successful than our theoretical understanding would lead us to expect. I argue that this is a problem for our understanding of how the Standard Model relates to some successor theory. The problem manifests as two theoretical anomalies involving relevant parameters: the cosmological constant and the Higgs mass. The persistent failure to fix these anomalies from within suggests that the way forward is to go beyond the EFT framework.
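One way to make vivid why such relevant-parameter anomalies look surprising is a toy Monte Carlo (entirely illustrative: the function nearly_cancel, the tolerance, the distributions, and the trial count are arbitrary choices of mine, not anything from the paper). Under any simple, order-one probability distribution for two independent contributions, a cancellation to one part in a million happens only about once per million draws:

    import random

    # Toy illustration: how often do two independently drawn, order-one
    # contributions cancel against each other to one part in a million?
    def nearly_cancel(a, b, tol=1e-6):
        # "Fine-tuned" here means the sum is tiny compared to either term.
        return abs(a + b) < tol * max(abs(a), abs(b))

    draws = {
        "uniform(-1, 1)": lambda: random.uniform(-1.0, 1.0),
        "gauss(0, 1)": lambda: random.gauss(0.0, 1.0),
    }

    trials = 1_000_000
    for name, draw in draws.items():
        hits = sum(nearly_cancel(draw(), draw()) for _ in range(trials))
        # Expect on the order of tol * trials, i.e. roughly one hit per run.
        print(f"{name}: {hits} near-cancellations in {trials:,} trials")

The observed cosmological constant and Higgs mass behave, from the EFT point of view, like sums that cancel to far better than one part in a million, which is why they are treated as anomalies rather than coincidences.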
... In the last 50 years a principle called 'Naturalness' has become an important criterion guiding high energy physics research (Rosaler & Harlander, 2019; Williams, 2018). Many different formulations of the Naturalness principle exist, but here I will follow Wallace in defining Naturalness as the requirement that the fundamental constants of nature should be selected from a probability distribution which is Natural, in the sense that it can be specified by some 'not-ridiculously complicated function, relative to the uniform distribution' (Wallace, 2019). ...
... That is, typically when we effect a reduction from macrodynamics to microdynamics, implicit in that reduction is the assumption that the initial conditions and fundamental constants are drawn from Natural probability distributions. This assumption is needed because different Natural distributions over the underlying parameters will lead to different values of the macroscopic parameters but the same qualitative form for the macroscopic dynamics, so 'all of the information about the world's dynamics is encoded in the lowest-level dynamics, extractable from those dynamics through the assumption of Naturalness' (Wallace, 2019), and this makes formal reductions of the macrodynamics to the microdynamics comparatively straightforward to achieve. By contrast, an un-Natural distribution over the initial conditions and/or fundamental constants can potentially lead to a completely different qualitative form for the macroscopic dynamics, so with un-Natural distributions we can't derive the macrodynamics just from the microdynamics plus a generic assumption like Naturalness; the dynamics will be partly encoded in highly specific details of the initial state or the values of the parameters. ...
... But in fact, the values of some of the fundamental constants are known at least approximately, and according to current knowledge the values of the constants are not Natural. There are at least two cases in which constants appear to be very 'fine-tuned' (Wallace, 2019; Williams, 2018), which is to say that contributions to their values cancel out in a precise way, such that a term which would normally have been expected to grow rapidly as the renormalization group flow moves towards large distances instead remains very small. In one case, the value of the cosmological constant at the cutoff scale (Λ_0) and another constant depending on vacuum fluctuations (v) appear to almost cancel out, such that the cosmological constant at macroscopic scales (Λ_M) is much smaller than it would otherwise have been expected to be; in the other case, the value of the bare Higgs mass (m_H) and the quantum corrections coming from all the other Standard Model particles (c) appear to almost cancel out to yield the observed value of the Higgs mass at the scales we are able to access, which again is much smaller than we would have expected based on calculations of the quantum corrections alone. ...
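Schematically, and using the excerpt's own notation (this is a presentational gloss of the two cases just described, not a formula taken from Wallace 2019; the Higgs case is written for the mass-squared, as is conventional):

\[
\Lambda_{M} \;=\; \Lambda_{0} + v, \qquad |\Lambda_{M}| \;\ll\; |\Lambda_{0}|,\,|v|;
\qquad\qquad
m_{H,\mathrm{obs}}^{2} \;=\; m_{H}^{2} + c, \qquad m_{H,\mathrm{obs}}^{2} \;\ll\; |m_{H}^{2}|,\,|c|.
\]

The naturalness worry is that, under any Natural probability distribution over the terms on the right-hand sides, such delicate cancellations would be enormously improbable.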
Article
Full-text available
I suggest that the current situation in quantum field theory (QFT) provides some reason to question the universal validity of ontological reductionism. I argue that the renormalization group flow is reversible except at fixed points, which makes the relation between large and small distance scales quite symmetric in QFT, opening up at least the technical possibility of a non-reductionist approach to QFT. I suggest that some conceptual problems encountered within QFT may potentially be mitigated by moving to an alternative picture in which it is no longer the case that the large supervenes on the small. Finally, I explore some specific models in which a form of non-reductionism might be implemented, and consider the prospects for future development of these models.
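The reversibility claim can be given an elementary gloss (mine, not a quotation from the paper): if one tracks a finite set of couplings g_i obeying an autonomous flow

\[
\frac{dg_{i}}{dt} \;=\; \beta_{i}(g), \qquad t = \ln\mu,
\]

then away from fixed points, where β(g) ≠ 0, a point on a trajectory determines that trajectory in both directions of t, so the flow between large and small distance scales can in principle be run either way. The information loss usually associated with coarse-graining enters when one works with the full Wilsonian transformation on the space of all possible actions rather than with such a truncated flow.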
... According to 't Hooft, the concept is related to the concept of symmetry: a "physical parameter or set of parameters α i (μ) is allowed to be very small only if the replacement α i (μ) = 0 would increase the symmetry of the system" (1980, 136). For detailed philosophical explanations of these and other concepts of naturalness and the relation between them see Williams (2015, 2019), Wallace (2019), Rosaler (2022), and Fischer (2024a). These discussions are prima facie independent of metaphysical concepts of naturalness or natural properties (Lewis, 1983; Dorr, 2024). ...
... Similar concerns arise in the context of the naturalness principle. Evidential support for the naturalness principle comes from the observation that the principle is realized in many instances, and that naturalness did help, or at least could have helped, to predict new phenomena in other places (Wallace, 2019; Bain, 2019). Underlying this is an inductive argument to the effect that one should also expect naturalness in places where it is not known to hold. ...
Article
Full-text available
Guiding principles are central to theory development in physics, especially when there is only limited empirical input available. Here I propose an approach to such principles looking at their heuristic role. I suggest a distinction between two modes of employing scientific principles. Principles of nature make descriptive claims about objects of inquiry, and principles of epistemic action give directives for further research. If a principle is employed as a guiding principle, then its use integrates both modes of employment: guiding principles imply descriptive claims, and they provide directives for further research. By discussing the correspondence principle and the naturalness principle as examples, I explore the consequences for understanding and evaluating current guiding principles in physics. Like principles of nature, guiding principles are evaluated regarding their descriptive implications about the research object. Like principles of epistemic action, guiding principles are evaluated regarding their ability to respond to context-specific needs of the epistemic agent.
... This has already happened in the recent past, albeit sometimes with drastic consequences for physics and metaphysics. Wallace [40], for instance, warns that the rejection of the principle of naturalness implies a rejection of reductionism as an overarching principle of foundational research. In this article, however, an opposite perspective (a positive consequence of rejecting the principle of naturalness) will be argued for, shedding an encouraging light on the naturalness debate about the Higgs mass (condensed in H2). ...
Article
Full-text available
We provide novel, metatheoretical arguments strengthening the position that the naturalness problem of the light Higgs mass is a pseudo-problem: Under one assumption, no physics beyond the standard model of particle physics is needed to explain the small value of the Higgs mass. By evaluating previous successes of the guiding principle of technical naturalness, we restrict its applicability to non-fundamental phenomena in the realm of provisional theories within limited energy scales. In view of further breaches of autonomy of scales in apparently fundamental phenomena outside particle physics, the hierarchy problem of the Higgs mass is instead reinterpreted as an indication of the ontologically fundamental status of the Higgs boson. Applying the concept of robustness of theoretical elements under theory changes by Worrall and Williams justifies this seemingly contradictory attribution within the effective theories of the standard model of particle physics. Moreover, we argue that the ongoing naturalness debate about the Higgs mass is partly based on the adherence to the methodology of effective theories (often claimed to be universally applicable), for which there is no justification when dealing with presumably fundamental phenomena such as the Higgs mechanism, even if it is embedded into an effective theory.
... Chapter 3 will condense the insights from the previous chapter into two theses. First, the four criteria ... Contributions to the naturalness debate include (Giudice 2008, Wells 2013, Dine 2015, Hossenfelder 2017, Williams 2015, Wallace 2019, Bain 2019, Franklin 2020, Koren 2020, Fischer 2023, Branahl 2024). ...
Preprint
In recent years, criticism of the methodology of particle physics beyond the Standard Model has increased, diagnosing too much reliance on aesthetic criteria for theory development and evaluation. Faced with several decades in which experimental confirmation of these theories has been lacking, we subject four aesthetic criteria regularly mentioned in theory evaluation (simplicity, symmetry, elegance, and inevitability) to critical examination. We find that these criteria, all of which can be reduced to a desire for simplicity, have repeatedly misled modern particle physics. This is largely due to the lack of metatheoretical permanence of a uniform conception of simplicity. The reductionist claim of particle physics (the search for simple fundamental principles in a complex world) will be worked out as the reason why this discipline is particularly susceptible to the aesthetic appeal of simplicity. Thus, compared to disciplines dealing with complex phenomena, aesthetic criteria are much more frequently applied, exposing particle physics to the risk of missteps and dead ends.
... Others are the cosmological constant problem [102] and the strong CP problem discussed above. For a philosophical perspective on the failure of naturalness, see [147]. ...
Article
Full-text available
The discovery of the Higgs boson in 2012 at CERN completed the experimental confirmation of the Standard Model particle spectrum. Current theoretical insights and experimental data are inconclusive concerning the expectation of future discoveries. While new physics may still be within reach of the LHC or one of its successor experiments, it is also possible that the mass of particles beyond those of the Standard Model is far beyond the energy reach of any conceivable particle collider. We thus have to face the possibility that the age of “on-shell discoveries” of new particles may belong to the past and that we may soon witness a change in the scientists' perception of discoveries in fundamental physics. This article discusses the relevance of this questioning and addresses some of its potential far-reaching implications through the development, first, of a historical perspective on the concept of particle. This view is apt to reveal important specificities of the development of particle physics. In particular, it underlines the close relationship between the evolution of observational methods and the understanding of the very idea of particle. Combining this with an analysis of the current situation of high-energy physics leads us to the suggestion that the particle era in science must undergo an important conceptual reconfiguration.
... This is a notion of rather limited scope, since the precise definitions of decoupling here apply to local, Lagrangian field theories, whose high-energy formulation must be renormalizable [78], although the guiding principle of decoupling is meant to generalize this to all effective field theories. Ref. [79] makes a strong case that this form of naturalness is foundational to the way we currently work in physics and to how we understand theoretical relations such as emergence within physics. From Wallace's perspective, at least, it would appear that naturalness plays a role similar to the Copernican principle as a sort of regulative principle for physics. ...
Article
Full-text available
The (re)introduction of Λ into cosmology has spurred debates that touch on central questions in philosophy of science, as well as the foundations of general relativity and particle physics. We provide a systematic assessment of the often implicit philosophical assumptions guiding the methodology of precision cosmology in relation to dark energy. We start by briefly introducing a recent account of scientific progress in terms of risky and constrained lines of inquiry. This allows us to contrast aspects of Λ that make it relevantly different from other theoretical entities in science, such as its remoteness from direct observation or manipulability. We lay out a classification for possible ways to explain apparent accelerated expansion but conclude that these conceptually clear distinctions may blur heavily in practice. Finally, we consider the important role played in cosmology by critical tests of background assumptions, approximation techniques, and core principles, arguing that the weak anthropic principle fits into this category. We argue that some core typicality assumptions—such as the Copernican principle and the cosmological principle—are necessary though not provable, while others—such as the strong anthropic principle and appeals to naturalness or probability in the multiverse—are not similarly justifiable.
... Among philosophers of science there is now a vital theoretical debate about the credibility of naturalness as a guiding principle in particle physics. On the one hand, there are authors who argue that assumptions of naturalness are deeply entrenched in physics (Williams 2015; Wallace 2019). For these contributors the absence of BSM physics poses a deep challenge to established forms of reasoning in particle physics and beyond. ...
Article
Full-text available
It has been suggested that particle physics has reached the “dawn of the post-naturalness era.” I explain the current shift in particle physicists’ attitude towards naturalness. I argue that the naturalness principle was perceived to be supported by theories it has inspired. The potential coherence between major beyond the Standard Model (BSM) proposals and the naturalness principle led to an increasing degree of credibility of the principle among particle physicists. The absence of new physics at the Large Hadron Collider (LHC) has undermined the potential coherence and has led to the principle’s loss of significance.
... The dynamic production picture on which Laplacean determinism is predicated encourages us to take a particular attitude to initial conditions, regarding them as freely chosen inputs to an otherwise fully determined system. Thus in particular the Laplacean picture makes a sharp distinction between the freely chosen initial conditions and the values of any parameters which enter into the theory (e.g. the masses of fundamental particles, the gravitational constant and so on), which are normally taken to be nomic (Wallace 2019). But in the constraint framework the case for this different status is significantly weaker. ...
Article
Full-text available
Physicists are increasingly beginning to take seriously the possibility of laws outside the traditional time-evolution paradigm; yet many popular definitions of determinism are still predicated on a time-evolution picture, making them manifestly unsuited to the diverse range of research programmes in modern physics. In this article, we use a constraint-based framework to set out a generalization of determinism which does not presuppose temporal evolution, distinguishing between strong, weak and delocalised holistic determinism. We discuss some interesting consequences of these generalized notions of determinism, and we show that this approach sheds new light on the long-standing debate surrounding the nature of objective chance.
... Arguing against this position, recent work has taken the EFT framework and the RG to give rise to a new prospective realism and a foundation for emergence as separate from reduction (Crowther 2015; J. D. Fraser 2018; Williams 2019; Wallace 2019). Meanwhile other work has focused on the assumptions needed to set up the EFT framework (Williams 2015; Rivat 2020; Koberinski and Smeenk 2022). ...
Article
The effective field theory (EFT) perspective on particle physics has yielded insight into the Standard Model. This paper investigates the epistemic consequences of the use of different variants of renormalization group (RG) methods as part of the EFT perspective on particle physics. RG methods are a family of formal techniques. While the semi-group variant of the RG has played a prominent role in condensed matter physics, the full-group variant has become the most widely applicable formalism in particle physics. We survey different construction techniques for EFTs in particle physics and analyze the role that semi-group and full-group variants of the RG play in each. We argue that the full-group variant is best suited to answering structural questions about relationships among EFTs at different scales, as well as explanatory questions, such as why the Standard Model has been empirically successful at low energy scales and why renormalizability was a successful criterion for constructing the Standard Model. We also present an account of EFTs in particle physics that is based on the full-RG. Our conclusion about the advantages of the full-RG is restricted to the particle physics case. We argue that a domain-specific approach to interpreting EFTs and RG methods is needed. Formal variations and flexibility in physical interpretation enable RG methods to support different explanatory strategies in condensed matter and particle physics. In particular, it is consistent to maintain that coarse-graining is an essential component of explanations in condensed matter physics, but not in particle physics.
... For scepticism about the significance of the fine-tuning see (Bianchi and Rovelli 2010; Hossenfelder 2018); for anthropic considerations see (Polchinski 2006); see also (Wallace 2019b) and references therein. ... of the 'Minkowskian approximation' which I appealed to above when applied to even gently curved spacetimes. I find both objections unpersuasive. ...
Article
I provide a conceptually-focused presentation of ‘low-energy quantum gravity’ (LEQG), the effective quantum field theory obtained from general relativity and which provides a well-defined theory of quantum gravity at energies well below the Planck scale. I emphasize the extent to which some such theory is required by the abundant observational evidence in astrophysics and cosmology for situations which require a simultaneous treatment of quantum-mechanical and gravitational effects, contra the often-heard claim that all observed phenomena can be accounted for either by classical gravity or by non-gravitational quantum mechanics, and I give a detailed account of the way in which a treatment of the theory as fluctuations on a classical background emerges as an approximation to the underlying theory rather than being put in by hand. I discuss the search for a Planck-scale quantum-gravity theory from the perspective of LEQG and give an introduction to the Cosmological Constant problem as it arises within LEQG.
... However, it is worth pointing out one connection to the philosophy of statistical mechanics. Here the deviant microstates can be ruled out by a background condition, which stipulates that the initial microstate is 'Natural', or simple in a particular way (Wallace, 2019). ...
Article
In what sense are the special sciences autonomous of fundamental physics? Autonomy is an enduring theme in discussions of the relationship between the special sciences and fundamental physics or, more generally, between higher and lower-level facts. Discussion of ‘autonomy’ often fails to recognise that autonomy admits of degrees; consequently, autonomy is either taken to require full independence, or risk relegation to mere apparent autonomy. In addition, the definition of autonomy used by Fodor, the most famous proponent of the autonomy of the special sciences, has been robustly criticised by Loewer. In this paper I develop a new account of autonomy following Woodward (2018) which I dub ‘generalised autonomy’ since it unifies dynamical, causal and nomic autonomy. Autonomy, on this account, can be partial: some lower-level details matter while others do not. To summarise: whilst the detailed lower level is unconditionally relevant, conditionalising on the higher-level facts renders some lower-level details irrelevant. The macrodependencies that the higher-level facts enter into — be they dynamical, causal or nomic — screen off the underlying microdetails. This account helps resolve an explanatory puzzle: if the lower-level facts in some way underpin the higher-level facts, why don’t the lower-level details matter more for the day-to-day practice of the special sciences? The answer will be: the facts uncovered by the special sciences are autonomous in my sense, and so practitioners of these special sciences need not study more fundamental sciences, since these underlying facts are genuinely (albeit conditionally) irrelevant.
... For further discussion of naturalness understood as inter-scale autonomy, see [9, 13, 31, 34]. ...
Article
Full-text available
The earliest formulation of the Higgs naturalness argument has been criticized on the grounds that it relies on a particular cutoff-based regularization scheme. One response to this criticism has been to circumvent the worry by reformulating the naturalness argument in terms of a renormalized, regulator-independent parametrization. An alternative response is to deny that regulator dependence poses a problem for the naturalness argument, because nature itself furnishes a particular, physically correct regulator for any effective field theory (EFT) in the form of that EFT’s physical cutoff, together with an associated set of bare parameters that constitute the unique physically preferred “fundamental parameters” of the EFT. Here, I argue that both lines of defense against the initial worry about regulator dependence are flawed. I argue that reformulation of the naturalness argument in terms of renormalized parameters simply trades dependence on a particular regularization scheme for dependence on a particular renormalization scheme, and that one or another form of scheme dependence afflicts all formulations of the Higgs naturalness argument. Concerning the second response, I argue that the grounds for suspending the principle of regularization or renormalization scheme independence in favor of a physically preferred parametrization are thin; the assumption of a physically preferred parametrization, whether in the form of bare “fundamental parameters” or renormalized “physical parameters,” constitutes a theoretical idle wheel in generating the confirmed predictions of established EFTs, which are invariably scheme-independent. I highlight certain features of the alternative understanding of EFTs, and the EFT-based approach to understanding the foundations of QFT, that emerges when one abandons the assumption of a physically preferred parametrization. I explain how this understanding departs from several dogmas concerning the mathematical formulation and physical interpretation of EFTs in high-energy physics.
... Indeed, this approach has the interesting consequence of dissolving the distinction between initial conditions and parameter values. Wallace has observed that the physical content of a theory has three aspects: the qualitative form of its dynamical equations (in which the coefficients are unspecified parameters); the actual, numerical values of the parameters (expressed as dimensionless ratios); and the initial conditions [39]. As Wallace notes, it is common to consider that the parameter values are 'lawlike' while the initial conditions are merely 'contingent' -that is, parameters and initial conditions are usually thought to have importantly different modal status. ...
Preprint
Full-text available
Physicists are increasingly beginning to take seriously the possibility of laws outside the traditional time-evolution paradigm; yet our understanding of determinism is still predicated on a forwards time-evolution picture, making it manifestly unsuited to the diverse range of research programmes in modern physics. In this article, we use a constraint-based framework to set out a generalization of determinism which does not presuppose temporal directedness, distinguishing between strong, weak and hole-free global determinism. We discuss some interesting consequences of these generalized notions of determinism, and we show that this approach sheds new light on the long-standing debate surrounding the nature of objective chance.
... After an extensive discussion of the properties of an invariant measure, including demonstrating that it has to be a function of the mechanical energy, however, Gibbs did not attempt to derive the canonical distribution; rather he simply stated that an exponential form "seems to represent the most simple case conceivable". ... Even though the original problem is only about the behavior of X(t) when Z(t) ∈ I, if we have more information about Z(t) outside of I, we are able to seek a deeper understanding of the original problem. Not only for the canonical ensemble: this idea of treating a given constraint (parameter) as a variable with a distribution has also been widely used in many other fields, for example in comparing the quenched and annealed invariance principles for the random conductance model [3], and in studying initial-condition naturalness in the case of statistical mechanics [41]. ...
Article
Full-text available
The probability distribution of a function of a subsystem conditioned on the value of the function of the whole, in the limit when the ratio of their values goes to zero, has a limit law: It equals the unconditioned marginal probability distribution weighted by an exponential factor whose exponent is uniquely determined by the condition. We apply this theorem to explain the canonical equilibrium ensemble of a system in contact with a heat reservoir. Since the theorem only requires analysis at the level of the function of the subsystem and reservoir, it is applicable even without the knowledge of the composition of the reservoir itself, which extends the applicability of the canonical ensemble. Furthermore, we generalize our theorem to a model with strong interaction that contributes an additional term to the exponent, which is beyond the typical case of approximately additive functions. This result is new in both physics and mathematics, as a theory for the Gibbs conditioning principle for strongly correlated systems. A corollary provides a precise formulation of what a temperature bath is in probabilistic terms.
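In symbols, with notation of my own choosing (X for the subsystem quantity, Y for the rest, E for the value for the whole, β for the exponent fixed by the condition), the limit law described in the abstract takes the schematic form

\[
P(X \in dx \mid X + Y = E) \;\longrightarrow\; \frac{e^{-\beta x}\, P(X \in dx)}{\int e^{-\beta x'}\, P(X \in dx')}
\]

in the limit where the ratio of the subsystem's value to the whole's goes to zero, with β uniquely determined by the conditioning value E. This is the canonical (Gibbs) form, with β playing the role of an inverse temperature.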
... In other words, even though the original problem is only about the behavior of X(t) when Z(t) ∈ I, if we have more information about Z(t) outside of I, we are able to seek a deeper understanding of the original problem. Not only for the canonical ensemble: this idea of treating a given constraint (parameter) as a variable with a distribution has also been widely used in many other fields, for example in comparing the quenched and annealed invariance principles for the random conductance model [3], and in studying initial-condition naturalness in the case of statistical mechanics [30]. ...
Preprint
The probability distribution of an additive function of a subsystem conditioned on the value of the function of the whole, in the limit where the ratio of their values goes to zero, has a limit law: it equals the unconditioned probability distribution weighted by an exponential factor whose exponent is uniquely determined by the condition. We apply this theorem to explain the canonical equilibrium ensemble of a system in contact with a heat reservoir. A corollary provides a precise formulation of what a temperature bath is in probabilistic terms.
Article
Full-text available
The mathematical centerpiece of many physical theories is a Lagrangian. So let’s imagine that there’s some Lagrangian we trust. Should that induce us to endorse an ontology? If so, what ontology, and how is it related to our trustworthy Lagrangian? I’ll examine these questions in the context of quantum field theoretic Lagrangians. When these Lagrangians are understood as “merely effective,” a variety of approximations figure in the physics they frame. So do distinctive grounds for trusting those Lagrangians, grounds recent literature has adduced in support of a novel variety of scientific realism known as Effective Realism. This essay attempts to undermine those grounds, and to do so without presupposing extensive prior knowledge of quantum field theories.
Chapter
Tired starlings. The web of cause and effect. Emergence: the effective self as a local causal domain. Emergence and causality: Where has all the physics gone? The self and others. The captive scapegoat and the birth of tragedy. The second revolution.
Preprint
Contra Dardashti, Thébault, and Winsberg (2017), this paper defends an analysis of arguments from analogue simulations as instances of a familiar kind of inductive inference in science: arguments from material analogy (Hesse, Models and Analogies in Science, Univ Notre Dame Press, 1963). When understood in this way, the capacity of analogue simulations to confirm hypotheses about black holes can be deduced from a general account - fully consistent with a Bayesian standpoint - of how ordinary arguments from material analogy confirm. The proposed analysis makes recommendations about what analogue experiments are worth pursuing that are more credible than Dardashti, Hartmann, Thébault, and Winsberg's (2019). It also offers a more solid basis for addressing the concerns by Crowther, Linneman, and Wütrich (2019), according to which analogue simulations are incapable of sustaining hypotheses concerning black hole radiation.
Article
Full-text available
Many philosophers of science are ontologically committed to a lush rainforest of special science entities (Ross 2000), but are often reticent about the criteria that determine which entities count as real. On the other hand, the metaphysics literature is much more forthcoming about such criteria, but often links ontological commitment to irreducibility. We argue that the irreducibility criteria are in tension with scientific realism: for example, they would exclude viruses, which are plausibly theoretically reducible and yet play a sufficiently important role in scientific accounts of the world that they should be included in our ontology. In this paper, we show how the inhabitants of the rainforest can be inoculated against the eliminative threat of reduction: by demonstrating that they are emergent. According to our account, emergence involves a screening off condition as well as novelty. We go on to demonstrate that this account of emergence, which is compatible with theoretical reducibility, satisfies common intuitions concerning what should and shouldn't count as real: viruses are emergent, as are trouts and turkeys, but philosophically gerrymandered objects like trout-turkeys do not qualify.
Article
Full-text available
My aim in this paper is twofold: (i) to distinguish two notions of naturalness employed in Beyond the Standard Model (BSM) physics and (ii) to argue that recognizing this distinction has methodological consequences. One notion of naturalness is an "autonomy of scales" requirement: it prohibits sensitive dependence of an effective field theory's low-energy observables on precise specification of the theory's description of cutoff-scale physics. I will argue that considerations from the general structure of effective field theory provide justification for the role this notion of naturalness has played in BSM model construction. A second, distinct notion construes naturalness as a statistical principle requiring that the values of the parameters in an effective field theory be "likely" given some appropriately chosen measure on some appropriately circumscribed space of models. I argue that these two notions are historically and conceptually related but are motivated by distinct theoretical considerations and admit of distinct kinds of solution.
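One standard way of quantifying the "sensitive dependence" at issue in the first notion, mentioned here purely as an illustration (the paper itself may formalize the requirement differently), is the Barbieri–Giudice sensitivity measure for a low-energy observable O and a high-scale parameter p_i:

\[
\Delta_{i} \;=\; \left|\frac{\partial \ln O}{\partial \ln p_{i}}\right| \;=\; \left|\frac{p_{i}}{O}\,\frac{\partial O}{\partial p_{i}}\right|,
\]

with large Δ_i (conventionally, Δ_i ≳ 10) read as a failure of naturalness with respect to that parameter. The second, statistical notion instead asks whether the observed parameter values are likely under a measure on a space of models, which is a distinct question.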
Article
Full-text available
We critically analyze the rationale of arguments from finetuning and naturalness in particle physics and cosmology, notably the small values of the mass of the Higgs boson and the cosmological constant. We identify several new reasons why these arguments are not scientifically relevant. Besides laying out why the necessity to define a probability distribution renders arguments from naturalness internally contradictory, it is also explained why it is conceptually questionable to single out assumptions about dimensionless parameters from among a host of other assumptions. Some other numerological coincidences and their problems are also discussed.
Article
Full-text available
Both bottom-up and top-down causation occur in the hierarchy of structure and causation. A key feature is multiple realizability of higher level functions, and consequent existence of equivalence classes of lower level variables that correspond to the same higher level state. Five essentially different classes of top-down influence can be identified, and their existence demonstrated by many real-world examples. They are: algorithmic top-down causation; top-down causation via non-adaptive information control; top-down causation via adaptive selection; top-down causation via adaptive information control; and intelligent top-down causation (the effect of the human mind on the physical world). Through the mind, abstract entities such as mathematical structures have causal power. The causal slack enabling top-down action to take place lies in the structuring of the system so as to attain higher level functions; in the way the nature of lower level elements is changed by context; and in micro-indeterminism combined with adaptive selection. Understanding top-down causation can have important effects on society. Two cases will be mentioned: medical/healthcare issues, and education, in particular teaching reading and writing. In both cases, an ongoing battle between bottom-up and top-down approaches has important consequences for society.
Article
Full-text available
The expansion of the observed universe appears to be accelerating. A simple explanation of this phenomenon is provided by the non-vanishing of the cosmological constant in the Einstein equations. Arguments are commonly presented to the effect that this simple explanation is not viable or not sufficient, and therefore we are facing the "great mystery" of the "nature of a dark energy". We argue that these arguments are unconvincing, or ill-founded.
Article
Full-text available
An important contemporary version of Boltzmannian statistical mechanics explains the approach to equilibrium in terms of typicality. The problem with this approach is that it comes in different versions, which are, however, not recognized as such and not clearly distinguished. This article identifies three different versions of typicality‐based explanations of thermodynamic‐like behavior and evaluates their respective successes. The conclusion is that the first two are unsuccessful because they fail to take the system's dynamics into account. The third, however, is promising. I give a precise formulation of the proposal and present an argument in support of its central contention.
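As a rough gloss on what such typicality claims assert (my formulation, not the article's): letting Γ_M be the phase-space region corresponding to a macrostate M and μ_M the normalized Liouville measure restricted to it, the claim is that

\[
\mu_{M}\bigl(\{\,x \in \Gamma_{M} : x \text{ does not exhibit thermodynamic-like behavior}\,\}\bigr) \;\approx\; 0,
\]

and the article's point is that different versions of the approach differ over how (and whether) the system's dynamics enters into the justification of this claim.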
Article
Full-text available
... great mystery to me that it was so generally ignored. (Bell 1987, 191) According to orthodox quantum theory, the complete description of a system of particles is provided by its wave function. This statement is somewhat problematical: if "particles" is intended with its usual meaning (point-like entities whose most important feature is their position in space), the statement is clearly false, since the complete description would then have to include these positions; otherwise, the statement is, to be charitable, vague. Bohmian mechanics is the theory that emerges when we indeed insist that "particles" means particles. According to Bohmian mechanics, the complete description or state of an N-particle system is provided by its wave function ψ(q, t), where q = (q_1, ..., q_N) ∈ ℝ^{3N}, and its configuration ...
Article
Full-text available
In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth century innovations (such as the identification of the state of a physical system with a probability distribution ρ on its phase space, of its thermodynamic entropy with the Gibbs entropy of ρ, and the invocation of the notions of ergodicity and mixing for the justification of the foundations of statistical mechanics) are thoroughly misguided.
Article
Full-text available
In order to arrive at Bohmian mechanics from standard nonrelativistic quantum mechanics one need do almost nothing! One need only complete the usual quantum description in what is really the most obvious way: by simply including the positions of the particles of a quantum system as part of the state description of that system, allowing these positions to evolve in the most natural way. The entire quantum formalism, including the uncertainty principle and quantum randomness, emerges from an analysis of this evolution. This can be expressed succinctly (though in fact not succinctly enough) by declaring that the essential innovation of Bohmian mechanics is the insight that particles move!
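For concreteness, the "most natural way" for the positions to evolve is given by the standard guidance equation (the equation is textbook Bohmian mechanics; this particular notation is mine rather than a quotation from the abstract):

\[
\frac{dQ_{k}}{dt} \;=\; \frac{\hbar}{m_{k}}\,\mathrm{Im}\!\left(\frac{\nabla_{k}\psi}{\psi}\right)\!(Q_{1},\dots,Q_{N}),
\]

with ψ evolving by the usual Schrödinger equation; the paper's claim is that the quantum formalism, including the uncertainty principle and quantum randomness, emerges from an analysis of this joint evolution.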
Article
Through extended consideration of two wide classes of case studies - dilute gases and linear systems - I explore the ways in which assumptions of probability and irreversibility occur in contemporary statistical mechanics, where the latter is understood as primarily concerned with the derivation of quantitative higher-level equations of motion, and only derivatively with underpinning the equilibrium concept in thermodynamics. I argue that at least in this wide class of examples, (i) irreversibility is introduced through a reasonably well-defined initial-state condition which does not precisely map onto those in the extant philosophical literature; (ii) probability is explicitly required both in the foundations and in the predictions of the theory. I then consider the same examples, as well as the more general context, in the light of quantum mechanics, and demonstrate that while the analysis of irreversibility is largely unaffected by quantum considerations, the notion of statistical-mechanical probability is entirely reduced to quantum-mechanical probability.
Article
In an imaginary conversation with Guido Altarelli, I express my views on the status of particle physics beyond the Standard Model and its future prospects.
Article
I expand on some ideas from my recent review "String theory to the rescue." I discuss my use of Bayesian reasoning. I argue that it can be useful but that it is very far from the central point of the discussion. I then review my own personal history with the multiverse. Finally I respond to some of the criticisms of Ellis and Silk, which initiated this interesting discussion.
Book
Recent developments in cosmology and particle physics, such as the string landscape picture, have led to the remarkable realization that our universe - rather than being unique - could be just one of many universes. The multiverse proposal helps to explain the origin of the universe and some of its observational features. Since the physical constants can be different in other universes, the fine-tunings which appear necessary for the emergence of life may also be explained. Nevertheless, many physicists remain uncomfortable with the multiverse proposal, since it is highly speculative and perhaps untestable. In this volume, a number of active and eminent researchers in the field - mainly cosmologists and particle physicists but also some philosophers - address these issues and describe recent developments. The articles represent the full spectrum of views, providing for the first time an overview of the subject. They are written at different academic levels, engaging lay-readers and researchers alike.
Article
The search for a theory of quantum gravity faces two great challenges: the incredibly small scales of the Planck length and time, and the possibility that the observed constants of nature are in part the result of random processes. A priori, one might have expected these to be insuperable obstacles. However, clues from observed physics, and the discovery of string theory, raise the hope that the unification of quantum mechanics and general relativity is within reach.
Book
Nancy Cartwright argues for a novel conception of the role of fundamental scientific laws in modern natural science. If we attend closely to the manner in which theoretical laws figure in the practice of science, we see that despite their great explanatory power these laws do not describe reality. Instead, fundamental laws describe highly idealized objects in models. Thus, the correct account of explanation in science is not the traditional covering law view, but the ‘simulacrum’ account. On this view, explanation is a matter of constructing a model that may employ, but need not be consistent with, a theoretical framework, in which phenomenological laws that are true of the empirical case in question can be derived. Anti‐realism about theoretical laws does not, however, commit one to anti‐realism about theoretical entities. Belief in theoretical entities can be grounded in well‐tested localized causal claims about concrete physical processes, sometimes now called ‘entity realism’. Such causal claims provide the basis for partial realism and they are ineliminable from the practice of explanation and intervention in nature.
Article
The recent discovery of the Higgs at 125 GeV by the ATLAS and CMS experiments at the LHC has put significant pressure on a principle which has guided much theorizing in high energy physics over the last 40 years, the principle of naturalness. In this paper, I provide an explication of the conceptual foundations and physical significance of the naturalness principle. I argue that the naturalness principle is well-grounded both empirically and in the theoretical structure of effective field theories, and that it was reasonable for physicists to endorse it. Its possible failure to be realized in nature, as suggested by recent LHC data, thus represents an empirical challenge to certain foundational aspects of our understanding of QFT. In particular, I argue that its failure would undermine one class of recent proposals which claim that QFT provides us with a picture of the world as being structured into quasi-autonomous physical domains.
Article
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. Second is that we ought not take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as a representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
Article
This book defends the view that the Everett interpretation of quantum theory, often called the 'many worlds theory', is not some new physical theory or some metaphysical addition to quantum theory, but simply quantum theory itself understood in a straightforwardly literal way. As such - despite its radical implications for the nature of our universe - the Everett interpretation is actually the conservative way to approach quantum theory, requiring revisions neither to our best theories of physics, nor to conventional philosophy of science. The book is in three parts. Part I explains how quantum theory implies the existence of an emergent branching structure in physical reality, and explores the conceptual and technical details of decoherence theory, the theory which allows us to quantify that branching. Part II is concerned with the problem of probability, and makes the case that probability, far from being the key difficulty for the Everett interpretation, actually makes more sense from a many-worlds viewpoint. Part III explores the implications of an Everettian perspective on a variety of topics in physics and philosophy.
Article
The usual interpretation of the quantum theory is self-consistent, but it involves an assumption that cannot be tested experimentally, viz., that the most complete possible specification of an individual system is in terms of a wave function that determines only probable results of actual measurement processes. The only way of investigating the truth of this assumption is by trying to find some other interpretation of the quantum theory in terms of at present "hidden" variables, which in principle determine the precise behavior of an individual system, but which are in practice averaged over in measurements of the types that can now be carried out. In this paper and in a subsequent paper, an interpretation of the quantum theory in terms of just such "hidden" variables is suggested. It is shown that as long as the mathematical theory retains its present general form, this suggested interpretation leads to precisely the same results for all physical processes as does the usual interpretation. Nevertheless, the suggested interpretation provides a broader conceptual framework than the usual interpretation, because it makes possible a precise and continuous description of all processes, even at the quantum level. This broader conceptual framework allows more general mathematical formulations of the theory than those allowed by the usual interpretation. Now, the usual mathematical formulation seems to lead to insoluble difficulties when it is extrapolated into the domain of distances of the order of 10^-13 cm or less. It is therefore entirely possible that the interpretation suggested here may be needed for the resolution of these difficulties. In any case, the mere possibility of such an interpretation proves that it is not necessary for us to give up a precise, rational, and objective description of individual systems at a quantum level of accuracy.
Article
I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a "Past Hypothesis" about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the early Universe's entropy.
Article
Astronomical observations indicate that the cosmological constant is many orders of magnitude smaller than estimated in modern theories of elementary particles. After a brief review of the history of this problem, five different approaches to its solution are described.
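To convey roughly how severe the discrepancy is (the figures below are the standard textbook estimates, supplied here for illustration rather than taken from the article): a naive zero-point-energy calculation cut off at the Planck scale overshoots the observed vacuum energy density by something like 120 orders of magnitude.

```latex
% Order-of-magnitude comparison (standard rough figures, assumed for illustration)
\rho_{\mathrm{vac}}^{\mathrm{theory}} \sim M_{\mathrm{Pl}}^{4} \sim (10^{19}\,\mathrm{GeV})^{4} = 10^{76}\,\mathrm{GeV}^{4},
\qquad
\rho_{\mathrm{vac}}^{\mathrm{obs}} \sim (2\times 10^{-3}\,\mathrm{eV})^{4} \sim 10^{-47}\,\mathrm{GeV}^{4},
\qquad
\frac{\rho_{\mathrm{vac}}^{\mathrm{theory}}}{\rho_{\mathrm{vac}}^{\mathrm{obs}}} \sim 10^{123}.
```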
Article
The quantum state of a spatially closed universe can be described by a wave function which is a functional on the geometries of compact three-manifolds and on the values of the matter fields on these manifolds. The wave function obeys the Wheeler-DeWitt second-order functional differential equation. We put forward a proposal for the wave function of the "ground state" or state of minimum excitation: the ground-state amplitude for a three-geometry is given by a path integral over all compact positive-definite four-geometries which have the three-geometry as a boundary. The requirement that the Hamiltonian be Hermitian then defines the boundary conditions for the Wheeler-DeWitt equation and the spectrum of possible excited states. To illustrate the above, we calculate the ground and excited states in a simple minisuperspace model in which the scale factor is the only gravitational degree of freedom, a conformally invariant scalar field is the only matter degree of freedom and Λ>0. The ground state corresponds to de Sitter space in the classical limit. There are excited states which represent universes which expand from zero volume, reach a maximum size, and then recollapse but which have a finite (though very small) probability of tunneling through a potential barrier to a de Sitter-type state of continual expansion. The path-integral approach allows us to handle situations in which the topology of the three-manifold changes. We estimate the probability that the ground state in our minisuperspace model contains more than one connected component of the spacelike surface.
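Schematically, and with operator-ordering ambiguities and numerical constants suppressed (this compact form is supplied here for orientation and is not quoted from the paper), the minisuperspace Wheeler-DeWitt equation for the scale factor a with cosmological constant Λ reduces to a zero-energy, one-dimensional Schrödinger-type equation, whose potential barrier is what the tunneling amplitudes mentioned above refer to:

```latex
% Minisuperspace Wheeler--DeWitt equation (schematic; ordering and constants suppressed)
\left[-\frac{\partial^{2}}{\partial a^{2}} + U(a)\right]\psi(a) = 0,
\qquad
U(a) \propto a^{2}\!\left(1 - \frac{\Lambda}{3}\,a^{2}\right).
```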
Article
Emergence, largely ignored just thirty years ago, has become one of the liveliest areas of research in both philosophy and science. Fueled by advances in complexity theory, artificial life, physics, psychology, sociology, and biology and by the parallel development of new conceptual tools in philosophy, the idea of emergence offers a way to understand a wide variety of complex phenomena in ways that are intriguingly different from more traditional approaches. This reader collects for the first time in one easily accessible place classic writings on emergence from contemporary philosophy and science. The chapters, by such prominent scholars as John Searle, Steven Weinberg, William Wimsatt, Thomas Schelling, Jaegwon Kim, Robert Laughlin, Daniel Dennett, Herbert Simon, Stephen Wolfram, Jerry Fodor, Philip Anderson, and David Chalmers, cover the major approaches to emergence. Each of the three sections ("Philosophical Perspectives," "Scientific Perspectives," and "Background and Polemics") begins with an introduction putting the chapters into context and posing key questions for further exploration. A bibliography lists more specialized material, and an associated website (http://mitpress.mit.edu/emergence) links to downloadable software and to other sites and publications about emergence. Contributors: P. W. Anderson, Andrew Assad, Nils A. Baas, Mark A. Bedau, Mathieu S. Capcarrère, David Chalmers, James P. Crutchfield, Daniel C. Dennett, J. Doyne Farmer, Jerry Fodor, Carl Hempel, Paul Humphreys, Jaegwon Kim, Robert B. Laughlin, Bernd Mayer, Brian P. McLaughlin, Ernest Nagel, Martin Nillson, Paul Oppenheim, Norman H. Packard, David Pines, Steen Rasmussen, Edmund M. A. Ronald, Thomas Schelling, John Searle, Robert S. Shaw, Herbert Simon, Moshe Sipper, Steven Weinberg, William Wimsatt, and Stephen Wolfram.
Article
An explicit model allowing a unified description of microscopic and macroscopic systems is exhibited. First, a modified quantum dynamics for the description of macroscopic objects is constructed and it is shown that it forbids the occurrence of linear superpositions of states localized in far-away spatial regions and induces an evolution agreeing with classical mechanics. This dynamics also allows a description of the evolution in terms of trajectories. To set up a unified description of all physical phenomena, a modification of the dynamics, with respect to the standard Hamiltonian one, is then postulated also for microscopic systems. It is shown that one can consistently deduce from it the previously considered dynamics for the center of mass of macroscopic systems. Choosing the parameters of the so-obtained model in an appropriate way, one can show that both the standard quantum theory for microscopic objects and the classical behavior for macroscopic objects can be derived in a consistent way. In the case of a macroscopic system one can obtain, by means of appropriate approximations, a description of the evolution in terms of a phase-space density distribution obeying a Fokker-Planck diffusion equation. The model also provides the basis for a conceptually appealing description of quantum measurement.
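The amplification mechanism behind "choosing the parameters in an appropriate way" can be illustrated with a back-of-the-envelope calculation. The sketch below assumes the parameter values commonly quoted for the GRW model (a localization rate of about 10⁻¹⁶ s⁻¹ per constituent and a localization width of about 10⁻⁷ m); these numbers are supplied for illustration, not taken from the abstract.

```python
# Back-of-the-envelope GRW amplification: the effective localization rate for the
# centre of mass of a rigid body grows linearly with the number of constituents,
# so microscopic superpositions survive while macroscopic ones collapse rapidly.
# Parameter values are the commonly quoted GRW choices (assumed for illustration).

LAMBDA_PER_PARTICLE = 1e-16   # spontaneous localization rate per constituent (1/s)
LOCALIZATION_WIDTH = 1e-7     # localization width in metres (for context only)

def effective_rate(n_constituents: float) -> float:
    """Effective centre-of-mass localization rate for a body of n constituents."""
    return LAMBDA_PER_PARTICLE * n_constituents

for label, n in [("single particle", 1),
                 ("dust grain (~1e15 nucleons)", 1e15),
                 ("macroscopic body (~1e23 nucleons)", 1e23)]:
    rate = effective_rate(n)
    print(f"{label}: ~{rate:.1e} localizations/s, "
          f"mean time between localizations ~{1/rate:.1e} s")
```

On these numbers a single particle goes undisturbed for hundreds of millions of years, while a macroscopic superposition is suppressed within a fraction of a microsecond.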
Article
It is generally thought that determinism is incompatible with objective chances for particular events other than 1 and 0. However, there are important scientific theories whose laws are deterministic but which also assign non-trivial probabilities to events. The most important of these is statistical mechanics, whose probabilities are essential to the explanations of thermodynamic phenomena. These probabilities are often construed as ‘ignorance’ probabilities representing our lack of knowledge concerning the microstate. I argue that this construal is incompatible with the role of probability in explanation and laws. This is the ‘paradox of deterministic probabilities’. After surveying the usual list of accounts of objective chance and finding them inadequate, I argue that an account of chance sketched by David Lewis can be modified to solve the paradox of deterministic probabilities and provide an adequate account of the probabilities in deterministic theories like statistical mechanics.
Article
The concept of typicality refers to properties holding for the “vast majority” of cases and is a fundamental idea of the qualitative approach to dynamical problems. We argue that measure-theoretic typicality is the adequate viewpoint from which to understand the role of probability in classical statistical mechanics, particularly the change from micro- to macroscopic levels of description.
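A toy example may help fix the idea of a property holding for the "vast majority" of cases (this illustration is supplied here and is not drawn from the article): among all length-n sequences of fair coin flips, counted with the uniform measure, the fraction whose relative frequency of heads lies within 0.05 of 1/2 tends to one as n grows.

```python
# Toy illustration of typicality: the set of coin-flip sequences whose relative
# frequency of heads lies within eps of 1/2 has measure approaching 1 as n grows.
import math

def typical_fraction(n: int, eps: float = 0.05) -> float:
    """Fraction of the 2**n equiprobable length-n sequences with |heads/n - 1/2| <= eps."""
    lo = math.ceil(n * (0.5 - eps))
    hi = math.floor(n * (0.5 + eps))
    favourable = sum(math.comb(n, k) for k in range(lo, hi + 1))
    return favourable / 2**n

for n in (10, 100, 1000):
    print(n, round(typical_fraction(n), 4))
```

The fraction climbs from roughly 0.25 at n = 10 to above 0.99 at n = 1000, which is the measure-theoretic sense in which "frequency near 1/2" is typical.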
Article
We prove that Weyl invariant theories of gravity possess a remarkable property which, under very general assumptions, explains the stability of flat space-time. We show explicitly how conformal invariance is broken spontaneously by the vacuum expectation value of an unphysical scalar field; this process induces general relativity as an effective long distance limit.
Article
The report presents an exhaustive review of the recent attempt to overcome the difficulties that standard quantum mechanics meets in accounting for the measurement (or macro-objectification) problem, an attempt based on the consideration of nonlinear and stochastic modifications of the Schrödinger equation. The proposed new dynamics is characterized by the feature of not contradicting any known fact about microsystems and of accounting, on the basis of a unique, universal dynamical principle, for wavepacket reduction and for the classical behavior of macroscopic systems. We recall the motivations for the new approach and we briefly review the other proposals that have appeared in the literature to circumvent the above-mentioned difficulties. In this way we make clear the conceptual and historical context characterizing the new approach. After having reviewed the mathematical techniques (stochastic differential calculus) which are essential for the rigorous and precise formulation of the new dynamics, we discuss in great detail its implications and we stress its relevant conceptual achievements. The new proposal also requires working out an appropriate interpretation, a procedure which leads us to reconsider many important issues about the conceptual status of theories based on a genuinely Hilbert-space description of natural processes. Attention is also paid to many problems which are naturally raised by the dynamical reduction program. In particular we discuss the possibility and the problems one meets in trying to develop an analogous formalism for the relativistic case. Finally we discuss the experimental implications of the new dynamics for various physical processes, which should, in principle, allow it to be tested against quantum mechanics. The review covers the work which has been done in the last 15 years by various scientists and the lively debate which has accompanied the elaboration of the new proposal.
Article
The basic notion of an objective probability is that of a probability determined by the physical structure of the world. On this understanding, there are subjective credences that do not correspond to objective probabilities, such as credences concerning rival physical theories. The main question for objective probabilities is how they are determined by the physical structure. In this paper, I survey three ways of understanding objective probability: stochastic dynamics, Humean chances, and deterministic chances (typicality). The first is the obvious way to understand the probabilities of quantum mechanics via a collapse theory such as GRW; the last is the way to understand the probabilities in the context of a deterministic theory such as Bohmian mechanics. Humean chances provide a more abstract and general account of chance locutions, one that is independent of dynamical considerations.
Article
Treatment of the predictive aspect of statistical mechanics as a form of statistical inference is extended to the density-matrix formalism and applied to a discussion of the relation between irreversibility and information loss. A principle of "statistical complementarity" is pointed out, according to which the empirically verifiable probabilities of statistical mechanics necessarily correspond to incomplete predictions. A preliminary discussion is given of the second law of thermodynamics and of a certain class of irreversible processes, in an approximation equivalent to that of the semiclassical theory of radiation. It is shown that a density matrix does not in general contain all the information about a system that is relevant for predicting its behavior. In the case of a system perturbed by random fluctuating fields, the density matrix cannot satisfy any differential equation because dρ(t)/dt does not depend only on ρ(t), but also on past conditions. The rigorous theory involves stochastic equations of the type ρ(t) = G(t, 0) ρ(0), where the operator G is a functional of the conditions during the entire interval (0 → t). Therefore a general theory of irreversible processes cannot be based on differential rate equations corresponding to time-proportional transition probabilities. However, such equations often represent useful approximations.
Article
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
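As a concrete instance of the maximum-entropy estimate (an illustrative sketch supplied here, not taken from the paper): maximizing the entropy −Σᵢ pᵢ ln pᵢ subject to normalization and a fixed mean ⟨E⟩ = Σᵢ pᵢEᵢ yields the canonical form pᵢ ∝ exp(−βEᵢ), with the Lagrange multiplier β fixed by the constraint. The snippet below solves for β numerically on a toy spectrum.

```python
# Maximum-entropy distribution over a finite set of levels subject to a fixed mean:
# the solution is the canonical form p_i = exp(-beta * E_i) / Z.  The energy levels
# and the target mean below are arbitrary toy values chosen for illustration.
import math

ENERGIES = [0.0, 1.0, 2.0, 3.0]
TARGET_MEAN = 1.2

def canonical(beta):
    """Canonical distribution exp(-beta*E_i)/Z over ENERGIES."""
    weights = [math.exp(-beta * e) for e in ENERGIES]
    z = sum(weights)
    return [w / z for w in weights]

def mean_energy(beta):
    return sum(p * e for p, e in zip(canonical(beta), ENERGIES))

# mean_energy is monotonically decreasing in beta, so bisect for the constraint.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > TARGET_MEAN:
        lo = mid   # mean still too high: increase beta
    else:
        hi = mid
beta = 0.5 * (lo + hi)
print("beta =", round(beta, 4))
print("maximum-entropy distribution:", [round(p, 4) for p in canonical(beta)])
```

Any other distribution satisfying the same constraints has lower entropy, which is the precise sense in which the estimate is "maximally noncommittal" with regard to the missing information.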
Book
It is often supposed that the spectacular successes of our modern mathematical sciences support a lofty vision of a world completely ordered by one single elegant theory. In this book Nancy Cartwright argues to the contrary. When we draw our image of the world from the way modern science works - as empiricism teaches us we should - we end up with a world where some features are precisely ordered, others are given to rough regularity and still others behave in their own diverse ways. This patchwork makes sense when we realise that laws are very special productions of nature, requiring very special arrangements for their generation. Combining classic and newly written essays on physics and economics, The Dappled World carries important philosophical consequences and offers serious lessons for both the natural and the social sciences.
Book
A high-level phenomenon is strongly emergent with respect to a low-level domain when the high-level phenomenon arises from the low-level domain, but truths concerning that phenomenon are not deducible even in principle from truths in the low-level domain. Strong emergence is the notion of emergence that is most common in philosophical discussions of emergence, and is the notion invoked by the British emergentists of the 1920s. A high-level phenomenon is weakly emergent with respect to a low-level domain when the high-level phenomenon arises from the low-level domain, but truths concerning that phenomenon are unexpected given the principles governing the low-level domain. Weak emergence is the notion of emergence that is most common in recent scientific discussions of emergence, and is the notion that is typically invoked by proponents of emergence in complex systems theory. In a way, the philosophical morals of strong emergence and weak emergence are diametrically opposed.
Article
This paper is based on four assumptions: 1. Physical reality is made of linearly behaving components combined in non-linear ways. 2. Higher level behaviour emerges from this lower level structure. 3. The way the lower level elements behave depends on the context in which they are embedded. 4. Quantum theory applies to the lower level entities. An implication is that higher level effective laws, based in the outcomes of non-linear combinations of lower level linear interactions, will generically not be unitary; hence the applicability of quantum theory at higher levels is strictly limited. This leads to the view that both state vector preparation and the quantum measurement process are crucially based in top-down causal effects, and helps provide criteria for the Heisenberg cut that challenge some views on Schroedinger's cat.
Article
This book is an attempt to get to the bottom of an acute and perennial tension between our best scientific pictures of the fundamental physical structure of the world and our everyday empirical experience of it. The trouble is about the direction of time. The situation (very briefly) is that it is a consequence of almost every one of those fundamental scientific pictures--and that it is at the same time radically at odds with our common sense--that whatever can happen can just as naturally happen backwards. Albert provides an unprecedentedly clear, lively, and systematic new account--in the context of a Newtonian-Mechanical picture of the world--of the ultimate origins of the statistical regularities we see around us, of the temporal irreversibility of the Second Law of Thermodynamics, of the asymmetries in our epistemic access to the past and the future, and of our conviction that by acting now we can affect the future but not the past. Then, in the final section of the book, he generalizes the Newtonian picture to the quantum-mechanical case and (most interestingly) suggests a very deep potential connection between the problem of the direction of time and the quantum-mechanical measurement problem. The book aims to be both an original contribution to the present scientific and philosophical understanding of these matters at the most advanced level, and something in the nature of an elementary textbook on the subject accessible to interested high-school students.
Article
I consider the arguments purporting to show that the vacuum energy density should receive a large contribution from the zero-point energy. This is the cosmological constant problem, as it was originally framed. I suggest that the matter is interpretation-dependent, and that on certain approaches to foundations, notably Everett's, the problem is a formal one, rather than one based on physical principles.
Article
This is an introduction to the method of effective field theory. As an application, I derive the effective field theory of low energy excitations in a conductor, the Landau theory of Fermi liquids, and explain why the high-T_c superconductors must be described by a different effective field theory.
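The organizing idea can be written compactly (a generic schematic template supplied here, not quoted from the lectures): the effective Lagrangian collects all local operators compatible with the symmetries, with higher-dimension operators suppressed by powers of the cutoff Λ, so that at energies E ≪ Λ they contribute only small corrections.

```latex
% Generic effective-Lagrangian expansion (schematic)
\mathcal{L}_{\mathrm{eff}}
  = \mathcal{L}_{d \le 4}
  + \sum_{i} \frac{c_{i}}{\Lambda^{\,d_{i}-4}}\,\mathcal{O}_{i},
\qquad d_{i} > 4 .
```

An operator of dimension d_i therefore enters low-energy amplitudes at relative order (E/Λ)^(d_i−4), which is why only finitely many terms matter at any given accuracy.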
Carroll, S. (2017). The Big Picture: On The Origins of Life, Meaning and the Universe Itself. New York: Penguin Random House.
Chalmers, D. J. (2008). Strong and weak emergence. In P. Clayton and P. Davies (Eds.), The Re-Emergence of Emergence: The Emergentist Hypothesis from Science to Religion, pp. 244-256. Oxford: Oxford University Press.