Article

The Big Bad Bug: What are the Humean's Chances?


Abstract

Humean supervenience is the doctrine that there are no necessary connections in the world. David Lewis identifies one big bad bug in the programme of providing Humean analyses for apparently non-Humean features of the world. The bug is chance. We put the bug under the microscope, and conclude that chance is no special problem for the Humean.


... But -type worlds block the implication in the other direction: from non-zero chance values to possible worlds. This latter implication is otherwise known as the Basic Chance Principle (Bigelow et al., 1993). ...
... will, contrary to its more typical presentation in the literature, focus on its metaphysical aspect and the resulting denial of a very intuitively appealing principle of chance: the Basic Chance Principle (Bigelow et al., 1993). Lewis (1994) and Hall (1994) have proposed essentially the same fix regarding the epistemic manifestation. ...
... As noted above, and in section 2.3, the problem with -type worlds is that they contradict the Basic Chance Principle (BCP) (Bigelow et al., 1993) by breaking the implication from non-zero chance values to possible worlds. ...
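Since the BCP recurs throughout the excerpts on this page, a minimal formalization may help. The notation below is reconstructed from the glosses quoted here, not taken from any one paper:

```latex
% Basic Chance Principle (BCP), as glossed in these excerpts:
% if A has non-zero chance at time t in world w, then A is true at
% some world whose history matches w's up to t.
\forall w\,\forall t\,\forall A:\quad
  \mathrm{ch}_{t,w}(A) > 0 \;\rightarrow\;
  \exists w'\,\big( H_{t,w'} = H_{t,w} \;\wedge\; A \text{ is true at } w' \big)
```

Schaffer's Realization Principle, quoted further below, strengthens this by also requiring that w′ share w's laws.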
Article
Metaphysical theories of lawhood may be grouped broadly into three: those that deny the mind-independent existence of laws of nature, those that assert both their mind-independent existence and their existence independent of the physical world, and finally those that assert their mind-independent existence, but deny their existence independent of the physical world. Lewis’s best system account (BSA) of lawhood is of this last type. This theory states that laws are supervenient on the Humean mosaic of each world, and are those sets of propositions that best summarise the regularities in nature. ‘Best’ in this context means optimal in terms of strength and simplicity, and in the case of indeterministic worlds, fit. The BSA in its Humean formulation after Lewis suffers from several problems. The first of these is the problem of undermining futures, where the BSA-determined chance functions appearing in indeterministic laws may be inconsistent with other possible worlds that should in theory have exactly the same laws. The second is the zero-fit problem, where worlds with infinite sequences of chance events would have equally good fit as an infinite number of other worlds, and thus all worlds would have zero fit according to traditional notions of statistical fit measured as the similarity between two sequences of chance events. The third is the problem of non-locality, where quantum mechanics suggests some predicates apply to spatially non-contiguous entities, thus implying the apparent falsehood of the foundational Humean principle in Lewis’s version of the BSA. Finally, there is Armstrong’s problem: the simplicity-strength trade-off is language dependent, with some languages allowing expressions that are simpler than their equivalent translations in other languages due to differing vocabulary. Solutions are provided to each of these problems in turn. The solutions to the problem of undermining and to Armstrong’s problem both rely on a background assumption of modal realism (in the case of the problem of undermining, modal realism in the strongest sense: each world is as concrete as the actual world, rather than a logical construction). Briefly, in the case of the problem of undermining, the proposed solution involves applying the algorithm that identifies the best systematization to a Humean mosaic representing a multiverse of possible worlds, rather than a single world. This multiverse is shown to be constructible without circularity, and would necessarily comprise all but the most extreme cases of undermining worlds. For Armstrong’s problem, it is shown how an objective set of predicates may be defined. These predicates are identified as those containing the least amount of contingent information. This concept of contingent information is given formal mathematical expression by reasoning analogous to that behind Shannon information or entropy as used in communication theory and statistical mechanics. The solutions presented for the zero-fit problem and the problem of non-locality are rather more straightforward. In the case of the zero-fit problem, an alternative measure of fit is presented. This alternative measure, known to statisticians as chi-square, is shown to give a non-zero measure over both finite and infinite probability spaces.
The solution to the problem of non-locality is first to dismiss it as irrelevant, since the non-local predicates are supervenient, and not part of the fundamental ontology of the Humean mosaic. This response in itself is not original. What is original, however, is the speculation that such non-local properties could not in fact be fundamental if the solution to Armstrong’s problem mentioned above is accepted, since such non-local predicates would contain more contingent information than extensionally equivalent combinations of local predicates. Proponents of theories of lawhood that assert both the mind-independence of laws and their independence of the physical world often do so on the premise that physically supervenient accounts of lawhood fail. Resolving these four outstanding problems with the leading supervenient theory of lawhood therefore makes some progress towards answering the age-old metaphysical question of what laws of nature are.
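The zero-fit point lends itself to a toy computation. The sketch below is illustrative only, not the thesis's own apparatus: the likelihood of a long sequence of fair coin flips shrinks to zero as the sequence grows (the zero-fit problem), while a Pearson chi-square statistic comparing observed counts with expected counts stays finite and discriminating.

```python
import random

def likelihood_fit(outcomes, p=0.5):
    # Traditional "fit": probability of the entire observed sequence.
    # Even for perfectly typical sequences this shrinks toward zero
    # (and underflows to 0.0) as the sequence lengthens.
    fit = 1.0
    for x in outcomes:
        fit *= p if x == 1 else (1 - p)
    return fit

def chi_square_fit(outcomes, p=0.5):
    # Pearson chi-square statistic: compares observed outcome counts
    # with the counts the chance values lead us to expect.
    n = len(outcomes)
    heads = sum(outcomes)
    exp_heads, exp_tails = n * p, n * (1 - p)
    return ((heads - exp_heads) ** 2 / exp_heads
            + ((n - heads) - exp_tails) ** 2 / exp_tails)

random.seed(0)
for n in (10, 1_000, 100_000):
    seq = [1 if random.random() < 0.5 else 0 for _ in range(n)]
    print(n, likelihood_fit(seq), chi_square_fit(seq))
```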
... To keep this discussion as simple as possible I will concentrate on this case. For most applications of QM it suffices to use an H that is separable, but it is not hard to conceive idealized cases that require the use of a nonseparable H. A projection E ∈ B(H) is a self-adjoint element such that E² = E. The range Ran(E) of E is a closed subspace of H, and for B(H) the projections are in one-one correspondence with the closed subspaces of H. ...
... Downward and upward: objective vs. subjective interpretations of quantum probabilities. 3.1 Objectivist take: top down. The top-down approach to quantum probabilities lends itself to an objectivist reading of probabilities. On this reading quantum states codify objective, observer-independent physical features of quantum systems and, thus, the probabilities states induce are objective physical probabilities: chances, if you like. ...
Article
David Lewis' "Principal Principle" is a purported principle of rationality connecting credence and objective chance. Almost all of the discussion of the Principal Principle in the philosophical literature assumes classical probability theory, which is unfortunate since the theory of modern physics that, arguably, speaks most clearly of objective chance is the quantum theory, and quantum probabilities are not classical probabilities. Given the generally accepted updating rule for quantum probabilities, there is a straight forward sense in which the Principal Principle is a theorem of quantum probability theory for any credence function satisfying a suitable additivity requirement. No additional principle of rationality is needed to bring credence into line with objective chance.
... Several philosophers have claimed that any notion of chance (physical probability) should satisfy certain conditions that appear to be constitutive of chance or might be regarded as platitudes we apparently have about chance (e.g., Loewer 2001; Schaffer 2003, 2007). Two such conditions on chance have been proposed in Bigelow et al. (1993). To introduce these conditions, let us assume that Ch denotes a chance function over a finite algebra of propositions A. Suppose further that C_F = {ch_w : w ∈ W} is a finite set of chance functions over A indexed by the possible worlds in W. Each ch_w stands for a chance function in some possible world w. ...
... The Bug is essentially an epistemological argument: it appeals to the Principal Principle, which tells us how an agent's credences in propositions concerning chances should be related to her credences in other propositions. But, as shown in Bigelow et al. (1993) and in Briggs (2009), a version of the Bug, a genuine metaphysical argument (hereafter, the metaphysical Bug), can be presented without making this epistemological detour. The key is to replace the Principal Principle with a non-epistemic condition C1. ...
Article
Full-text available
This paper shows how a particular resiliency-centered approach to chance lends support for two conditions characterizing chance. The first condition says that the present chance of some proposition A conditional on the proposition about some later chance of A should be set equal to that later chance of A. The second condition requires the present chance of some proposition A to be equal to the weighted average of possible later chances of A. I first introduce, motivate, and make precise a resiliency-centered approach to chance whose basic idea is that any chance distribution should be maximally invariant under variation of experimental factors. Second, I show that any present chance distribution that violates the two conditions can be replaced by another present chance distribution that satisfies them and is more resilient under variation of experimental factors. This shows that the two conditions are an essential feature of chances that maximize resiliency. Finally, I explore the relationship between the idea of resilient chances so understood and so-called Humean accounts of chance—one of the most promising recent philosophical accounts of chance.
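In symbols, the two conditions described here can be written as follows (notation assumed, with t < t′):

```latex
% Condition 1: present chance defers to later chance.
\mathrm{ch}_{t}\big(A \mid \mathrm{ch}_{t'}(A) = x\big) = x
% Condition 2: present chance is the weighted average (expectation)
% of the possible later chances.
\mathrm{ch}_{t}(A) = \sum_{x} x \cdot \mathrm{ch}_{t}\big(\mathrm{ch}_{t'}(A) = x\big)
```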
... In referring to chances as single-case probabilities, I am following the tradition of Lewis (1980). See also Bigelow, Collins, and Pargetter (1993) and Hall (1994). The philosophical discussion of chance, however, is fraught with ambiguous usage. ...
... These programmatic remarks about chance are supported in more precise detail in Ismael (2011b, 201). For some of the background on chance, see Bigelow, Collins, and Pargetter (1993) and Hall (1994). ...
... Of course, as soon as one utters this complaint the de Finetti issue resurfaces, since interpretive principles are needed to tease a theory of chance from a textbook on a theory of physics, and de Finetti's heirs, the self-styled quantum Bayesians (QBians), maintain that the probability statements that the quantum theory provides are to be given a personalistic interpretation. And this leads to the next curious and disturbing feature of the literature. The bulk of the philosophical discussion is couched in terms of classical probability theory, without any apparent recognition of the facts that quantum probability theory is not classical probability theory and that the way in which quantum probabilities are generated offers ways of linking credence and chance that have a bearing on the PP. ...
Article
In place of the just-so stories and intuition mongering of analytical metaphysicians, I offer a program for understanding the relationship between credence and chance in quantum physics and show how a version of the program can be implemented with the help of some representation theorems.
... Add to these the Basic Chance Principle (BCP) (Bigelow et al., 1993), the principle, roughly, that the chance for an event E at some future time may be the same, as determined by events up to some earlier time, whether or not E actually happens. Indexing to worlds and to times it is the principle: ...
Article
The Everett interpretation of quantum mechanics divides naturally into two parts: first, the interpretation of the structure of the quantum state, in terms of branching, and second, the interpretation of this branching structure in terms of probability. This is the second of two reviews of the Everett interpretation, and focuses on probability. Branching processes are identified as chance processes, and the squares of branch amplitudes are chances. Since branching is emergent, physical probability is emergent as well.
... Add to these the Basic Chance Principle (BCP) (Bigelow et al., 1993), the principle, roughly, that the chance for an event E at some future time may be the same, as determined by events up to some earlier time, whether or not E actually happens. Indexing to worlds and to times it is the principle: ...
Preprint
Full-text available
The Everett interpretation of quantum mechanics divides naturally into two parts: first, the interpretation of the structure of the quantum state, in terms of branching, and second, the interpretation of this branching structure in terms of probability. This is the second of two reviews of the Everett interpretation, and focuses on probability. Branching processes are identified as chance processes, and the squares of branch amplitudes are chances. Since branching is emergent, physical probability is emergent as well.
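The claim that 'the squares of branch amplitudes are chances' has a compact standard statement (Everettian notation assumed here, not quoted from the review):

```latex
% Universal state decomposed into orthogonal branches:
|\Psi\rangle = \sum_i \alpha_i \,|\psi_i\rangle,
\qquad \langle \psi_i | \psi_j \rangle = \delta_{ij}
% The chance of branch i is its squared amplitude (Born weight):
\mathrm{ch}(\text{branch } i) = |\alpha_i|^2
```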
... Further, counterfactual probabilities seem to play other roles associated with chance. Consider, for example, the commonly discussed connection between chance and possibility (Bigelow et al. 1993; Schaffer 2007): intuitively, when an event has a non-zero chance there is a relevant physically possible world where it occurs. This comes easily for counterfactual probability, since counterfactual probability is just a measure over relevant physically possible worlds. ...
Article
Full-text available
Why do we value higher-level scientific explanations if, ultimately, the world is physical? An attractive answer is that physical explanations often cite facts that don’t make a difference to the event in question. I claim that to properly develop this view we need to commit to a type of deterministic chance. And in doing so, we see the theoretical utility of deterministic chance, giving us reason to accept a package of views including deterministic chance.
... Of course, as soon as one utters this complaint the de Finetti issue resurfaces, since interpretive principles are needed to tease a theory of chance from a textbook on a theory of physics, and de Finetti's heirs, the self-styled quantum Bayesians (QBians), maintain that the probability statements that the quantum theory provides are to be given a personalistic interpretation. And this leads to the next curious and disturbing feature of the literature. The bulk of the philosophical discussion is couched in terms of classical probability theory, without any apparent recognition of the facts that quantum probability theory is not classical probability theory and that the way in which quantum probabilities are generated offers ways of linking credence and chance that have a bearing on PP. ...
Article
In place of the just-so stories and intuition mongering of analytical metaphysicians, I offer a program for understanding the relationship between credence and chance in quantum physics and show how a version of the program can be implemented with some representation theorems.
... The BCP is meant to formulate the platitude that if a proposition has some non-zero chance of being true, then that proposition is possible. This principle was first introduced by Bigelow et al. [1993]. A way of formulating the principle is as follows: ...
Thesis
Early probability theorists often spoke of probability in a way that was ambiguous between two different concepts of probability: a subjective concept and an objective concept. Subsequent theorists distinguished these two concepts from one another, defining the subjective concept as the “uncertainty of an individual”, and the objective concept as a kind of “uncertainty in the world”. While these two concepts were distinguished from one another, some theorists believed that there was no such thing as “uncertainty in the world”, and that the only type of probability is subjective probability. The advent of quantum mechanics changed this orthodoxy. Here, for the first time—or so it has been argued—a scientific theory described the world as irreducibly probabilistic, using an objective concept of probability. While there have been authors who have levelled serious objections to this view, it has nonetheless been fairly popular. Scientific theories, however, were probabilistic long before quantum mechanics was developed. Perhaps the two most prominent cases in point were the fields of classical statistical mechanics and evolutionary theory. These two fields made use of probability theory in a way that looked objective, but often with the assumptions that the world is not irreducibly probabilistic, and that for there to be genuine “uncertainty in the world”, the world has to be irreducibly probabilistic. This caused many authors to wonder just what the probabilities in these fields could be representing. Proposed solutions to this puzzle have often been in the form of shoe-horning the probabilities of these fields into either the “uncertainty in the world” concept or the “uncertainty of an individual” concept—both with unsatisfactory consequences. In this dissertation, I investigate how we should understand the probabilities of classical statistical mechanics and evolutionary theory. To do this, I engage with arguments in the contemporary literature, and conclude that the probabilities of these two fields should be understood as neither the subjective concept of probability, nor the objective concept—standardly conceived. I argue that in order to develop an adequate account of these probabilities, we need to distinguish a third concept of objective probability that has nothing to do with “uncertainty”, whether it be in the world or of an individual. I then give an analysis of this third concept of probability.
... Moreover, he notes that their possibility contradicts a very intuitive principle that he takes to be central to our conception of chance: the original version of the 'Principal Principle', which relates a rational agent's credences about the chance of an event to her credence that the event will occur. One way to bring out the implausibility of the idea that undermining is possible is to note its inconsistency with the following very compelling principle, which is a slightly strengthened variant of what Bigelow, Collins and Pargetter (1993) call the 'Basic Chance Principle': ...
Article
Full-text available
The sample space of the chance distribution at a given time is a class of possible worlds. Thanks to this connection between chance and modality, one's views about modal space can have significant consequences in the theory of chance and can be evaluated in part by how plausible these implications are. I apply this methodology to evaluate certain forms of modal contingentism, the thesis that some facts about what is possible are contingent. Any modal contingentist view that meets certain conditions that I specify generates difficulties in the philosophy of chance, including a problem usually associated with Humeanism that is known as 'the problem of undermining futures'. I consider two well-known versions of modal contingentism that face this difficulty. The first version, proposed by Hugh Chandler and Nathan Salmon, rests on an argument for the claim that many individuals have their modal features contingently. The second version is motivated by the thesis that the existence of a possible world depends on the existence of the contingent individuals inhabiting it, and that many worlds are therefore contingent existents.
... Section 5 discusses a potential further benefit of my revised potency-BSA according to which it might evade the 'Big Bad Bug' (Bigelow et al. 1993) that afflicts Lewis's Humean laws-ontology package. ...
Article
I argue that an unHumean ontology of irreducibly dispositional properties might be fruitfully combined with what has typically been thought of as a Humean account of laws, namely, the best-system account, made popular by David Lewis (e.g., 1983, 1986, 1994). In this paper I provide the details of what I argue is the most defensible account of Humean laws in an unHumean world. This package of views has the benefits of upholding scientific realism while doing without any suspect metaphysical entities to account for natural law. I conclude by arguing that the Humean laws-unHumean ontology package is well placed to provide an account of objective, nontrivial chances, a famous stumbling block for the Humean laws-Humean ontology package developed by Lewis.
... Lewis is not alone in thinking that PP contradicts HS: variations on the above two-step argument have been approvingly rehearsed by Bigelow et al. (1993), Thau (1994), Hall (1994), Strevens (1995), and Black (1998). Nevertheless, I beg to differ: my first thesis in this paper is that there is a flaw in the above argument. ...
Article
Full-text available
The Principal Principle (PP) says that, for any proposition A, given any admissible evidence and the proposition that the chance of A is x%, one's conditional credence in A should be x%. Humean Supervenience (HS) claims that, among possible worlds like ours, no two differ without differing in the spacetime-point-by-spacetime-point arrangement of local properties. David Lewis (1986b, 1994a) has argued that PP contradicts HS, and the validity of his argument has been endorsed by Bigelow et al. (1993), Thau (1994), Hall (1994), Strevens (1995), Ismael (1996), Hoefer (1997), and Black (1998). Against this consensus, I argue that PP might not contradict HS: Lewis's argument is invalid, and every attempt - within a broad class of attempts - to amend the argument fails.
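As stated in this abstract, PP admits a one-line formalization (with C a reasonable initial credence function, E any admissible evidence, and x the chance value):

```latex
% Principal Principle: for any proposition A, chance value x, and
% admissible evidence E, rational initial credence C satisfies
C\big(A \mid \langle \mathrm{ch}(A) = x \rangle \wedge E\big) = x
```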
... The first of Schaffer's platitudes is PP itself, which, as we have just seen, leads to a contradiction when applied in a deterministic setting. Since he grants PP the status of a platitude, PP itself cannot be given up, which leads us to the conclusion that only trivial chances are compatible with determinism. The Basic Chance Principle, originally due to Bigelow, Collins and Pargetter (1993), asserts that if at time t there is a non-trivial objective chance for E in world w, then there is a possible world with the same history as w up to t and in which E is true. Schaffer proposes a stronger version of this principle, his Realization Principle, which adds the requirement that the possible world in which E is true must have the same laws as w. ...
Article
Determinism and chance seem to be irreconcilable opposites: either something is chancy or it is deterministic but not both. Yet there are processes which appear to square the circle by being chancy and deterministic at once, and the appearance is backed by well-confirmed scientific theories such as statistical mechanics which also seem to provide us with chances for deterministic processes. Is this possible, and if so how? In this essay I discuss this question for probabilities as they occur in the empirical sciences, setting aside metaphysical questions in connection with free will, divine intervention and determinism in history.
... Good analyses of chance ought not to allow for such absurdities. They ought to respect Bigelow, Collins and Pargetter's Basic Chance Principle (BCP): any outcome whose chance exceeds zero is compossible with that chance and with history to date (Bigelow et al [1993], p. 459). Reassuringly, my analysis entails the BCP. ...
Article
The view that chances are relative frequencies of occurrence within actual, finite reference classes has long been written off. I argue that it ought to be reconsidered. Focusing on non-deterministic chance, I defend a version of finite frequentism in which reference classmates are required to have qualitatively identical pasts. While my analysis can evade or resist several standard objections, it has a counterintuitive consequence: non-trivial chances entail the existence of past light cones that are perfect intrinsic duplicates. In mitigation, I argue that our scientific knowledge is consistent with the hypothesis that there are many such duplicates in the actual world. Moreover, my analysis has some striking advantages: it is simple, it is metaphysically undemanding, and it makes possible a satisfying explanation of the chance–credence connection. • 1 Introduction • 2 Target • 3 Sketch and Skirmish • 4 Articulation • 5 Standard Worries • 5.1 Frequency tolerance • 5.2 Leibniz’s dictum • 5.3 Single case chance • 5.4 Missing values • 5.5 Explanation • 5.6 Counterfactual chances • 6 Chance and Credence • 6.1 An indefeasible indifference intuition • 6.2 Articulating the credence principle • 6.3 Appraising the credence principle • 7 Confirmation • 8 Laws • 9 Summing Up
... The literature on the problem of undermining futures has grown quite large. Here is a representative sample: Arntzenius and Hall (2003), Bigelow et al. (1993), Hall (1994), Lewis (1980), Schaffer (2003), Ward (2002). ...
Article
This book presents new work in philosophical theology on the universe, creation, and the afterlife. Organised thematically by the endpoints of time, the volume begins by addressing eschatological matters - the doctrines of heaven and hell - and ends with an account of divine deliberation and creation. This book develops a coherent theistic outlook which reconciles a traditional, high conception of deity, with full providential control over all aspects of creation, with a conception of human beings as free and morally responsible. The resulting position and defence is labelled 'Philosophical Arminianism', and deserves attention in a broad range of religious traditions.
... SOPHISTICATED REGULARITY THEORIES (such as Lewis's) have a high degree of accordance with our counterfactual reasoning, but the accordance does not seem to be perfect. As defenders of Humean supervenience are well aware, there are some cases that strike anti-Humeans as deeply counterintuitive (Carroll's [1994] mirror case, chance-involving cases [Bigelow, Collins, and Pargetter 1993], etc.). It seems that Humean views will necessarily have such features. ...
Article
Marc Lange objects to scientific essentialists that they can give no better account of the counterfactual invariance of laws than Humeans. While conceding this point succeeds ad hominem against some essentialists, I show that it does not undermine essentialism in general. Moreover, Lange's alternative account of the relation between laws and counterfactuals is—with minor modification—compatible with essentialism.
... So there exists a world w′ at which (i) A is true, (ii) w′ matches w up to t: H is true, and (iii) L*-ch_{t,w′}(A) = j: T is true. In fact, L*-chance meets an even stronger principle than the BCP, namely that 'the complete theory of chance is not chancy' (Bigelow et al. [1993], p. 456): L*-ch(T) = L-ch(T/T) = 1. (Intuitively, if one makes a million draws without replacement from an urn with a million marbles, one can be certain that the ratio of black-to-white drawings will equal the ratio of black-to-white marbles.) ...
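The urn parenthetical can be checked directly: drawing exhaustively without replacement makes the drawn ratio match the urn's composition no matter how the chancy order of draws goes. A scaled-down sketch, with illustrative numbers only:

```python
import random

def exhaustive_draw_ratio(n_black, n_white, seed=0):
    # Shuffle the urn, then "draw" every marble without replacement.
    # The order of draws is chancy, but the black-to-total ratio of the
    # complete run of draws cannot differ from the urn's composition.
    urn = ["black"] * n_black + ["white"] * n_white
    random.Random(seed).shuffle(urn)
    draws = urn  # the full shuffled sequence is the complete run of draws
    return draws.count("black") / len(draws)

# Any seed (any chancy order of draws) yields the same ratio: ch(T) = 1.
assert all(exhaustive_draw_ratio(300, 700, seed=s) == 0.3 for s in range(5))
print(exhaustive_draw_ratio(300, 700))  # 0.3
```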
Article
There are at least three core principles that define the chance role: (1) the Principal Principle, (2) the Basic Chance Principle, and (3) the Humean Principle. These principles seem mutually incompatible. At least, no extant account of chance meets more than one of them. I offer an account of chance which meets all three: L*-chance. So the good news is that L*-chance meets (1)–(3). The bad news is that L*-chance turns out unlawful and unstable. But perhaps this is not such bad news: L*-chance turns out to at least approximate plausible additional core principles concerning lawfulness and stability. And perhaps there is better news: one may treat 'chance' as vague, in a way that allows every core principle of chance to be met.
Article
In this paper we motivate the ‘principles of trust’, chance-credence principles that are strictly stronger than the New Principle yet strictly weaker than the Principal Principle, and argue, by proving some limitative results, that the principles of trust conflict with Humean Supervenience.
Chapter
After the introductory first chapter, the second chapter contains the discussion of the concept of randomness in relation to the element of chance with reference to the Commonplace Thesis. Subjective and objective probabilities are viewed from the perspective of modern approaches. The debate between the Frequentist and Bayesian approaches is also discussed. The Compensated Bayes' rule function is introduced as a dynamic interpretation of classical Bayes' rule. The facts and artifacts of scientific reality from personal beliefs to chance situations are investigated. Principles of chance are examined as they constitute building blocks for understanding the nature of randomness. The chapter concludes with the epistemological demarcation between personal beliefs, knowledge and reality.
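The 'Compensated Bayes' rule' is the chapter's own construction and is not spelled out in this abstract; for orientation, the classical Bayes' rule it reinterprets is the following (a minimal sketch with hypothetical numbers):

```python
def bayes_update(prior_h, likelihood_e_given_h, likelihood_e_given_not_h):
    # Classical Bayes' rule: posterior P(H|E) from the prior P(H)
    # and the likelihoods P(E|H) and P(E|~H).
    evidence = (likelihood_e_given_h * prior_h
                + likelihood_e_given_not_h * (1 - prior_h))
    return likelihood_e_given_h * prior_h / evidence

# e.g. a 1% prior, a test with 95% sensitivity and a 10% false positive rate:
print(bayes_update(0.01, 0.95, 0.10))  # ~0.088
```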
Article
According to libertarianism about free will, at least some of the choices we make are free and undetermined. Many libertarians also accept the thesis that, before we make an undetermined choice, there is a nontrivial objective probability that we will make that choice. In the literature on free will, the ascription of objective probabilities is sometimes justified via an “Argument from Motivation,” which adverts to the fact that typically, in situations of choice, we are more motivated to choose some options over others. In this paper, I will examine this argument and I will argue that it is unsound, as one of its premises is at odds with a widely accepted principle governing the evolution of objective probabilities over time.
Article
Full-text available
A recurring argument against the liberalization of abortion starts from the assumption that, from the moment of fertilization, human beings are individuals (in the sense of being something that necessarily occurs in only one entity). In this article, I draw on the possibility of monozygotic twinning and on the fact that identities are necessary to argue that this is not the case. I discuss the premises of the argument and the possible interpretations of its conclusion. I argue that the conclusion of this argument agrees with the conclusion of a similar argument that starts from current notions of 'organism'.
Article
In his ‘A Subjectivist’s Guide to Objective Chance’, Lewis argued that a particular kinematical model for chances (physical probabilities) follows from his principal principle. According to this model, any later chance function is equal to an earlier chance function conditional on the complete intervening history of non-modal facts. This article first investigates the conditions that any kinematical model for chance needs to satisfy to count as Lewis’s kinematics of chance. Second, it presents Lewis’s justification for his kinematics of chance and explains why it is bound to be problematic. Third, it gives an alternative justification for Lewis’s kinematics of chance that does not appeal to the principal principle. Instead, this justification appeals to a well-supported requirement for chance, according to which any prior chance function must be a convex combination of the possible posterior chance functions. It is shown that under a plausible assumption, Lewis’s kinematics of chance is equivalent to this requirement. Finally, by focusing on this requirement, it is explained why the so-called self-undermining chances fail to obey Lewis’s kinematics of chance. • 1 Introduction • 2 Lewis’s Kinematics of Chance: A Precise Formulation • 3 The Principal Principle and Lewis’s Kinematics of Chance • 4 Do We Need a New Justification? • 5 A New Argument for Lewis’s Kinematics of Chance • 6 Some Consequences: Self-undermining Chances • 7 Concluding Remarks
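The 'well-supported requirement' and the kinematics it is compared with can both be displayed (notation assumed):

```latex
% Requirement: any prior chance function is a convex combination of
% the possible posterior chance functions.
\mathrm{ch}_{t} = \sum_i \lambda_i \,\mathrm{ch}^{\,i}_{t'},
\qquad \lambda_i \geq 0,\qquad \textstyle\sum_i \lambda_i = 1
% Lewis's kinematics: later chance is earlier chance conditioned on
% the complete intervening history of non-modal facts H_{t \to t'}:
\mathrm{ch}_{t'}(\,\cdot\,) = \mathrm{ch}_{t}\big(\,\cdot \mid H_{t \to t'}\big)
```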
Article
Drawing on Earman’s (1986) definition of determinism and Lewis’ (Australasian Journal of Philosophy, 61, 343–377, 1983) best systems account of laws, in What Makes Time Special? (2017) Craig Callender develops an account of time as ‘the strongest thing’. The characterisation of this account apparently assumes no action at a temporal distance, an assumption that also underlies Earman’s account of determinism. In this paper I show that there is a way to define determinism that allows worlds with action at a temporal distance to count as deterministic, that action at a temporal distance is possible on a best systems account of laws, and hence that Callender need not make this assumption.
Article
David Lewis's interpretation of objective probability has two essential parts: Humean supervenience and the best system of laws. According to his interpretation, probabilities, along with the other nomic phenomena, supervene on the actual facts. Lewis also famously formulated the Principal Principle, which should show the connection between objective and subjective probabilities. However, years later, Hall, Thau and Lewis himself came to the conclusion that the principle and Lewis's interpretation of probability are not compatible. The main reason for that is the problem of so-called undermining futures; Lewis named the problem the 'Big Bad Bug'. A popular way to solve the problem was to change the Principal Principle. In this article, I will argue that the origin of the problem is not the compatibility of the principle and Lewis's interpretation of probability, but that the problem is in the interpretation itself. Changing the principle, I argue, will not conclusively solve the problem.
Article
Full-text available
A previously unrecognised argument against deterministic chance is introduced. The argument rests on the twin ideas that determined outcomes are settled, while chancy outcomes are unsettled, thus making cases of determined but chancy outcomes impossible. Closer attention to tacit assumptions about settledness makes available some principled lines of resistance to the argument for compatibilists about chance and determinism. Yet the costs of maintaining compatibilism may be higher with respect to this argument than with respect to existing incompatibilist arguments.
Article
We review the question of whether objective chances are compatible with determinism. We first outline what we mean by chance and what we mean by determinism. We then look at the alleged incompatibility between those concepts. Finally, we look at some ways that one might attempt to overcome the incompatibility.
Article
I argue that the past can be objectively chancy in cases of backwards causation, and defend a view of chance that allows for this. Using a case, I argue against the popular temporal view of chance, according to which (i) chances are defined relative to times, and (ii) all chancy events must lie in the future. I then state and defend the causal view of chance, according to which (a) chances are defined relative to causal histories, and (b) all chancy events must lie causally downstream. The causal view replicates the intuitively correct results of the temporal view in cases of ordinary forwards causation, while correctly handling cases of backwards causation. I conclude that objective chance is more closely related to the direction of causation than it is to the direction of time.
Chapter
David Lewis claimed that deterministic chance was impossible. But deterministic chance seems ubiquitous in casinos, in statistical mechanics, and in evolutionary theory. It would be best for Lewis's metaphysics if, in spite of what he says, we could reconcile his core views with deterministic chance. In this chapter, the author briefly rebuts two Lewisian objections to deterministic chance. The first is that our world is indeterministic at the quantum level, and this lower-level indeterminism translates to indeterminism at higher levels. The chapter explains how deterministic chances are possible on a broadly Lewisian theory. It also explains how there can be deterministic chances that function as nomological magnitudes, guide credence, and arise in objectively chancy situations. It is true that the author's deterministic chances are not time-indexed. It is also true that they do not exactly satisfy principles, proposed by Lewis and others in a broadly Lewisian tradition, that presuppose time-indexing.
Chapter
Humean supervenience is the conjunction of three theses: Truth supervenes on being, Anti-haecceitism, and Spatiotemporalism. The first clause is a core part of Lewis's metaphysics. The second clause is related to Lewis's counterpart theory. The third clause says there are no fundamental relations beyond the spatiotemporal, or fundamental properties of extended objects. Supervenience is classified into strong modal Humean supervenience, local modal Humean supervenience, and familiar modal Humean supervenience, which states that: for any two "worlds like ours", if the spatiotemporal distribution of fundamental qualities is the same at each world, the contingent facts are also the same. The fact that quantum mechanics raises problems for Humean supervenience does not undercut the philosophical significance of Lewis's defense of Humean supervenience. Humean supervenience says that in a world like ours, the fundamental properties are local qualities: perfectly natural intrinsic properties of points, or of point-sized occupants of points.
Working Paper
Is it possible for Suzy to travel back in time and kill her infant self? Vihvelin (1996) and Vranas (2009) hold that autoinfanticide is logically/metaphysically possible but physically impossible. Horacek (2005) believes that autoinfanticide has a nonzero chance of occurring and so is physically possible, but believes that autoinfanticide is metaphysically impossible. To sort out these issues, I describe six ways to commit autoinfanticide; for all six, there is a case to be made for their physical and metaphysical possibility.
Article
A series of recent arguments purport to show that most counterfactuals of the form if A had happened then C would have happened are not true. These arguments pose a challenge to those of us who think that counterfactual discourse is a useful part of ordinary conversation, of philosophical reasoning, and of scientific inquiry. Either we find a way to revise the semantics for counterfactuals in order to avoid these arguments, or we find a way to ensure that the relevant counterfactuals, while not true, are still assertible. I argue that regardless of which of these two strategies we choose, the natural ways of implementing these strategies all share a surprising consequence: they commit us to a particular metaphysical view about chance.
Article
A common type of argument against the existence of God is to argue that certain essential features associated with the existence of God are inconsistent with certain other features to be found in the actual world. (Cf. Göcke (2013) for an analysis of the different ways to deploy the term “God” in philosophical and theological discourse and for an analysis of the logical form of arguments for and against the existence of God.) A recent example of this type of argument against the existence of God is based on the assumption that there are random processes or chancy states of affairs in the actual world that contradict God being absolute sovereign over his creation: Chancy states of affairs are said to entail a denial of divine providence or omniscience. (For instance, Smith (1993, p. 195) argues that “classical Big Bang cosmology is inconsistent with theism due to the unpredictable nature of the Big Bang singularity.”) More often than not, however, this apparent conflict is formulated only intuitively and lacks sufficient conceptual clarification of the crucial terms involved. As a consequence, it is seldom clear where the conflict really lies. In what follows, I first provide a brief analysis of chance and randomness before I turn to cosmological and evolutionary arguments against the existence of God that in some way or other are based on chance and randomness. I end by way of comparing three popular conceptions of God as regards their ability to deal with God’s relation to a world of chance and randomness. Neither classical theism, nor open theism, nor indeed process panentheism has difficulties in accounting for God’s relation to a world of chance and randomness.
Article
Full-text available
I argue that there are non-trivial objective chances (that is, objective chances other than 0 and 1) even in deterministic worlds. The argument is straightforward. I observe that there are probabilistic special scientific laws even in deterministic worlds. These laws project non-trivial probabilities for the events that they concern. And these probabilities play the chance role and so should be regarded as chances as opposed, for example, to epistemic probabilities or credences. The supposition of non-trivial deterministic chances might seem to land us in contradiction. The fundamental laws of deterministic worlds project trivial probabilities for the very same events that are assigned non-trivial probabilities by the special scientific laws. I argue that any appearance of tension is dissolved by recognition of the level-relativity of chances. There is therefore no obstacle to accepting non-trivial chance-role-playing deterministic probabilities as genuine chances.
Article
I argue against the common and influential view that non-trivial chances arise only when the fundamental laws are indeterministic. The problem with this view, I claim, is not that it conflicts with some antecedently plausible metaphysics of chance or that it fails to capture our everyday use of ‘chance’ and related terms, but rather that it is unstable. Any reason for adopting the position that non-trivial chances arise only when the fundamental laws are indeterministic is also a reason for adopting a much stronger, and far less attractive, position. I suggest an alternative account, according to which chances are probabilities that play a certain explanatory role: they are probabilities that explain associated frequencies. • 1 Introduction • 2 A Paradigm Case • 3 The Incompatibilist’s Criterion • 4 Against the Incompatibilist’s Criterion • 5 The Explanatory Criterion • 6 Conclusion
Article
David Lewis, Michael Thau, and Ned Hall have recently argued that the Principal Principle – an inferential rule underlying much of our reasoning about probability – is inadequate in certain respects, and that something called the ‘New Principle’ ought to take its place. This paper argues that the Principal Principle need not be discarded. On the contrary, Lewis et al. can get everything they need – including the New Principle – from the intuitions and inferential habits that inspire the Principal Principle itself, while avoiding the problems that originally caused them to abandon that principle.
Article
It is a standard view that the concept of chance is inextricably related to the technical concept of credence. One influential version of this view is that the chance role is specified by (something in the neighborhood of) David Lewis's Principal Principle, which asserts a certain definite relation between chance and credence. If this view is right, then one cannot coherently affirm that there are chance processes in the physical world while rejecting the theoretical framework in which credence is defined, namely the Bayesian framework. This is surprising; why should adopting a theory that says there are chances at work in nature put any particular constraints on our theorizing about epistemology and rational choice? It is quite plausible that in order for anything to count as the referent of our concept chance, it would have to be related to epistemic rationality in a certain way—roughly, it is rational to have more confidence that something will happen the greater you think its chance is. But this commonsensical idea does not seem to be inherently committed to any particular theoretical approach to rationality, so why should we think that adopting the Bayesian approach is a prerequisite for thinking coherently about chance? I propose and defend a replacement for the Principal Principle which makes no use of the concept of credence. I also argue that this replacement is advantageous for the project of theorizing about the nature of chance. • 1 The Entanglement of Chance with Credence • 2 Desiderata for a Replacement for PP • 3 Disentangling Chance from Credence • 4 What RP Demands of a Bayesian Subject • 5 How Narrowly RP Constrains the Chance Function • 6 An Objection • 7 An Unexpected Benefit • 8 Conclusion • Appendix A: Any Subject with Credences who Obeys PP also Obeys RP • Appendix B: Any Subject with Credences who Obeys RP also Approximately Obeys PP
Article
Faced with the paradox of undermining futures, Humeans have resigned themselves to accounts of chance that severely conflict with our intuitions. However, such resignation is premature: the problem is Humean Supervenience (HS), not Humeanism. This paper develops a projectivist Humeanism on which chance claims are understood as normative, rather than fact stating. Rationality constraints on the cotenability of norms and factual claims ground a factual-normative worlds semantics that, in addition to solving the Frege-Geach problem, delivers the intuitive set of possibilia for each chance law. Hence, the account does not entail HS, and the paradox does not arise. A confirmation theory is developed, and the Principal Principle is justified.
Article
This is the first part of a two-part article in which we defend the thesis of Humean Supervenience about Laws of Nature (HS). According to this thesis, two possible worlds cannot differ on what is a law of nature unless they also differ on the Humean base. The Humean base is easy to characterize intuitively, but there is no consensus on how, precisely, it should be defined. Here in Part I, we present and motivate a characterization of the Humean base that, we argue, enables HS to capture what is really at stake in the debate, without taking on extraneous commitments. “I tend to picture the [facts of the form “it is a law that s” and “it is not a law that s”] as having been sprinkled like powdered sugar over the doughy surface of the non-nomic facts.”—Marc Lange “Avoid empty carbohydrates.”—Runner's World
Article
I sketch a new constraint on chance, which connects chance ascriptions closely with ascriptions of ability, and more specifically with ‘can’-claims. This connection between chance and ability has some claim to be a platitude; moreover, it exposes the debate over deterministic chance to the extensive literature on (in)compatibilism about free will. The upshot is that a prima facie case for the tenability of deterministic chance can be made. But the main thrust of the paper is to draw attention to the connection between the truth conditions of sentences involving ‘can’ and ‘chance’, and argue for the context sensitivity of each term. Awareness of this context sensitivity has consequences for the evaluation of particular philosophical arguments for (in)compatibilism when they are presented in particular contexts.
Article
Larry Wright and others have advanced causal accounts of functional explanation, designed to alleviate fears about the legitimacy of such explanations. These analyses take functional explanations to describe second order causal relations. These second order relations are conceptually puzzling. I present an account of second order causation from within the framework of Eells' probabilistic theory of causation; the account makes use of the population-relativity of causation that is built into this theory.
Article
I follow Hájek (Synthese 137:273–323, 2003c) by taking objective probability to be a function of two propositional arguments—that is, I take conditional probability as primitive. Writing the objective probability of q given r as P(q, r), I argue that r may be chosen to provide less than a complete and exact description of the world’s history or of its state at any time. It follows that nontrivial objective probabilities are possible in deterministic worlds and about the past. A very simple chance–credence relation is also then natural, namely that reasonable credence equals objective probability. In other words, we should set our actual credence in a proposition equal to the proposition’s objective probability conditional on available background information. One advantage of that approach is that the background information is not subject to an admissibility requirement, as it is in standard formulations of the Principal Principle. Another advantage is that the “undermining” usually thought to follow from Humean supervenience can be avoided. Taking objective probability to be a two-argument function is not merely a technical matter, but provides us with vital flexibility in addressing significant philosophical issues.
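The proposed chance-credence relation is simple enough to display (notation following the abstract, with K the available background information; no admissibility restriction is placed on K):

```latex
% Objective probability as a primitive two-place function P(q, r).
% Reasonable credence in q given background information K:
\mathrm{Cr}(q) = P(q, K)
```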
Article
David Lewis’s ‘Humean Supervenience’ (henceforth ‘HS’) combines realism about laws, chances, and dispositions with a sparse ontology according to which everything supervenes on the overall spatiotemporal distribution of non-dispositional properties (Lewis 1986a, Philosophical papers: Volume II, pp. ix–xvii, New York: Oxford University Press, 1994, Mind 103:473–490). HS faces a serious problem—a “big bad bug” (Lewis 1986a, p. xiv): it contradicts the Principal Principle, a seemingly obvious norm of rational credence. Two authors have tried to rescue Lewis’s ontology from the ‘big bad bug’ (henceforth ‘the Bug’) by rejecting realism about laws, chances, and dispositions (Halpin 1994, Aust J Phil 72:317–338, 1998, Phil Sci 65:349–360; Ward 2005, Phil Sci 71:241–261). I will argue that this strategy cannot possibly work: it is the ontology, not the realist thesis, that lies at the root of the problem.
Article
Can there be deterministic chance? That is, can there be objective chance values other than 0 or 1 in a deterministic world? I will argue that the answer is no. In a deterministic world, the only function that can play the role of chance is one that outputs just 0s and 1s. The role of chance involves connections from chance to credence, possibility, time, intrinsicness, lawhood, and causation. These connections do not allow for deterministic chance. • 1 Overview • 2 Four Arguments for Deterministic Chance • 3 Four Conceptions of Deterministic Chance • 4 The Role of Chance • 5 The Case against Posterior Deterministic Chance • 6 The Case against Initial Deterministic Chance • 7 Epistemic Chance