Synthese (2008) 160:97–101 DOI 10.1007/s11229-006-9102-4 ORIGINAL PAPER

Horgan on Sleeping Beauty

Joel Pust

Received: 6 April 2006 / Accepted: 24 July 2006 / Published online: 6 October 2006 © Springer Science+Business Media B.V. 2006

Abstract  With the notable exception of David Lewis, most of those writing on the Sleeping Beauty problem have argued that 1/3 is the correct answer. Terence Horgan has provided the clearest account of why, contrary to Lewis, Beauty has evidence against the proposition that the coin comes up heads when she awakens on Monday. In this paper, I argue that Horgan's proposal fails because it neglects important facts about epistemic probability.

Keywords  Sleeping Beauty problem · Epistemic probability · Terence Horgan

J. Pust, Department of Philosophy, University of Delaware, Newark, DE 19716-2567, USA
e-mail: [email protected]

1

Sleeping Beauty is, as many have now heard, the subject of an odd experiment. The experimenters are to render Beauty unconscious on Sunday night. She will remain that way until they awaken her on Monday morning. A few minutes later they will tell her that it is Monday, put her back to sleep and administer a drug that erases her memories of Monday. If the flip of a fair coin comes up heads on Monday night, she will remain unconscious until Wednesday. If it comes up tails, the experimenters will awaken her on Tuesday and a few minutes later put her back to sleep again so that she remains unconscious until Wednesday. When Beauty awakens on Wednesday, they will inform her that the experiment is over. Beauty is a paradigm of rationality. And, importantly, she is certain that the experiment will go exactly as planned. Thus, her credence in HEADS (the hypothesis that the coin comes up heads) is 1/2 on Sunday. But what should her credence in HEADS be when she is first awakened on Monday? Many philosophers have (following Elga (2000)) claimed that her credence should be 1/3. Lewis (2001) disagrees, maintaining that Beauty's credence in HEADS should remain unchanged at 1/2. Lewis argues as follows:

(A) Absent a relevant cognitive mishap, only new information evidentially relevant to a hypothesis should produce a change in credence in that hypothesis.
(B) Beauty receives no new information evidentially relevant to HEADS when she is awakened on Monday.
(C) Beauty suffers no relevant cognitive mishap in the period up through her Monday awakening.
(D) Thus, Beauty's credence in HEADS should remain 1/2.

Horgan (2004) provides an argument for the 1/3 answer which attempts to make clear exactly why, contrary to Lewis’ (B), Beauty does gain information which is evidentially relevant to HEADS upon her Monday awakening. Indeed, Horgan apparently thinks the rest of Lewis’ argument unassailable as he claims that “if such an account cannot be given, then it would seem that Lewis is right after all” (2004, p. 11) even though this would, Horgan believes, require endorsing Lewis’ highly counterintuitive further claim that after Beauty is told that it is Monday (and hence knows that the toss of the fair coin has yet to occur), she ought to have a 2/3 credence in HEADS. In this paper, I show that Horgan’s argument fails.1

2

Horgan begins by distinguishing four hypotheses for Beauty to entertain upon her Monday awakening:

H1: HEADS and today is Monday.
H2: HEADS and today is Tuesday.
T1: TAILS and today is Monday.
T2: TAILS and today is Tuesday.

He argues that, upon awakening on Monday, Beauty should change her credence in HEADS to 1/3 by updating her "preliminary probabilities" for H1, H2, T1 and T2 in light of coming to know, 'I am awakened today by the experimenters.' He departs from standard Bayesian updating inasmuch as these preliminary probabilities are not prior probabilities, i.e., they are not the probabilities of H1, H2, T1 and T2 relative to the information Beauty possessed prior to being awakened on Monday (i.e. on Sunday). Instead they are her [Monday] probabilities relative to "a portion of [her] total current information [i.e. the information she has when she is awake on Monday] comprising not only her prior information about the experiment but also her current, disjunctive, information about what the current day is . . ." (p. 20, emphasis added).

According to Horgan, the preliminary probability of each of the four hypotheses is 1/4. For given the information that today is Monday or Tuesday (in addition to background information about the experiment), one of the four hypotheses must be true; and Horgan proposes that given her total evidence other than the knowledge, 'I am awakened today by the experimenters,' the four hypotheses are equally probable. If this is right, then Beauty's preliminary probability for HEADS is 1/2. Horgan goes on to suggest that her preliminary conditional probability for H2 given the statement 'I am awakened today by the experimenters' is zero and therefore, the alternatives to H2 being equally probable, her preliminary conditional probability for each of H1, T1 and T2 given this statement is 1/3. It follows that her preliminary conditional probability for HEADS given 'I am awakened today by the experimenters' is 1/3 and, taking into consideration her knowledge that she is awakened today, Beauty's credence in HEADS should be 1/3. Hence, contrary to Lewis, her knowledge that she is awakened today is evidentially relevant to HEADS.

1 Dorr (2002) and Arntzenius (2003) also argue for 1/3 by appeal to allegedly analogous cases but are, I think, successfully rebutted by Bradley (2003).
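Horgan's calculation can be checked with a few lines of arithmetic. The sketch below is not from Horgan's paper; it simply encodes the preliminary probabilities (1/4 for each hypothesis) and conditionalizes on 'I am awakened today by the experimenters', the statement inconsistent with H2 alone.

```python
from fractions import Fraction

# Horgan's preliminary probabilities: 1/4 for each of the four hypotheses.
preliminary = {h: Fraction(1, 4) for h in ("H1", "H2", "T1", "T2")}

# Preliminary probability of HEADS (true on H1 and H2).
heads_preliminary = preliminary["H1"] + preliminary["H2"]

# Conditionalize on 'I am awakened today by the experimenters',
# which rules out H2 only: drop H2 and renormalize the rest.
surviving = {h: p for h, p in preliminary.items() if h != "H2"}
total = sum(surviving.values())
posterior = {h: p / total for h, p in surviving.items()}

print(heads_preliminary)   # 1/2
print(posterior["H1"])     # credence in HEADS after updating: 1/3
```

Each of H1, T1 and T2 ends up with posterior probability 1/3, so the update moves Beauty's credence in HEADS from 1/2 to 1/3, exactly as Horgan claims.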

3

Horgan's proposal is undermined by the following argument:

(1) An epistemic probability is the degree to which an agent in some logically possible epistemic situation ought (rationally) to believe some statement.
(2) Any logically possible agent in any logically possible epistemic situation ought to be absolutely certain that the statement 'I am conscious now' is true.
(3) Thus, (when she is awake on Monday) Beauty's preliminary probability for 'I am conscious now' is one. [1, 2]
(4) Beauty's preliminary probability for 'I am conscious now only if I am awakened today by the experimenters' is one.
(5) Thus, Beauty's preliminary probability for 'I am awakened today by the experimenters' is one. [3, 4]
(6) Beauty's preliminary probability for H2 with respect to the statement 'I am awakened today by the experimenters' is zero.
(7) So, Beauty's preliminary probability for H2 is zero. [5, 6]

As the inferences in this argument appear unproblematic, I turn to a brief discussion and defense of the premises.

Premise 1 is justified by the fact that epistemic probabilities are relative to epistemic situations. Horgan himself points out that "epistemic-probability contexts are intensional," like belief contexts generally, and that epistemic probability is "essentially tied to available evidence" (2004, p. 13; 2000). It would seem, then, that epistemic probability ought to be understood as rational degree of belief in a given evidential situation. The relevant epistemic situation need not be an actual one, but it must be at least a possible one.

Furthermore, though this is not always made explicit in some presentations of the case, Beauty is certain that she will not be conscious on Monday or Tuesday unless she is awakened by the experimenters, that she will be awakened by them on Monday, and that she will not be awake on Tuesday if the coin comes up heads. Hence, premises 4 and 6 are true.

Premise 2 follows from the fact that it cannot be rational for any possible agent in any possible epistemic situation to doubt that she is conscious at the very time she considers the matter. This is not to say, of course, that the agent in question cannot rationally be uncertain about the nature of the self or of consciousness or of time. It may be objected that there are possible rational agents who are not conscious or who lack the conceptual resources required to consider whether they are conscious.2 This objection, however, could be easily evaded by amending the premise in question so as to apply only to agents who are conscious and possess the relevant concepts. The amended premise would hold that any logically possible conscious agent (capable of considering whether she is conscious) in any logically possible epistemic situation ought to be absolutely certain that she is presently conscious. As Beauty certainly is conscious and possesses the relevant concepts, premise 3 will follow and the rest of the argument will proceed unchanged.

4

Horgan's proposal can be seen as a response to the following facts about Beauty's situation. First, relative to Beauty's total evidence when she is awake on Sunday, H1, H2, T1 and T2 have no epistemic probability at all because, as Horgan notes, "the irreducibly indexical statement-partition [of H1, H2, T1, and T2 has] not yet arisen" (p. 15). There seems, as well, no prior Sunday probability for the Monday statement 'I am awakened today by the experimenters.' Second, relative to Beauty's total evidence immediately upon her Monday awakening, H2 has an epistemic probability of zero and H1, T1 and T2 each have some positive epistemic probability. Given these facts, no sense can be made of Beauty updating (on Monday) her actual prior (Sunday) probabilities for the four hypotheses in light of her coming to know (on Monday) 'I am awakened today by the experimenters.'

Horgan's suggestion is, in effect, that we should determine what credence Beauty ought to have in each of the four hypotheses upon awakening on Monday by considering what credence she ought to have in each hypothesis were she to (1) have all the evidence she actually has on Monday except for the evidence inconsistent with H2 and then (2) conditionalize upon that evidence. The difficulty with this suggestion is that the knowledge Beauty has which is inconsistent with H2 is her knowledge that she is awakened today by the experimenters. However, given her stipulated certainty regarding the conditions of the experiment, she could not lack this knowledge unless she lacked knowledge that she is presently conscious. So, Horgan's proposal must appeal to epistemic probabilities relative to an epistemic situation in which Beauty [a] has a positive epistemic probability for each of the relevant statements and [b] has certain knowledge of the experimental protocol but [c] lacks knowledge that she is presently conscious.

However, such an epistemic situation is impossible.3 Were Beauty unconscious, she wouldn't be in an epistemic situation at all and no statement, especially an indexical one regarding her present consciousness, would have any epistemic probability for her. Were she conscious and capable of grasping the relevant statements, she would (being perfectly rational) know that she was conscious and, given her background knowledge, be certain that H2 is false. Hence, Horgan's proposal fails. As the reasons for the failure suggest that no similar proposal could succeed, it has yet to be shown what is wrong with Lewis' plausible argument for 1/2, and the Sleeping Beauty problem remains deeply puzzling.4

2 I owe this objection to an anonymous referee.

3 Something like Horgan's preliminary probabilities seem to me an appropriate way to deal with the "problem of old evidence" said to afflict Bayesianism (Glymour, 1980). However, if my argument is sound, an appeal to preliminary probabilities cannot allow one's knowledge that one is conscious to serve as evidence for a Bayesian. In a longer paper, I defend this claim with additional arguments and discuss its broader significance.

References

Arntzenius, F. (2003). Some problems for conditionalization and reflection. Journal of Philosophy, 100, 356–370.
Bradley, D. (2003). Sleeping Beauty: A note on Dorr's argument for 1/3. Analysis, 63, 266–268.
Dorr, C. (2002). Sleeping Beauty: In defense of Elga. Analysis, 62, 292–296.
Elga, A. (2000). Self-locating belief and the Sleeping Beauty problem. Analysis, 60, 143–147.
Glymour, C. (1980). Theory and evidence. Princeton: Princeton University Press.
Horgan, T. (2000). The Two-Envelope paradox, nonstandard expected utility, and the intensionality of probability. Noûs, 34, 578–603.
Horgan, T. (2004). Sleeping Beauty awakened: New odds at the dawn of the new day. Analysis, 64, 10–21.
Lewis, D. (2001). Sleeping Beauty: Reply to Elga. Analysis, 61, 171–176.

4 Thanks to Terry Horgan for helpful comments on previous drafts of this paper. I owe a special debt to Kai Draper for substantial contributions to this paper.