There are three balls. One is red. Each of the others is either white or black. Now I give you a choice between two lotteries.

Lottery A: You win a prize if we draw a red ball.
Lottery B: You win a prize if we draw a white ball.

Which lottery do you choose?

Now I give you another choice between two lotteries.

Lottery C: You win a prize if we draw a ball that's not red.
Lottery D: You win a prize if we draw a ball that's not white.

Which lottery do you choose?

Post your answers, plus any reasoning, in the comments. If you're already familiar with the Ellsberg paradox, you can just watch. Explanations to come later.

UPDATE: Glad this is getting so many comments. Just a few comments of my own:

(1) Many people are assuming that each of the two balls is white or black with a 50-50 probability. Maybe, maybe not. Just keep in mind that it's not part of the assumptions.

(2) Just in case you reject the problem because you don't know the probabilities of white vs. black (though you shouldn't), you can answer the question assuming there's a 50-50 probability. Then, just for fun, answer the problem again where white has a 49% chance.

(3) Also, some people are wondering about the motivations of the "house," i.e., whether it wants you to win or lose. Think what you like about the motivations of the house, but keep in mind that the colors of the balls (however determined) are the same in Part 1 and Part 2.

(4) Some of you are wondering what's the "paradox." I'll explain soon (or you can just look it up on Wikipedia). It may not be right to call it a paradox; perhaps it's just an illustration of an interesting aspect of how people make choices.
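For anyone who wants to check point (2) by computation, here is a minimal Python sketch under the 50-50 reading (an assumption, flagged above as not part of the problem's setup): each non-red ball is independently white or black with probability 1/2, so the compositions WW, WB, BW, BB of the two unknown balls each carry weight 1/4.

```python
from fractions import Fraction

# Assumed 50-50 reading (not part of the problem): the two non-red balls
# are independently white or black, so each composition has weight 1/4.
compositions = [("W", "W"), ("W", "B"), ("B", "W"), ("B", "B")]

def win_prob(wins):
    """Chance one uniformly drawn ball satisfies `wins`, averaged over compositions."""
    return sum(Fraction(1, 4) * Fraction(sum(map(wins, ("R",) + c)), 3)
               for c in compositions)

pA = win_prob(lambda b: b == "R")   # win on red
pB = win_prob(lambda b: b == "W")   # win on white
pC = win_prob(lambda b: b != "R")   # win on not-red
pD = win_prob(lambda b: b != "W")   # win on not-white
print(pA, pB, pC, pD)               # 1/3 1/3 2/3 2/3
```

Swap the 1/4 weights for any other prior over compositions and the B and D numbers move while A and C stay pinned at 1/3 and 2/3; that fixed-vs.-ambiguous contrast is the whole game.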

Sarah (mail) (www):
I'd choose lottery A (I like the surety; there's always a red ball, and so always a 1/3 chance -- lottery B might work out better for me in the long run depending on how many actually are white, but the potential 0/3 odds would drive me nuts, and since you're dealing with either 0/3, 1/3 or 2/3, it seems to me it'd even out to 1/3 anyway.)

And I'd choose lottery C, since there are always two balls that aren't red, and thus constant 2/3 odds. I might have a shot at 3/3 odds in Lottery D, if both non-red balls turn out to be black, but again, the potential of 0/3 odds would drive me nuts, and it seems like I'd still have an average of 1/3 odds, with the white/black option on the other two balls.

(I'm sure I've missed something, probably something huge, but since it's given me an excuse to not do my Russian homework for a few minutes, I don't care.)
10.11.2006 12:08am
Ok I'll bite...

Are we assuming that only one ball is drawn?

I choose lottery A. A gives me a 1:3 chance of winning.
B could either give me a 0:3, 1:3, or 2:3 chance of winning, because those last two balls could be BB, BW/WB, or WW, respectively

I suppose technically overall the odds are the same for each lottery, but I'd rather have a sure chance.

In the second I'd choose lottery D. D would either give me 1:3, 2:3 or 3:3, if the other two balls are WW, WB/BW, or BB, respectively.

C gives me a 2:3 chance of winning.
10.11.2006 12:09am
Sarah, Lottery D doesn't have a 0:3 option. One ball is always red, which will satisfy the win condition.
10.11.2006 12:09am
I forget if WB and BW are counted as distinct or the same in this kind of probability exercise.
10.11.2006 12:11am
First question: A. There's definitely one red ball. The chance of picking red is 1/3, and it's possible that the other two balls could both be black.

Second question: C. Again, the chances of a non-red ball are 2/3, whereas if the two other balls are white, there's only a 1/3 chance of drawing non-white. Obviously, those odds change depending on whether one or both of the other balls is white, but we don't know from the hypo.
10.11.2006 12:12am
I guess I should explain why I chose D better... I skipped on B because of the possibility of having a 0% chance of winning. Neither C nor D has a 0% possibility scenario, so I'd go with D for the chance of having a 100% probability.

Silly, I suppose.

But... I'm sure we're missing something, or this wouldn't be up here :) Trying to think what it could be... what's the paradox?
10.11.2006 12:17am
Justin (mail):
Lottery A: 1/3
Lottery B: 2/3 * 1/2 = 1/3
Lottery C: 2/3
Lottery D: 1/3 + (2/3 * 1/2) = 2/6 + 2/6 = 2/3

The assumption here, of course, is that white and black have a random, 50% chance of occurrence. Assuming that the lottery is run by someone who wants others to lose, always go with the red ball.
10.11.2006 12:21am
Tony2 (mail):
The thing that jumps out to me is the unknown probability of a ball being white or black. If there were a random step that replaced lottery B with "we will flip a coin, and if it's heads you will win if the ball is white, and if it's tails you will win if the ball is black" then lotteries A and B would be equally appealing.

Otherwise I think there is a strong reason to believe that given the choice between lotteries A and B, the person running the game doesn't actually have ANY white balls in the mix. What motivation would there be to include those white balls? So when you include the motivation of the game controller, the paradox doesn't seem like much of a paradox at all.
10.11.2006 12:21am
paranoidblogger (mail) (www):
A on the first one, but it doesn't seem like it would matter. It's the safer bet if there are no white balls, but on average it'd seem there's a 1/3 chance for either the red or the white ball.

C on the second one, but again it doesn't seem to matter. D is a sure winner if the other balls are black and a 1/3 winner if the other balls are white. C is 2/3 both times. On average it'd seem there's a 2/3 chance of winning either lottery.

Of course I'm not particularly good at statistics so the above might be completely off.
10.11.2006 12:25am
Tony2 (mail):
FWIW, having looked at the Wikipedia entry, I think you've bungled the story a bit, and that the situation you describe doesn't describe the paradox very well. I shall await the followup entry...
10.11.2006 12:28am
Tony, I don't think the motivation of the controller matters unless we know what the payouts are.
10.11.2006 12:29am
Niiiiiiiiiiiiiiiiiiiiick (www):
I am posting this without having read anyone else's answers - I'm sure that I'll be wrong, heh.

I would pick Lottery A because there's a 1/3 chance that I'll get something, while with Lottery B it could be 2/3 - but it could also be 1/3 (no difference) or 0/3 (far worse). Lottery A gives me a consistent chance to get something.

I would choose Lottery C for effectively the same reason. I've got a 2/3 chance with Lottery C, and while I might get a 3/3 chance with Lottery D, I might also get 2/3 (no difference) or a 1/3 (worse).

Color me risk averse. :)
10.11.2006 12:29am
Robert Corr (mail):
The possible combinations are:

Red White White
Red White Black
Red Black White
Red Black Black

Across those four combinations there are 12 equally likely draws, and each colour accounts for 4 of them. The probability of drawing any colour is 4/12. That means there is no difference between lotteries A and B, or between lotteries C and D.

How is it a paradox, though?
10.11.2006 12:33am
The only pertinent difference between this and the version shown on wikipedia (ahhh google and wikipedia... making geniuses of us all) is that they seem to have switched C and D from the original. And I'm one of the unusual ones hehe
10.11.2006 12:35am
Bryan (mail):
Instinctively I went with A & D, but I can see that the real gamble is on what proportion of balls will be white. In the first lottery I picked the known risk (A = 1/3) over the unknown risk (B = 0 to 2/3). In the second lottery I picked the bet with a chance for 3/3 odds, despite no information on the chances of this mix.

In the long run, with random mixes (equal chances) of black and white balls, both A & B and also C & D should have equal returns.
10.11.2006 12:37am
Looking at these theories that this example (and another one) are supposed to run counter to... I think the theories are assuming that people are good at math and can figure out the probabilities on their own lol.
10.11.2006 12:47am
Z Jones (mail):
Most of you seem to have what appears to be the correct answer; that is, Lotteries A and B each have 1/3 odds, and Lotteries C and D each have 2/3 odds.

What some of you seem to be making a mistake on, though (correct me if I'm wrong), is that it is somehow a "sure thing" or a "safe bet" to take Lottery A over Lottery B, and Lottery C over Lottery D. The odds for Lottery A are 1/3, and the odds of Lottery B are ALSO 1/3. The odds of Lottery C are 2/3, and the odds of Lottery D are ALSO 2/3. It does not matter in Lotteries B and D that you can determine three different scenarios of ball combinations: the odds of winning these lotteries are exactly the same as A and C.
10.11.2006 12:51am
Gil Milbauer (mail) (www):
Ok, I can see that choosing A and C seems inconsistent with respect to one's estimate of the probability of a ball being white.

But, since we don't know the distribution of black and white balls, we can't estimate the probability at all. So, I don't see any good reason to choose B or D when you can choose A and C and know what your chances are. So, I don't think it's inconsistent; you can't really have an estimate. You just have an unknown in each case. Why not avoid it when it doesn't cost you?

Perhaps if the prizes for A and C were smaller than those for B and D it would show something interesting.

I once took an exam with a two part question, both of which were affected by a binary fact that I didn't know. I answered them inconsistently (knowing I'd get one part wrong) to avoid getting both parts wrong. I think that was rational.

Of course, my friend Haym faced the same situation, guessed, and got them both right.
10.11.2006 12:52am
Mike BUSL07 (mail) (www):
Don't each of the sets of lotteries provide identical returns? And what's this Ellsberg paradox?
10.11.2006 12:53am
PaulV (mail):
I do not trust the house to be honest. Go with red and you can't be cheated
10.11.2006 12:53am
Other correspondents have correctly identified the situation:

0 <= P(B) <= 2/3
1/3 <= P(D) <= 1
with <P(A)> = <P(B)> = 1/3, <P(C)> = <P(D)> = 2/3

I'll just point out that there are at least two pieces of information that would rationally affect my preference among scenarios with equal expected outcomes (which is the supposed paradox): how do I perceive the prize, and how do I perceive you (or whoever is running the lottery).

1) Is it a prize I really, really want? "Ambiguity aversion" describes it well: in the absence of further information, all expectation values are the same, but if I'm starving and the prize is a sandwich, I might prefer the less ambiguous course.

2) Are you fair? Unfair? Generous? Stingy? Perverse? Do you know what balls are in the jar, or care whether I win or lose?
10.11.2006 12:54am
Gil Milbauer (mail) (www):
Sorry, I blew the story.

He flipped a coin.
10.11.2006 12:55am
Midatlantan (mail):
A and C seem to be leading the pack, which makes the "paradox" less apparent, it would seem. I.e. the rather limited sample of commenters here seem to be reflecting a somewhat consistent preference for "known unknowns" over "unknown unknowns" (as Rumsfeld might put it). A and C were my first picks, too.
10.11.2006 12:56am
roy (mail) (www):
My math agrees with others above: A & B have equal probability of a prize, as do C & D.

So I pick A, prize on a red ball, because red is pretty. And I pick D, prize on a non-white ball, because I'm afraid I'll get called a racist if I don't.
10.11.2006 1:04am
Mike BUSL07 (mail) (www):
Roy wins the thread.
10.11.2006 1:07am
Mike Keenan:
Mathematically, it doesn't seem to matter.

But, I would have a mild preference for A and a strong preference for C. I like the "sure thing" in A. And I don't like the possibility of 2 whites in D. Silly reasoning of course since it doesn't matter.
10.11.2006 1:09am
Thanks for keeping me up. I was just about to go to bed, but I can't resist these things.

I'll assume that there's an equal chance of getting white balls and black balls in my lottery. Even if there is no guarantee that the white balls and black balls will be drawn randomly from a set containing an equal number of both, I figure there is still an even chance of white and black balls in my lottery so long as the relative percentages in the set from which they're drawn are left completely to chance.

I can't see a difference between lotteries A and B. If I take Lottery A, I have a 1/3 chance of winning. As for Lottery B, I have a 1/3 chance of winning if there is one of each, or a 2/3 chance of winning if there are two whites. Assuming the chances of two whites, two blacks, or one of each is equal, my chances of getting each combination are as follows: (a) I have a 1/3 chance of two whites, which means a 1/3 chance of a 2/3 chance of winning [1/3 chance of combination x 2/3 chance of winning = 2/9 chance of getting this combination and winning]; (b) a 1/3 chance of two blacks, which means a 1/3 chance of a 0/3 chance of winning [0/9]; and (c) a 1/3 chance of getting one of each, which combination yields a 1/3 chance of winning [1/9]. Combined [2/9 + 0/9 + 1/9], that's a 1/3 chance of winning Lottery B - same as Lottery A.

As for lotteries C and D, I again don't see a difference. With Lottery C, I clearly have a 2/3 chance of winning. In Lottery D, I have a 1/3 chance of two blacks [1/3 chance of a 3/3 chance = total 1/3 chance of that combination and winning], a 1/3 chance of two whites [1/3 chance of 1/3 chance = 1/9 chance], and a 1/3 chance of a white and a black [1/3 chance of a 2/3 chance = 2/9 chance of winning]. That's also a total 2/3 chance of winning for Lottery D.

Lottery A: 1/3 chance
Lottery B: 1/3 chance
Lottery C: 2/3 chance
Lottery D: 2/3 chance

If they're all the same as I have concluded, I'd go with A and D for my own psychological comfort, because I know there is a red ball in there and definitely have a chance in both lotteries A and D regardless of the white/black combination.

I probably completely screwed this up, so I'll be interested to see the explanation.
10.11.2006 1:09am
I was an English major, so I would go for A because red is my favorite color.
10.11.2006 1:10am
Sasha Volokh (mail) (www):
I've posted an update.
10.11.2006 1:12am
In the time it took me to type my post, others responded and it appears most of us are basically in agreement. I took a look at Wikipedia, and I don't think the story has been changed substantially.

I don't think we are supposed to assume that the game is run by a person with any motivation.
10.11.2006 1:17am
Z Jones: "The odds for Lottery A are 1/3, and the odds of Lottery B are ALSO 1/3."

Not true. The odds of B are either 0, 1/3, or 2/3. In the absence of further information, you do not know which.

Now, it is true that ITAOFI the average of these odds is 1/3, but there could exist further info that would skew my perception, e.g.:

- If I know that the house is a stingy jerk, then I might reason that the odds of winning B are zero.
- If I know that he's fair and disinterested, then I might reason that the odds of winning B are 1/3.
- If I know that he's a generous pal, then I might reason that the odds of winning B are 2/3.
10.11.2006 1:19am
roy (mail) (www):
Re: the probability of black vs. white, the fact that we don't know their distribution is pretty much equivalent to them having equal probability. Unless we're talking quantum mechanics, probability is just calculated ignorance.
10.11.2006 1:26am
plunge (mail):
Thinking that "it could be either A or B" equals "the chance is 50/50 of it being one or the other" is a pretty basic mistake. If you don't know what determines the odds, or if you can't statistically test the odds, then you can't determine the odds. This doesn't have any direct bearing on this issue though, because the point of it is that the unknown odds are the same between the two gambles, not any debate as to what the odds might be.
10.11.2006 1:53am
Alex R:
There are a few interesting things about this puzzle that haven't seemed to come up yet...

(1) When the first choice of lotteries is offered, we don't know yet that there will be a second one. So we might say that the "house" could have stacked the pot with black balls -- better to take lottery (A). But when the second choice of lotteries is offered, we realize that if the house had stacked the pot with black balls, we would have a sure win now. So if the house were stacking the pot for the least payoff for the two choices, it probably should stack the pot with *white* balls, and hope we take lottery (A) on the first option!

(2) I haven't seen anyone mention that on the first draw, you *get to see a ball*. If the ball you draw is black, then lottery (D) will have odds of either 2/3 or 3/3, which is guaranteed to be as good as or better than (C). If you draw a white ball on the first draw, then lottery (D) will have odds of 1/3 or 2/3 -- so (C) is sure to be at least as good, if not better. (Drawing a red ball on the first draw doesn't help you at all in deciding what to do on the next lottery.)
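Alex R's point (2) can be checked by enumeration. A minimal Python sketch that assumes nothing about the probabilities; it just lists which compositions of the two non-red balls remain consistent with the colour of the first drawn ball, and what odds lottery D would then have:

```python
# Seeing the first drawn ball narrows down lottery D's possible odds.
# Enumerate the compositions of the two non-red balls consistent with
# the observed colour (no probabilities assumed, just consistency).
compositions = [("W", "W"), ("W", "B"), ("B", "W"), ("B", "B")]

def d_odds(others):
    """Numerator (out of 3) of lottery D's odds: count of non-white balls."""
    return sum(b != "W" for b in ("R",) + others)

for seen in ("B", "W"):
    consistent = sorted({d_odds(o) for o in compositions if seen in o})
    print(seen, [f"{n}/3" for n in consistent])
# prints:
# B ['2/3', '3/3']
# W ['1/3', '2/3']
```

So after a black draw D is at least as good as C's fixed 2/3, and after a white draw it is at most as good, exactly as the comment argues.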
10.11.2006 2:20am
Scote (mail):
The answer is null set. The question is incomplete. At no point are the buy in or pay out for the scenarios stated so no choice is any better or worse than any other. The odds of winning are irrelevant without knowing how the odds relate to the monetary proposition.
10.11.2006 2:32am
Mike BUSL07 (mail) (www):
Scote, that's not exactly true. We are told from the get go that this is an Ellsberg paradox. That means that under all circumstances, the value of playing each lottery is the same. So the buy-in is the same for each set, and the prize is the same too. But yeah, you apparently know more about math than me.
10.11.2006 3:03am
If I were a member of Planned Parenthood, I would pick D and C, but I am not.

I would choose B and D because I am a masochist, but I don't want to be accused of being a commie, so I pick B and C.
10.11.2006 3:34am
Avatar (mail):
People tend to be risk-averse in situations where they don't have complete information because that's a survival trait in society. There are an AWFUL lot of people who make a living off other people's inability to calculate odds, and more who make a living off people's trust that they've been told the truth when in fact they've been lied to. When you hear something that sounds like a good offer, but without adequate explanation, you tend to distrust it - especially if it's of the type "give me something now and I will give you something later".

The rational person thus hears the question and -assumes- that the incomplete information is of the type that would screw him... i.e. the lotto operator can claim that he never SAID the white and black balls were present in equal proportions, or indeed that both would be present at all, so he was playing fair even though he misrepresented your chances of winning. Even if this isn't the case, if we weren't talking about a pure logic problem, it would be imperative to assume that some sort of misrepresentation was in fact occurring.

That's one reason that legal language is the way it is - if something is spelled out exactly, in terms both sides agree on, nobody comes back later and says "oh, we have all your money and we never actually said we'd do anything for you, now I'm going on vacation."

So thanks, I'll stick with my cynicism as such things are concerned. Safer!
10.11.2006 3:45am
plunge (mail):
Scote, the payoff and buy-in don't matter. It doesn't affect the question at all. You must choose one or the other, and you want to win, but aside from that, how much makes no difference.
10.11.2006 4:20am
Tennessean (mail):
Anyone claiming that we don't know the odds of certain events needs to read Roy's comment again.
10.11.2006 4:43am
plunge (mail):
Roy is wrong. Not knowing if A or B is true is NOT the same thing as knowing that they are equally likely. It just means that we don't know the probability. A binary proposition is not an automatic 50/50 chance.

You cannot derive knowledge from a lack of knowledge.

The actual odds of the balls are as irrelevant to the question of the paradox as the payoff is.
10.11.2006 4:52am
Glen Whitman (mail) (www):
You should prefer A to B <i>if and only if</i> you prefer D to C. This can be proven for any assumption you make about how the 2nd and 3rd balls are chosen.


A. P(R) = 1/3
B. P(W) = ?
C. P(~R) = 2/3
D. P(~W) = ?

Assume you think A gives you better odds than B.
Then you must think 1/3 > P(W).
Therefore P(~W) > 2/3.
Therefore P(~W) > P(~R).
Therefore D gives better odds than C.

Reverse the signs for the rest of the proof.

I don't think there's any logical paradox in the problem itself. The paradox (if you want to call it that) only emerges if there are people who choose inconsistently — e.g., choosing A and C. Even that's not really inconsistent if they assumed that P(W) = 1/3, which is neither supported nor contradicted by the instructions.
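Whitman's iff can be spot-checked numerically. A small sketch; the grid of beliefs k/12 is arbitrary, chosen just to cover the feasible range 0 through 2/3 for P(W):

```python
from fractions import Fraction

# Whitman's proof, checked numerically: you prefer A to B exactly when
# you prefer D to C, whatever belief P(W) you hold about drawing white.
pR = Fraction(1, 3)            # P(red) is fixed by the setup
for k in range(0, 9):
    pW = Fraction(k, 12)       # a candidate belief about drawing white
    prefers_A_to_B = pR > pW
    prefers_D_to_C = (1 - pW) > (1 - pR)   # P(~W) > P(~R)
    assert prefers_A_to_B == prefers_D_to_C
print("A>B iff D>C for every belief checked")
```

The assertion never fires because 1 - pW > 1 - pR rearranges to pR > pW; the check just makes the algebra concrete.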
10.11.2006 5:06am
Siona Sthrunch (mail):
I disagree with Whitman's analysis for the following reason. The flaw he makes, and the flaw most posters make, is to assume that the chooser's prior distribution of the probabilities that the other two balls are black or white is independent of the fact that an offer was made. However, the fact that the offeror actually made the second offer can change the chooser's belief as to the likelihood of a particular ball color distribution. Therefore, in the poster's presentation, there is no paradox.

Instead, the paradox should be presented in some way that fixes the prior distributions. For example, lottery options 1 and 2 are each presented to the chooser, and a coin flip decides which option is actually offered.
10.11.2006 5:49am
plunge (mail):
If you think that P(W) = 1/3 then you should be indifferent to all the choices rather than preferring any, no? Of course, the question gives you no reason to assume that regardless, and I think the actual number of people who are indifferent in practice is probably small.
10.11.2006 5:50am
A. Zarkov (mail):
"If I had them I'd be King."
10.11.2006 5:51am
plunge (mail):
Siona, the motives of the person offering the choices aren't offered as part of the description. You might as easily assume that they want you to win as not; you can't be sure what they want, and therefore how they would rig the distribution for or against you based on any offered option.
10.11.2006 5:54am
John Jenkins (mail):
I don't think some of these sets of assumptions are correct. You know there is one red ball, but you don't know whether there are 0, 1, or 2 white or black balls.

Option A always gives you a 1:3 chance to win.
Option B could give you a 0:3 (no white); 1:3 (one white); or 2:3 chance of winning (2 white). Assuming, ex ante, that there is an equal opportunity for 0, 1 or 2 white balls, drawing one ball will give you a chance to win 3 of 9 times, or 1/3. I don't think there's a difference between A or B, so the coin flip is as good as any.

Option C always gives you a 2:3 chance to win.
Option D gives you a 1:3 chance if there are 2 white balls, a 2:3 chance if there is one white ball, and a 3:3 chance if there is no white ball present. Again, assuming ex ante there is an equal chance of all three outcomes, that's a 6:9 chance to win, which is equivalent to 2:3. You'd prefer the second set of lotteries to the first, on just a pure chance of winning basis, but I don't see a way to discern between the two.

I don't think the motivations of the house are relevant unless they're outright cheating (e.g. looking and drawing what they want to, or depositing balls after you make your selection). Anything short of that makes their motivations irrelevant because they can't change the probabilities once the balls are in the hat, as it were.

I'd guess people would be expected to choose A & C because of the concept of a sure thing (always a chance, and the same chance, to win), but I don't think it matters. Now I'll go read wikipedia to see what they have to say.
10.11.2006 7:42am
plunge (mail):
"I don't think there's a difference between A or B, so the coin flip is as good as any."

But you can't assume anything. There well could be a difference between A or B: you just don't know if there is or not, which is NOT the same thing as knowing that there is no difference. At best, the information you have suggests that you have no basis to judge the probability at all, not that you can judge it to be as likely one thing as the other.
10.11.2006 7:57am
John Jenkins (mail):
Plunge, you're just assuming there is a difference. Statistically, there isn't, unless the other side is cheating, which is not part of the problem. Arguing with the question doesn't get you anywhere.

As to being unable to judge the probability, I think I already did that; you can agree or not as you like. As a matter of pure probability, your chance of winning with A is 1 out of 3, and your chance of winning with B is an equal probability of 0, 1 and 2 out of three, which comes up to... 1 out of 3 (again, excepting cheating, which is not part of the problem as presented).
10.11.2006 9:02am
For the first choice I go with the red ball, which gives me a 1 in 3 chance of winning, regardless of the chances of a white ball. If we assume a 50-50 distribution between white and black, the white ball is still a 1 in 3 chance, but there is no guarantee of a 50-50 split. So I minimize my risk of failure by going with a known quantity.

For the second lottery I again go with the red ball: not-red gives me a 2 in 3 chance of winning. This exactly matches what would be expected of a 50-50 split between white and black and again minimizes my risk of failure.

50/50 chance per ball independently determined gives 4 equal possibilities for the three balls.

1) Red White White
2) Red White Black
3) Red Black White
4) Red Black Black

The probability of drawing a white ball, under 50-50 independent determination, is thus (.25)(.666) + (.25)(.333) + (.25)(.333) + 0 = .333

So by focusing on the red ball I maintain the same odds as a 50-50 independent split, and take the house's motives (which I always assume to be against me) out of play.
10.11.2006 9:04am
plunge (mail):
"Plunge, you're just assuming there is a difference."

No, I'm saying you can't assume there isn't. That's not the same thing.

"Statistically, there isn't, unless the other side is cheating, which is not part of the problem. Arguing with the question doesn't get you anywhere."

I'm not arguing with the question. The question is as stated. And actually, the point of the question holds completely regardless of the probability of anything anyway.

"As to being unable to judge the probability, I think I already did that, you can agree or not as you like."

Given that I think you are wrong, I will disagree, ok? :)

"As a matter of pure probability, your chance of winning with a is 1 out of 3, and your chance of winning with be is an equal probability of 0, 1 and 2 out of three, which comes up to... 1 out of 3 (again, excepting cheating, which is not part of the problem as presented)."

I don't know why you think cheating would need to be involved. The only restriction on the people setting this up is that the vat you draw the balls from has the same constitution on the first choice as it does on the second. Other than that, they can pick any combination of balls they want within the rules.

And if it is a "matter of pure probability" then let's see some pure math to prove it. How do you get from "there are three possibilities, and we don't know the probability of any of them... therefore the probability is the average of them all"??? How are you calculating something from nothing?

The answer is that you aren't. In fact, the colors of the balls aren't subject to probability at all: they are chosen beforehand. Their probability at that point is equal to 1 (certainty). The only probability comes from your chances of drawing a ball GIVEN that situation. That you don't fully know what the situation is does NOT transform the actual odds in any way. The fact that there are three possibilities likewise is not a calculation of odds at all.
10.11.2006 9:19am
plunge (mail):
Ed: "So by focusing no the red ball I maintain the same odds as a 50 50 independent split, and take the house motives (which I always assume to be against me) out of play."

What does "against you" mean though? Since the balls are the same in both choices, and the choices opposite, they can't be against you both times in the same way in both games. If there is some meaning to "against you" then you'd be able to use it to increase your odds vs. when they are not against you. It's not clear to me how they could be against you in a general sense.
10.11.2006 9:27am
Lou (mail):
I'm baffled. I did the problem, chose A and D, and then read the Wikipedia writeup, and the entire contradiction hangs on an initial assumption that I don't buy. It states that you would prefer A over B if, and only if, you value 1/3 > W. But that's not true. From an expected value standpoint I can be indifferent to A vs. B [as I am in the formulation given by Sasha] and still rationally prefer A to B because of the risk profile. C and D are the same. I'm indifferent on the EV, but the risk profile justifies a choice for D. I.e., Wikipedia's dramatic 1/3 > W vs. W > 1/3 should be 1/3 >= W and W >= 1/3: no paradox there.

Now, it is true that by selecting A and D I am selecting two options with opposing risk profiles, but that doesn't seem to be a paradox. It simply states that when the average chance of winning is small people want to avoid the risk of not being in the game at all. But when winning is likely people are willing to risk a lower (but still high) chance to win as a tradeoff for a sure thing. Now, maybe this paragraph is equivalent to a more rigorous treatment of Elsberg's paradox than the Wikipedia writeup, but paradox still seems too strong a term.
10.11.2006 9:36am
The most interesting comments are those that discuss the interplay between the first and second lotteries, namely, whether we are to assume that the same three balls are in use, and when the draw occurs.

The problem does not say that we get to "play" the first lottery before being offered the second, so I think we must assume that we do not. Nor does the problem say that we are starting over with three new balls before the second lottery, so I think it's fair to assume that we are using the same three balls. Since no lotteries have yet been played, I assume that the first time a ball is drawn it will determine the result of both lotteries.

Under those assumptions, not knowing the second lottery would be offered, I would choose A at the first offer, for the reasons stated early on. Sasha may not even have any white balls, so I would take the sure 1/3 chance.

When the second lottery is offered, I can choose C, which guarantees one win, because I will win A if red is chosen and C if red is not chosen, but I am guaranteed only one win - that is, I cannot win both A and C; the ball will be either red or not red. If I choose D, I could potentially win twice. If a red ball is chosen, I would win both A and D. If I was right about Sasha not having any white balls, then I would be guaranteed to win at least D and still have a 1/3 chance of also winning A. Of course, Sasha could have no black balls, in which case I would only have a 1/3 chance of winning, but my return would be double because I would win both A and D. Or it could be somewhere in-between if there is one black and one white ball. Under the stated terms of the problem, there is no way to tell what my odds would be.
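The guarantee in the paragraph above can be brute-forced; a minimal Python sketch under the same assumption (the commenter's reading, not stated in the problem) that a single draw settles both lotteries:

```python
# Brute-force check: A (red) plus C (not red) pays exactly one prize on
# any draw, while A plus D (not white) pays two on red, one on black,
# and zero on white, whatever the two unknown balls turn out to be.
compositions = [("W", "W"), ("W", "B"), ("B", "W"), ("B", "B")]
for others in compositions:
    for ball in ("R",) + others:
        assert (ball == "R") + (ball != "R") == 1
        assert (ball == "R") + (ball != "W") == {"R": 2, "B": 1, "W": 0}[ball]
print("A+C always pays exactly one prize")
```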

Ultimately, I like prizes, and I don't like uncertainty, so I would be satisfied with one guaranteed prize rather than be greedy and take the chance of walking away empty handed.
10.11.2006 9:53am
lucia (mail) (www):
I pick A over B. Later, I pick option C over A, B, or D. FantasiaWHT gave most of the reasoning far back, but reading plunge's comment about having no basis for judging probability, I'll elaborate.

Based on the information contained in the problem statement and available to me, I have no reason to expect that black balls predominate over white balls or vice versa. (So, given the condition of my ignorance, the probability a particular non-red ball is black is 50%. This conditional probability is the one I, the decision maker, use. Players with "insider information" should use that information. It doesn't affect my decision making.)

As everyone has pointed out, if the probabilities of black and white are equal, the expected value of the payout for case A equals that for case B. B would have a larger standard deviation, and I prefer lower standard deviations. Similar reasoning applies to picking C over D.
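A rough check of that standard-deviation point (my own sketch, assuming for simplicity a uniform prior over the three possible compositions of the two unknown balls, which is itself an assumption the problem doesn't license): A's win probability is fixed at 1/3, while B's win probability averages 1/3 but varies with the composition.

```python
from fractions import Fraction
from statistics import pstdev

# P(win) for each lottery as a function of k = number of white balls (0..2)
p_A = [Fraction(1, 3)] * 3                 # red is always 1 ball of 3
p_B = [Fraction(k, 3) for k in range(3)]   # a white ball is drawn

mean_B = sum(p_B) / 3
print(mean_B)                    # 1/3 -- same average win probability as A
print(pstdev(map(float, p_A)))   # 0.0: A's odds never vary
print(pstdev(map(float, p_B)))   # positive: B's odds depend on the composition
```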

Isn't disliking large standard deviations on the rate of return actually some sort of principle in portfolio theory?
10.11.2006 10:05am
Having now read the wikipedia entry, I think that

1. This problem has nothing to do with the Ellsberg "paradox."
2. Ellsberg was an idiot. It makes complete sense that most people would choose a 1/3 chance over an unknown 0-2/3 chance, and that those same people would choose a certain 2/3 chance over an unknown 1/3-1 chance. He assumes that people will determine in advance what they believe the odds to be, rather than (as all of the comments above show) choosing based on the knowledge that one cannot know what the odds are.
10.11.2006 10:13am
plunge (mail):
"So, given the condition of my ignorance, the probability a particular non-red ball is black is 50%."

No, it just isn't. You don't know the probability, end of story. Having two indistinguishable but opposing options tells you nothing useful about them other than if you know the probability of one, you can also know the probability of the other. You don't know the probability of either option. I think more of what you are getting at is that if you flip a coin to determine your guess of black or white, then you have a 50% chance of getting it correct. But that result is trivial: it doesn't constitute any new knowledge about the situation and so cannot help you make a choice.

I mostly agree with DJR as far as I can tell. The best devil's advocate case I can make against him is this: if you don't know what the odds are for the other balls, then why the strong preference for the red choices in both cases? If you have no solid reason to pick red over white other than that it's a less ambiguous unknown probability, why not pick white both times?

That is, if the balls are red, white, and black, then the red option is just as good as the white, and people should pick the white as often as they pick the red. If either white or black predominates, however, then whatever you stand to lose in one choice you stand to win back in the other by picking B and D. So why don't an equal number of people pick B and D, since if they are truly indifferent it doesn't really matter? The chances may be unknown, but you DO know that the balls are the same in both rounds, so you can play one choice against the other; if you still prefer the less ambiguous option, that can't be explained by utility or risk aversion alone.

I'm not sure that convinces me though. I don't know if I followed my own logic, and I'm not sure the math on the payoffs adds up. :)
10.11.2006 10:47am
Guest44 (mail):
Odds of winning

A = 1/3
B = somewhere between 0 and 2/3

C = 2/3
D = somewhere between 1/3 and 1

Can't further define B and D without knowing the white/black proportion.
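These bounds can be checked with a small enumeration (my own sketch; the variable names are illustrative) over the three possible compositions of the two unknown balls.

```python
from fractions import Fraction

odds = {"A": [], "B": [], "C": [], "D": []}
for k_white in range(3):                        # 0, 1, or 2 white balls
    odds["A"].append(Fraction(1, 3))            # draw the red ball
    odds["B"].append(Fraction(k_white, 3))      # draw a white ball
    odds["C"].append(Fraction(2, 3))            # draw a non-red ball
    odds["D"].append(Fraction(3 - k_white, 3))  # draw a non-white ball

for name, ps in odds.items():
    print(name, min(ps), max(ps))
# A: always 1/3; B: 0 to 2/3; C: always 2/3; D: 1/3 to 1
```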

I prefer A and C. I see from the Wikipedia entry that this differs from what most people choose.
10.11.2006 10:53am
plunge (mail):
Ok, I think I'm more convinced now. If you are indifferent, then you really should be indifferent, and that means being as indifferent between A/C and B/D as you are between A vs. B and C vs. D.

Why? Because the expected payout is the same in both cases. No matter what the odds are, A/C and B/D are mirror images: within each bundle, whatever probability your first-round pick has, your second-round pick has the complement. This is easier to see if we expand the question from 3 balls to the 90 balls of the Wikipedia version, with 30 red balls and an unknown mix of black and white among the remaining 60. If we choose the white options, then whatever the probability is in choice one is reversed in choice two.

I THINK I've convinced myself that that is correct now. :)
10.11.2006 10:56am
plunge (mail):
Guest: note that in wikipedia, C and D are actually reversed from how Sasha defines the problem. A and C in her telling IS the same as A and D in the wikipedia version. :)
10.11.2006 10:57am
Sasha Volokh (mail) (www):
Plunge: I'm a guy.

All: I've posted a follow-up post.
10.11.2006 11:05am
Sasha Volokh (mail) (www):
Plunge: Illustration.
10.11.2006 11:05am
plunge (mail):
Ah, and I'm very sleepy and not thinking straight. Sorry for that.
10.11.2006 11:17am
To illustrate what Plunge said above, imagine that you have landed on a planet that appears identical to earth but one hundred billion light years away, and you know nothing more about it than what you can observe (i.e., that it appears identical to earth). A stranger walks up and tells you that the planet will either explode in one hour or it won't, and then walks away. Without knowing more, are the odds 50-50 that the planet will explode?

This is no different from Sasha telling you that the two balls are each either white or black. The probability of one or the other is simply unknown to you. The stranger may have been the world's foremost planetary physicist, or he could have been insane, lying, or stupid. You simply have no basis upon which to guess, and so even from your perspective, the odds are not 50%.
10.11.2006 12:24pm
plunge (mail):
As it happens, though, it doesn't matter whether you know the odds or not. Strongly preferring A/C over B/D is irrational in any case, even by the metric of valuing certainty. The payoff of A/C is exactly as certain as the payoff of B/D!
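A small enumeration (my own sketch) makes this concrete: whichever bundle you hold, exactly one of your two tickets pays on every possible draw, for every possible composition of the balls.

```python
def wins(pair, ball):
    """Number of winning tickets in `pair` (a string of lottery letters)
    when `ball` is drawn."""
    rules = {"A": ball == "red", "B": ball == "white",
             "C": ball != "red", "D": ball != "white"}
    return sum(rules[ch] for ch in pair)

for composition in (["white", "white"], ["white", "black"], ["black", "black"]):
    for ball in ["red"] + composition:
        assert wins("AC", ball) == 1   # exactly one of A, C pays
        assert wins("BD", ball) == 1   # exactly one of B, D pays
print("both bundles pay exactly one prize on every draw")
```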
10.11.2006 12:46pm
I was fascinated to learn that the Ellsberg of the Paradox was the same as the Ellsberg of the Pentagon Papers!
10.11.2006 1:27pm
Can't find a good name:
DJR: In your example, does the stranger actually say, "Either this planet will explode in one hour or it won't"? If so, it makes no difference whether he is stupid, lying, or crazy, since that's just a tautology. He didn't say that both options have a 50% chance of occurring.
10.12.2006 3:51am