### On Hugh on Two Envelopes, by David Papineau.

On Wednesday Hugh McCormack’s excellent discussion of the two-envelope paradox laid out Sonia’s reasoning as below (if I remember it right).

(For those new to this paradox, Sonia is shown two envelopes, told one contains twice as much money as the other, and is then given one of the envelopes at random, and asked if she wants to swap it for the other. She then reasons . . . )

(1) Let x be the amount in my envelope.

(2) There’s a 50% chance that I’ll lose by swapping and a 50% chance that I’ll win.

(3) If I lose, the other envelope will give me 0.5x. If I win, the other envelope will give me 2x.

(4) The expected result of swapping will thus be 1.25x (50% × 0.5x + 50% × 2x), which is more than x.

(5) So I should swap.

But of course this is a silly conclusion. (If Jim were given the other envelope, he could reason just the same, but there couldn’t be reason for them both to swap.)

Hugh said that Sonia’s calculation must be misapplied because it implies, absurdly, that swapping can lead to an increase in the total money in the envelopes.

That is true enough, but I still hankered to know exactly where the reasoning laid out above goes astray.

I think it’s helpful (as suggested by Hugh in later correspondence) to compare Sonia with Fred. We give Fred an envelope containing a certain amount of money, and then spin a coin (or something equivalent) to determine whether we put twice or half that amount in the other.

Now Fred can do Sonia’s calculation as above (it’s 50%-50% whether swapping will win or lose, winning yields 2x, losing yields 0.5x . . . so I should swap). But note that in Fred’s case this is a GOOD conclusion. Fred should indeed swap.
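Fred’s case is easy to check numerically. Here is a minimal Monte Carlo sketch (the figures are made up for illustration: Fred’s envelope is assumed to hold 100, and a fair coin puts 200 or 50 in the other):

```python
import random

random.seed(0)
trials = 100_000
x = 100.0          # hypothetical amount in Fred's envelope
swap_total = 0.0
for _ in range(trials):
    # the coin flip happens AFTER Fred has his envelope,
    # so the other envelope really is 2x or 0.5x with equal chance
    other = 2 * x if random.random() < 0.5 else 0.5 * x
    swap_total += other

swap_avg = swap_total / trials
# swap_avg comes out near 1.25 * x = 125, which beats keeping x = 100
```

The average payoff from swapping hovers around 125, confirming that Sonia’s 1.25x calculation is the right one for Fred’s setup.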

So why exactly does Sonia get a bad answer when she does the calculation? After all, it’s equally true of her (since it was random which of the two envelopes she was given) that it’s 50%-50% that swapping will win or lose.

Here's what I would say about the flaw in Sonia's calculation.

(1) Suppose first we understand 'x' as referring fixedly to the actual amount that is in the envelope Sonia (or Fred) is now holding. Then it is NOT automatically true for Sonia (as it is for Fred) that there is a 50-50 chance that she will double or halve THAT amount. (That depends on the probabilistic pattern governing the placing of the amounts in the two envelopes initially presented to Sonia. So, for instance, if the envelope she’s given actually contains a very large amount, towards the upper end of the range of possible amounts, then it's more likely she has 'big' and will lose by swapping--and conversely if her envelope contains an amount towards the bottom end of the range of possible sums.)
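Point (1) can also be checked by simulation, under an illustrative prior I am simply stipulating here: suppose the smaller amount is uniform on [1, 100], so the pair is (a, 2a) and any amount above 100 can only be the ‘big’ envelope:

```python
import random

random.seed(1)
big = {"low": 0, "high": 0}
total = {"low": 0, "high": 0}
for _ in range(200_000):
    a = random.uniform(1, 100)                    # the smaller amount
    # Sonia is handed 'small' (a) or 'big' (2a) with equal chance
    amount, is_big = random.choice([(a, False), (2 * a, True)])
    band = "high" if amount > 100 else "low"
    total[band] += 1
    big[band] += is_big

p_low = big["low"] / total["low"]     # amounts at the low end: probably 'small'
p_high = big["high"] / total["high"]  # amounts above 100: certainly 'big'
```

With this prior, an envelope containing more than 100 is certain to be the big one (swapping must lose), while a low amount is more likely to be the small one, exactly the asymmetry described above.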

(2) What about the TRUTH that Sonia has a 50-50 chance of winning or losing? Well, that's true enough, but we can't plug those 50-50 odds into her calculation. Think of it like this. Her calculation says there are two 50-50 possibilities--she has the big envelope OR she has the small envelope. And the calculation tries to say that in the first possibility swapping will lose 0.5x and in the second swapping will gain x. But 'x' doesn't refer to the SAME number in each of these possibilities. In the first it refers to the amount in the big envelope, in the second it refers to the amount in the small envelope. No wonder this spurious calculation makes swapping seem attractive—it implicitly supposes that the sum in your envelope when you lose will be the SAME as when you win, when in truth it will be twice as big. (Note that for Fred it IS the same amount in his envelope in the two possibilities that he has 'big' and he has 'small'--that's why it is OK for him to do Sonia's calculation using the 50-50 odds.)

(There’s nothing original in the above—the literature says all this and lots more.)

David Papineau


## 1 Comment:

JONNY

The difference in reference is slippery. Let's call the term "R" (for rigid) when it refers to the actual amount of money in the first envelope chosen. So if you had picked the other envelope first, you would not have picked R. R is an amount of money that is fixed by a description, but it remains the same amount of money in hypothetical situations where the description does not fit.

Let's call the other term "N". N refers to the amount in the first envelope, whatever it is. So in the counterfactual case where you had picked the other envelope first, you would still have picked N. When you open the first envelope you learn something, namely that in the actual world R = N.

Now consider this pair of conditionals.

1. If R is the lowest amount, then I will gain R on swapping.

2. If I had picked the lowest amount first, i.e. if N is the lowest amount, then I will gain N on swapping.

With careful consideration it will be evident that the probability arguments for the 50% claim are about the second conditional: there is a 50% chance that N = the lowest amount. But the utility claim requires there to be a 50% chance that R = the lowest amount. Here's why:

Suppose R is the highest amount (and, in the actual world, N = R). Had I picked the other envelope first, I would have gained 1/2R on swapping; yet, described in terms of N, had I picked the other envelope first I would have gained N on swapping. When considering swapping we think in terms of N, but when considering keeping, in terms of R.
