Two envelope paradox solved. Jonny Blamey 22.6.2007
ALF:
Here is a well known paradox for decision theory. I offer you a choice of two envelopes. Let’s call them A and B. A contains twice as much money as B, but we don’t know how much money is in either. It is not possible to tell by looking which envelope is A and which envelope is B. You select one of the two envelopes. Let us call this envelope E1. We still don’t know whether E1 is A or B, but we can safely assume that it is equally likely to be either. So the probability that E1 is A = the probability that E1 is B = ½.
Now I offer you a choice. You can either keep E1 and take the money inside, let us call this option KEEP. Or you can exchange E1 for E2 and take the money in E2. Let us call this SWAP. Which do you choose and why? Remember, you want to end up with envelope A since it has twice as much money in it as envelope B.
BETH
Frankly I am indifferent. I therefore choose KEEP since it requires less effort. I am indifferent because I was initially equally likely to have chosen A as B, and by swapping I am just as likely to end up with A as B. We can express this by multiplying the possible payoffs by their probabilities. For KEEP we get whatever value E1 contains, which is either the higher sum A or the lower sum B. We multiply A by the probability that E1 contains A, and multiply B by the probability that E1 contains B. For SWAP we get whatever is contained in E2, so we multiply the probability that E1 contains A by B and the probability that E1 contains B by A.
PAYOFF KEEP
P(E1 = A)(A) = ½ A = B
+
P(E1 = B)(B) = ½ B = ¼ A
=
1½ B = ¾ A
PAYOFF SWAP
P(E1 = A)(B) = ½ B = ¼ A
+
P(E1 = B)(A) = ½ A = B
=
1½ B = ¾ A
In fact, probability notions aren’t even necessary since SWAP is no different from choosing E2 in the first place, and this option was open to me. SWAP is just a dithering form of KEEP.
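(For anyone who wants to check Beth's sums mechanically, here is a minimal Python sketch; fixing B = 1, and hence A = 2, is an illustrative assumption, since only the ratio matters.)

```python
# Check Beth's payoff arithmetic: average the payoff of KEEP and SWAP over
# the two equally likely ways E1 could have been assigned.
B = 1.0        # the lower sum (illustrative value; only the ratio matters)
A = 2 * B      # the higher sum is twice the lower

keep = 0.5 * A + 0.5 * B   # E1 = A half the time, E1 = B half the time
swap = 0.5 * B + 0.5 * A   # taking E2 instead simply reverses the cases

print(keep, swap)          # both print 1.5, i.e. 1½B = ¾A
```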
ALF
Ok, I agree, it seems like INDIFFERENCE is the most rational attitude to have towards the choice between SWAP and KEEP. But suppose you were allowed to open E1 before deciding whether to KEEP or SWAP. E1 contains a specific amount of money; let’s call it x. Now we can deduce that E2 must contain either 2x or ½ x.
BETH
That’s right, because if E1 were A, then E2 would contain ½ x, whereas if E1 were B then E2 would contain 2x and that exhausts the possibilities.
ALF
In fact we don’t even have to open E1 since the same is true for all values x.
BETH
I suppose so.
ALF
If I elect to KEEP, I will end up with x for certain, but if I SWAP I will end up with either 2x or ½ x. Therefore it is rational to SWAP, since I have a ½ chance of ending up with ½ x and a ½ chance of ending up with 2x, making the expected utility of SWAP 1¼x. Therefore I should SWAP, and what is more I should pay up to ¼ x to SWAP.
BETH
That’s absurd, since then you should also pay up to ¼ x to swap back on the same reasoning. It is absurd because you end up with an intransitive preference: you have reasoned that you should prefer E1 to E2 and E2 to E1. Your mistake is in assigning a single value to x. E1 contains a sum of money and you have elected to call that sum x. If we call the higher sum “A” and the lower sum “B”, then p(x = A) = ½ and p(x = B) = ½. So we can calculate the utility of KEEP as being ½ A + ½ B, which is ¾ A or 1½ B. We don’t know which x is, but we know that x is either A or B. So this gives us 2 possibilities.
1. p(E1 = A and x = A) = ½
The payoff for KEEP is (A = 2B = x)
The payoff for SWAP is (½ A = B = ½x)
2. p(E1 = B and x = B) = ½
The payoff for KEEP is (½ A = B = x)
The payoff for SWAP is (A = 2B = 2x)
So the total payoff for KEEP is:
½ (A) + ½ (½ A) = ¾ A = 1½ B
If x = A, the payoff is ¾ x; if x = B, the payoff is 1½ x.
And the total payoff for SWAP is:
½ (½ A) + ½ (A) = ¾ A = 1½ B
If x = A, the payoff is ¾ x; if x = B, the payoff is 1½ x.
As you can see, if you are clear about the value of x and the probability that x has that value, the expected utility comes out the same whether you SWAP or KEEP.
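(The same point can be enumerated in a short Python sketch; the amounts A = 2 and B = 1 are illustrative assumptions. The last line shows what the "1¼x" reasoning would predict if it were applied across both cases.)

```python
# Enumerate the two equally likely cases Beth describes. Note that "x" names
# whatever is in E1, so it stands for a different amount in each case.
A, B = 2.0, 1.0                      # higher and lower sums (illustrative)

cases = [
    {"E1": A, "E2": B},              # case 1: we happened to pick A, so x = A
    {"E1": B, "E2": A},              # case 2: we happened to pick B, so x = B
]

keep = sum(c["E1"] for c in cases) / len(cases)
swap = sum(c["E2"] for c in cases) / len(cases)
naive = sum(1.25 * c["E1"] for c in cases) / len(cases)   # the "always 1¼x" claim

print(keep, swap, naive)   # keep and swap are both 1.5; the naive figure is 1.875
```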
ALF
Very clever, but suppose I actually open E1 and count the cash inside. Suppose it comes to £12, for example. I reason that if I KEEP I get £12; whereas if I SWAP, I get either £6 or £24. Since I was just as likely to have chosen A or B, the expected utility of SWAP is £3 + £12 = £15. Here I am being completely clear that x = £12 and that the probability that x has this value is 1, since I know it to have this value having opened the envelope. Furthermore, the same reasoning applies however much money is in E1. I should always SWAP.
BETH
Your mistake here is in assuming that prob (E2 contains £6) = prob (E2 contains £24) = ½. Why do you assume that? The correct assignments are prob (E2 contains £6) = 2/3 and prob (E2 contains £24) = 1/3.
ALF
I assume that prob (E2 contains £6) = prob (E2 contains £24) = ½ because it is obvious. It is obvious to everyone who has written about this paradox and it is obvious to me. If E1 contained £12 and E1 contained the lesser sum, then E2 contains £24. If E1 contained the greater sum, then E2 contains £6. There is an equal chance that E1 contained the greater sum (A) or the lesser sum (B), so there is an equal chance that E2 contains £24 or £6.
BETH
The reasoning is seductive, but if you refer back to the utility calculations there is only a chance of getting 2x when x = B and there is only a chance of getting ½ x when x = A. So it is not clear that your reasoning is valid. However, it is difficult to explain why your reasoning is wrong, so instead I will demonstrate why the probability that E2 contains 2x is 1/3 and the probability that E2 contains ½ x is 2/3 on the assumption that E1 contains x.
Frank Ramsey developed a measure of a subject’s degree of belief that p, given indifference between the options
1. A for certain
2. B if p and C if ~p.
In these conditions the subject’s degree of belief that p is equal to
(A – C)/(B – C)
This quantity can be shown to be a probability in that it should conform to the axioms of probability calculus if the subject doesn’t want to be victim to a Dutch book.
This fits rather well with the two envelope paradox. Let proposition p1 be that E2 contains twice the sum of money in E1. Let’s call this sum of money “x”.
So the two options open to us in the envelope problem are
KEEP: x for certain
SWAP: 2x if p1 and ½ x if not p1.
A subject who is indifferent between these options has a degree of belief in p1 equal to:
(x – ½ x)/(2x – ½ x)
= 1/3
Let p2 be that E2 contains half the sum of money in E1.
KEEP: x for certain
SWAP: ½x if p2 and 2x if not p2
A subject who is indifferent between these options has a degree of belief in p2 equal to:
(x – 2x)/(½ x – 2x)
= 2/3
So a subject who is indifferent between the options KEEP and SWAP has a degree of belief 1/3 that E2 contains twice the amount in E1 and degree of belief 2/3 that E2 contains half the amount in E1.
We started off agreeing that we should be indifferent between the options KEEP and SWAP. If we should be indifferent, then our degree of belief should be what the formula says it is when we are indifferent. So our degree of belief that E2 contains twice the amount in E1 should be 1/3 and our degree of belief that E2 contains half the amount in E1 is 2/3.
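(Beth's two applications of Ramsey's formula, written out as a small Python sketch; setting x = 1 is an illustrative assumption, since only the ratios matter.)

```python
# Ramsey's measure: if you are indifferent between "A for certain" and
# "B if p, C if not-p", your degree of belief in p is (A - C) / (B - C).
def ramsey_belief(certain, if_p, if_not_p):
    return (certain - if_not_p) / (if_p - if_not_p)

x = 1.0   # illustrative amount found in E1

# p1: E2 contains twice the amount in E1 (KEEP gives x; SWAP gives 2x or ½x)
print(ramsey_belief(x, 2 * x, 0.5 * x))   # 0.333... = 1/3

# p2: E2 contains half the amount in E1 (SWAP gives ½x if p2, 2x otherwise)
print(ramsey_belief(x, 0.5 * x, 2 * x))   # 0.666... = 2/3
```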
ALF: That is absurd and I can tell you why. If the reasoning were valid then it would apply equally to E2. This would make the probability that E2 contains twice the amount in E1 = 1/3 and the probability that E1 contains twice the amount in E2 also 1/3. But this exhausts the possibilities, so the numbers should add up to 1. Worse still, the probability that E2 contains half the amount in E1 = 2/3, but so does the probability that E1 contains half the amount in E2; that means that according to your reasoning either the disjunction has a probability higher than 1, or there is a probability of at least 1/3 that both amounts are lower than the other. And the worst subjectivist crime of all: your degree of belief not only does, but should, change according to how the case is described. For you have a 2/3 degree of belief that E2 has half the amount in E1, but only a 1/3 degree of belief that E1 has twice the amount in E2. But surely this is the same state of affairs.
BETH: Go back and read your Kripke. “The amount in E1” can be a rigid or non-rigid designator, as can “the amount in E2”. Let us rigidly designate the amount in E1 as x and the amount in E2 as y. For non-rigid designation we will use E1 and E2. If we rigidly designate (or simply find out) the value of x, but we don’t know the amount in E2, then prob (E2 = 2x) = 1/3 and prob (E2 = ½ x) = 2/3. On the other hand, if we rigidly designate (or know) the value of y but don’t know the value of E1, then prob (E1 = 2y) = 1/3 and prob (E1 = ½ y) = 2/3.
To picture this, suppose E1 = £4 and E2 = £2. Let us call the sum of E2 and E1 T for total, A the greater sum, and B the lesser sum.
Possible values for E2 and T given E1 = 4 = x:
Case 1: E2 = ½x = B, E1 = x = A, T = 1½x = A + B (= £6)
Case 2: E2 = 2x = A, E1 = x = B, T = 3x = A + B (= £12)
Possible values for E1 and T given E2 = 2 = y:
Case 1: E1 = ½y = B, E2 = y = A, T = 1½y = A + B (= £3)
Case 2: E1 = 2y = A, E2 = y = B, T = 3y = A + B (= £6)
So when I say that the probability that the amount in E2 is half the amount in E1 is 2/3, I mean that the probability that E2 = 2 is 2/3. This is because “the amount in E1” rigidly designates x, or 4. So the probability that E1 = 4 is 1. But when I say that the probability that the amount in E1 is twice the amount in E2 is 1/3, I mean that the probability that E1 = 4 is 1/3. This time it is “the amount in E2” which is the rigid designator. When a term is rigidly designated, the designated amount is assumed to have probability 1.
Incidentally, don’t assume that we are talking about Bayesian or Kolmogorov conditionalization here. The prior or unconditional probabilities of E2 = 2 and E1 = 4 are either zero or undefined. Here is how Kolmogorov defines conditional probability:
P(E2 = 2 | E1 = 4)
=
P(E2 = 2 ∩ E1 = 4) / P(E1 = 4)
provided P(E1 = 4) > 0.
But I take it that we don’t know any of the unconditional probabilities.
ALF:
Perhaps we could think of the total amount of money in both envelopes as defining the logical space of probability. A probability can be expressed as a proper fraction. If we take the total amount in both envelopes to be the denominator and the amount in each envelope to be numerators we get the result that envelope A contains 2/3 of the total and envelope B contains 1/3 of the total. When we open E1 we discover that there is £x inside. What we are interested in is whether we have 1/3 of the total or 2/3 of the total. If we have 1/3 of the total then E2 contains 2x and the denominator is 3x. If we have 2/3 of the total then E2 contains ½ x and the denominator is 1½ x.
In effect we have a 1/3 share of the 2x space of probability and a 2/3 share of the ½ x space of probability.
BETH:
Yes, here’s how I look at it. When we open E1 and find a sum of money in there (call it x), we know that the total possible amount of money is 3x. Given 3x, we can make one pair of envelopes with x and 2x; or we can make 2 pairs of envelopes with x and ½ x. Therefore, given that E1 contains x, the probability that E2 contains ½ x is twice the probability that E2 contains 2x. Given that this exhausts the possibilities, the probability that E2 contains 2x is 1/3 and the probability that E2 contains ½ x is 2/3.
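(Beth's counting argument can be checked by simulation, but only once we commit to a prior over pairs, which the dialogue itself never specifies. The sketch below assumes the lower amount is a power of 2 and weights each possible pair inversely to its size, one way of modelling the "two cheap pairs for every dear pair" idea; under that assumption the conditional frequencies do come out near 1/3 and 2/3.)

```python
import random
from collections import Counter

# Assumed prior (not from the dialogue): the lower amount is 2^k, and each
# pair is weighted inversely to its size, so any given pair is twice as
# likely as the pair containing twice as much money.
K = 12
lows = [2 ** k for k in range(K + 1)]
weights = [1.0 / low for low in lows]

target = 2 ** 6      # condition on finding this amount in E1 (a mid-range value)
counts = Counter()

for _ in range(200_000):
    low = random.choices(lows, weights)[0]   # draw a pair (low, 2*low)
    pair = (low, 2 * low)
    e1 = random.choice(pair)                 # pick one envelope at random
    if e1 == target:
        e2 = pair[0] if e1 == pair[1] else pair[1]
        counts["double"] += (e2 == 2 * e1)
        counts["half"] += (e2 == e1 // 2)

total = counts["double"] + counts["half"]
print(counts["double"] / total)   # close to 1/3
print(counts["half"] / total)     # close to 2/3
```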
ALF:
So, contra Casper J. Albers, Barteld P. Kooi and Willem Schaafsma in Synthese 2005 145: 89–109, we can, and have, resolved the two envelope problem. To sum up: indifference between KEEP and SWAP is the rational attitude. The correct degree of belief to have that the second envelope contains twice the amount contained in the first envelope is 1/3, and the correct degree of belief that the second envelope contains half the amount in the first envelope is 2/3. Therefore the expected utilities of SWAP and KEEP are the same. An explanation for the counterintuitive probability assignments is that 3x = 2(1½ x), so the lower pair of envelopes is twice as likely as the higher pair. Perhaps in slogan form: for every pair of socks there are two odd socks.
BETH That’ll explain why you wear odd socks half the time then Alf, or should that be 2/3s of the time!
Labels: epistemology, probability, two envelope paradox, Ramsey, Kripke, rigid designators
109 Comments:
"
Incidentally, don’t assume that we are talking about Bayesian or Kolmogorov conditionalization here. The prior or unconditional probabilities of E2 = 2 and E1 = 4 are either zero or undefined."
??????
If you're not using Kolmogorov conditionalization then you're violating the three fundamental axioms of probability: Kolmogorov's axioms!
Therefore your probability distribution is invalid!
geez, for a minute there i was excited.
what a let down. (;-/
I did promise that the solution involves a new way of looking at probability. The problem with the two envelope paradox is that the prior probability that either envelope contains any particular amount is 0. So using Kolmogorov's conditionalization makes any conditional probability undefined. But Kolmogorov's axioms are constraints, so the fact that the axioms leave something undefined shouldn't be thought of as ruling out the possibility that we can derive a conditional probability through some other means.
What I am claiming here is that the rational degree of belief to have that E2 contains 2x given that E1 contains x is 1/3.
As a conditional this can be written:
P(E2 = 2x | E1 = x) = 1/3
Kolmogorov's conditionalization just identifies this with
P(E2 = 2x | E1 = x)
=
P(E2 = 2x ∩ E1 = x) / P(E1 = x)
so long as P(E1 = x) > 0.
But in the two envelope case the prior probability of P(E1 = x) is 0, or we just don't know what it is. The same is true of the unconditional probability that E2 contains 2x and E1 contains x. This probability is smaller than P(E1 = x), but that probability is already 0 or just simply unknown.
So there is no conflict with Kolmogorov, just no help from him. This shouldn't be surprising. Inductive logic is in its infancy, whereas making rational decisions under uncertainty is as old as humanity. So to say that there is nothing to be said about reasoning under uncertainty outside of Kolmogorov's axioms is not only narrow-minded but false. Most of our judgements concerning probabilities are made on the basis of evidence where we have no conception of what an unconditional or prior probability could be. Suppose that 90% of observed Snarks have brollos. What is the probability, conditional on this evidence, that the next Snark has a brollo? Well, what is the unconditional or prior probability that 90% of observed Snarks have brollos? Absolutely no idea. Does this mean we can't assume with a degree of belief of 0.9 that the next Snark will have a brollo? No. It is the most rational degree of belief to have. But this piece of perfectly reasonable thinking, commonplace in science and everyday thought, has nothing to do with Bayesian or Kolmogorov conditionalisation.
Jonny
Ok, let’s go through the argument in full for the benefit of other readers who may not be entirely familiar with the original problem first and then we’ll re-dive into the problem and the proposed solutions offered by others [referred to as the traditional views], then Jonny, then finally my own view on the matter.
THE PROBLEM:
You are given a choice between two envelopes, labelled ‘A’ and ‘B’.
You are told that each envelope contains some money and that one envelope contains twice as much money as the other.
Stated more formally, we have two envelopes, ‘A’ and ‘B’.
Each envelope contains ‘x’ amount of money in, and one envelope contains twice the amount of money as the other one.
E.g. If envelope ‘A’ is the one that contains twice the amount of money as envelope ‘B’ then the distribution of money is as follows:
‘A’ contains x and ‘B’ contains 1/x.
If envelope ‘B’ contains twice the amount of money as ‘A’ then the distribution of money is:
‘A’ contains 1/x and ‘B’ contains x.
So far so good.
Now, you don’t know which envelope [‘A’ or ‘B’] contains the higher amount and which has the lower amount.
So, you choose one envelope at random, but are given the opportunity to switch to the other before making your final choice and find out which envelope contained the larger quantity of money [viz., 1/x or x].
According to standard Decision Theory, it is more rational to switch envelopes after making your initial choice as follows:
Let x be the sum quantity of money in your envelope.
This quantity of money will be either 1/x or x, and ipso facto, the quantity in the other is either 1/x or x, and these possibilities are equally likely. So the expected utility of switching is 1/2(1/2x) + 1/2(2x) = 1.25x, whereas that for sticking is only x. So it is rationally preferable to switch.
Now we reach the problem:
If you decided to switch envelopes on the basis of the reasoning above, then the same argument could immediately be given for switching back, and so on, ad nauseam. Moreover, there is a parallel argument for the rational preferability of sticking, in terms of the quantity y in the other envelope. But the problem is to provide an adequate account of how the argument goes wrong.
This is the crux of the two-envelope paradox.
Now, here is my reply.
Looking at this from a ‘man on the street’ perspective, everything seems relatively clear:
We are faced with two envelopes, ‘A’ and ‘B’.
We must choose one of them.
The logical probability of choosing either one is 0.5, as there are two possible outcomes: we either choose one or the other.
Now, the amount of money contained in the envelope is not known to us, so our epistemic probability weightings are not affected, as we are ignorant of which envelope, ‘A’ or ‘B’, contains twice the amount of the other.
It is at this point that we need to make a distinction between the expected utility of choosing one envelope over another, and the probability of choosing one envelope rather than another.
The expected utility of choosing one over the other is indeed skewed, as the decision matrices demonstrate.
But as we do not know which envelope will maximize our utility [viz., by choosing the envelope with the greater amount of money in], we cannot apply our utility to our choice of the envelope until it has been opened!
So, we are equally rational in choosing envelope ‘A’ rather than envelope ‘B’, and vice versa, viz., we are also equally rational in choosing envelope ‘B’ over envelope ‘A’, for the simple reason that each outcome is equiprobable due to our epistemic probabilities being unknown [i.e. we don’t know which envelope contains the larger quantity of money].
This fits in with a Bayesian analysis of the situation: our prior probabilities are 0.5 for each: P(A)=0.5 and P(B)=0.5.
Our posterior probabilities will be different after we have opened the envelope and revealed which one contained the larger sum of money.
So, to summarize:
The choice between envelope ‘A’ and ‘B’ is equiprobable given our ignorance regarding which envelope contains the greater sum of money.
This equiprobable choice between ‘A’ and ‘B’ makes us equally rational in choosing either envelope.
The expected utility is zero and our Bayesian priors are 0.5 for A and B respectively, until we have opened the envelope.
Only then do we have knowledge of which envelope contained the greater amount of money…
mmm, i disagree with your comments re. kolmogorov and bayes.
firstly, re. your comment on bayesianism:
the prior probabilities are not zero, they are 0.5 each!
For envelopes 'A' and 'B' respectively we have:
Pr(A) = 0.5 and the Pr(B)=0.5
Bayesianism starts with the logical probabilities as the priors and then updates them in the posterior probabilities in lieu of new information!
Secondly, Kolmogorov's axioms are not constraints! they are there to formalize probability theory and ensure that no contradictions arise, such as in this case!
e.g to ensure that the sum of probabilities over a given set does not exceed 1.0 or less than 0.0 !
Deary me.
You write:
"So to say that there is nothing to be said about reasoning under uncertainty outside of Kolmogorov's axioms is not only narrow minded but false. Most of our judgements concerning probabilities are made on the basis of evidence where we have no conception of what an unconditional or prior probability could be."
This is fine for everyday judgements, but not here!
Why?
Because we're formalizing the probabilities and possible outcomes with probability weightings and expected utility!
You cannot simply make things up thinking they're right without a rigorous formalized axiomatic system to back them up!
Your comment about the Snark and the brollo demonstrates this:
IF there is no evidence for something or it is undefined then you are simply not justified or in any way legitimated in assigning a probability to it!
If you do this then you are merely assigning a subjective probability to it, and this simply isn't what decision or game theory is about at all!
First, the version of the problem I am addressing is where one envelope contains twice the sum of the other. I am calling the lower amount "B" and the higher amount "A". So if A = x then B = 1/2x; whereas if B = x then A = 2x. (One of the comments confused me by saying if A contains x then B contains 1/x; which is just a different problem, but I think this may have been a typo).
The prior probability that the envelope you have chosen contains A (the higher amount) is 0.5 as you say. But A doesn't refer to a number, it refers to a ratio with the amount in the other envelope. "X" refers to a number, and the probability that the envelope you chose first contains x is 0 before you open the envelope because the amount could be anything in the infinite number series. For example:
Question 1: before you open the first envelope what is the probability that it contains the larger amount? (twice the amount as the second envelope)
Answer: ½.
QU 2: What is the probability that it contains £10?
Answer 0.
QU 3: What is the probability that the first envelope contains twice the amount in the second envelope?
Answer: That depends on whether “the amount in the second envelope” is a rigid designator. If you mean by twice "the amount in the second envelope" the amount, like £5, that is actually in the second envelope, multiplied by 2, then the answer is 0. If you mean simply twice the amount of the other envelope, whatever it is, then the answer is ½.
Secondly you say that subjective probability is nothing to do with decision theory or game theory. I think this is a misunderstanding. Decision theory tells us what to do given that we have particular degrees of belief, which we can call subjective probability assignments. If we have a degree of belief 1/2 that E2 contains 2x and a degree of belief 1/2 that E2 contains 1/2 x given that E1 contains x, then we should swap. But we shouldn't swap, so these subjective probability assignments are wrong. It should be 1/3 and 2/3s for the reasons I have given. These have to be subjective probabilities; but not any old ones, they have to be the right ones. Since we both appear to agree that indifference is the correct attitude, then the right (correct, true, objective) probability assignments are 1/3 to 2x and 2/3 to ½ x.
Nothing you have said touches this argument.
First, the version of the problem I am addressing is where one envelope contains twice the sum of the other. I am calling the lower amount "B" and the higher amount "A". So if A = x then B = 1/2x; whereas if B = x then A = 2x. (One of the comments confused me by saying if A contains x then B contains 1/x; which is just a different problem, but I think this may have been a typo).
The prior probability that the envelope you have chosen contains A (the higher amount) is 0.5 as you say. But A doesn't refer to a number, it refers to a ratio with the amount in the other envelope. "X" refers to a number, and the probability that the envelope you chose first contains x is 0 before you open the envelope because the amount could be anything in the infinite number series.
This is incorrect:
1) the probability values for envelopes A and B, Pr(A)=0.5 and Pr(B)=0.5, are simply the likelihood of the agent selecting the envelope with the larger quantity of money [viz., twice the amount of the other envelope].
As there are two envelopes, there is a 50/50 [0.5] chance of selecting the larger one.
2) My reference to the values being 1/x and x is identical to yours: any infant will see that x and 2x is mathematically identical to x and 1/x, where x is twice the amount of 1/x.
Bayesianism is about rationality: if we have no knowledge pertaining to which envelope is more or less likely to contain twice the sum of money as the other one, it follows that we are equally rational in selecting one envelope rather than another.
The Bayesian priors are therefore 0.5 and 0.5.
3) The amount, as you put it, could not be anything in the infinite number series [viz., any rational number], for the simple reason that, as you have already formulated the problem, there exist only one of two possible values: x or 2x!
[N.b. The exclamation mark is for exclamative purposes and does not represent a factorial].
For example:
Question 1: before you open the first envelope what is the probability that it contains the larger amount? (twice the amount as the second envelope)
Answer: ½.
QU 2: What is the probability that it contains £10?
Answer 0.
QU 3: What is the probability that the first envelope contains twice the amount in the second envelope?
Answer: That depends on whether “the amount in the second envelope” is a rigid designator. If you mean by twice "the amount in the second envelope" the amount, like £5, that is actually in the second envelope, multiplied by 2, then the answer is 0. If you mean simply twice the amount of the other envelope, whatever it is, then the answer is ½.
Again, this is false.
Q1 above, as you have formulated it, is valid for reasons given at the beginning of this post.
Re. Q2: This is irrelevant. The envelope problem is formulated algebraically; it does not refer explicitly to concrete values, as it is merely concerned, as you’ve formulated it, with one envelope containing ‘x’ amount of money and the other envelope containing twice that amount, viz., ‘2x’.
So your comment about £10 is irrelevant unless you state that the £10 envelope is the larger one, in which case the other envelope will contain £5 [£10/2 = £5 = 2x/2 = x], and if the £10 envelope is the smaller envelope it follows that the other envelope contains [2*£10 = £20 = 2x].
The notion of a rigid designator is also a highly odd term to use. This is a decision theory problem, familiar to many economists, game theorists, Bayesians and Decision Theorists. The term ‘rigid designator’ is not one that has been used by any of them, for the simple reason that it is irrelevant: we have already fixed the ratio, full stop.
Secondly you say that subjective probability is nothing to do with decision theory or game theory. I think this is a misunderstanding. Decision theory tells us what to do given that we have particular degrees of belief, which we can call subjective probability assignments. If we have a degree of belief 1/2 that E2 contains 2x and a degree of belief 1/2 that E2 contains 1/2 x given that E1 contains x, then we should swap. But we shouldn't swap, so these subjective probability assignments are wrong. It should be 1/3 and 2/3s for the reasons I have given. These have to be subjective probabilities; but not any old ones, they have to be the right ones. Since we both appear to agree that indifference is the correct attitude, then the right (correct, true, objective) probability assignments are 1/3 to 2x and 2/3 to ½ x.
Nothing you have said touches this argument.
Again, this rests upon a misunderstanding of decision theory and Bayesianism.
Bayesianism and decision theory, you are correct, are indeed concerned with the agent's degree of rational belief.
However, one's legitimacy in believing something is intrinsically linked with evidential support.
With Bayes Theorem, we have our prior probabilities, which are updated in lieu of new evidence/information.
Initially, we are faced with two envelopes; although the subjective utility of selecting the envelope with the larger amount is greater than that of selecting the envelope with the smaller amount, we are not in possession of any information to update our prior probabilities into posterior probabilities, for we have no knowledge pertaining to which envelope contains the greater amount.
As such, we only have two possible choices: envelope A or envelope B.
Nothing you have said touches this argument.
You are quite right, I have not said anything to touch your argument, for the simple reason that I can’t see any argument: just misunderstanding on the nature of probability theory and Bayesianism.
Please relax a bit and slow down.
1. The ratio 2x:x is only the same as the ratio x:1/x if x = √2.
A simple thought experiment will prove this. If I open the first envelope and it contains £10, the second envelope will not contain £1/10. If it contains £5 then one envelope contains x and the other 2x, but one envelope contains x and the other does not contain 1/x.
2. I agree, I agree, I said at the start and I have said in every comment that the probability that the envelope you select first (what I have been calling E1) contains the higher amount = 0.5
In fact, this is the first step in my argument. It forms the basis of the important premise that indifference between SWAP and KEEP is the right attitude.
3. What I am ALSO stating, which is a different point, and should be equally undeniable, is that the probability that the first envelope contains x where x stands for any value in the infinite number series is 0. I don't see that you have disagreed with this for any reason.
4. Because of this, there is no prior probability on which to update with the new information. So Bayes theorem is of no use. This could be why there are no successful Bayesian solutions to the paradox. To point out that my solution isn't Bayesian is irrelevant.
5. If you haven't seen the argument, then I'm sorry, I tried my hardest to explain it in the simplest terms I could. Just for you here is a shorter version.
i) Indifference is the best attitude to have between KEEP and SWAP
ii) Hypothesis E1 contains £X.
iii) An agent who was indifferent between KEEP and SWAP would have degree of belief 1/3 that E2 contains £2X and 2/3 that E2 contains £0.5x. (using RAMSEY 1926)
iv) Since we should be indifferent, then we should have these degrees of belief.
v) So the mistaken premise in the argument that leads to the paradox is the premise that the probability that E2 contains £2x is 1/2 and the probability that E2 contains £0.5x is 1/2. This is NOT the same as the premise that the probability that E2 contains the higher amount is equal to the probability that E2 contains the lower amount before the amount is specified. This is because "x" is being used to refer to the amount of money in the first envelope, not as a free variable expressing an aspect of the ratio between the sums in the two envelopes.
I'll explain using a hypothetical case.
Lets suppose E1 contains £4 and E2 contains £2.
A subject picks one envelope and calls the amount in that envelope "x"; he then reasons that the other envelope contains 2x or ½x. If the other envelope contains 2x, then x = £2; if the other envelope contains ½x, then x = £4. So although it is true that the other envelope is equally likely to contain 2x or ½x, "x" does not pick out the same amount in the two cases. We can see now how the reasoning is flawed and have no further reason to hold the hypothesis that p(E2 = 2x) = p(E2 = ½x) = 1/2.
The paradox is solved.
Accept, or show where you disagree, or at least stop insulting me.
I've already stated where the fallacies are but without success.
Perhaps it's time you submitted your 'proof' to a reputable journal and await the referee's response and post that on the blog.
;)
This is interesting.
Someone else is using the html editing features of the blog.
I would like to point out that the previous anonymous posts aren't by me, 'R'.
Anon's posts below are rather rude and violate an axiom I've expressed in a previous post:
Attack propositions, not people!
If you cannot say something politely then you should say nothing.
Mmm, I preferred it when I was the only one using HTML editing on the blog: it gave my posts a unique recognizable appearance that distinguished them from other peoples.
Anon: could you at least use a pseudonym name/initial to demarcate yourself from my posts....?
Surely a little letter wouldn't be too much trouble...?
P.s. For the record, I also disagree with Jonny's attempted proof, but for slightly different reasons to 'Anon'.
I feel that Jonny hasn't helped himself in fully conveying his argument due to the length of his original post [said Mr Pot to Mrs Kettle] (;-).
Perhaps a succinct bulletpoint form outline of your argument would be beneficial to all parties concerned....?
P.s. I do agree that perhaps it is time the article, if you feel it is sufficiently polished, both mathematically and stylistically,
should be submitted for publication...?
I think the BJPS would be ideal for it.
R.
Let's hear your reasons then R, this is getting interesting.
H
Mmm,
The real problem here isn't so much a question of whether or not Jonny's calculations are flawed, it's due to something far more fundamental:
1) the interpretation of the probability values for each choice.
2) the interpretation of the utility function for each choice.
3) the relation between the probability values and the utility values [i.e. the relation between the utility function and the probability function].
This isn't a new problem: Bayesians in particular have been hounded by problems concerning the interpretation of the probabilities for as long as Thomas Bayes's theorem has existed, since it first appeared in the Philosophical Transactions of the Royal Society.
The key word here is objectivity.
Objectivity is a matter of degree, and while subjectivity may infect some problems, objective Bayesianism yields a high degree of objectivity. We have been focusing on what we might call epistemic objectivity, the extent to which an agent’s degrees of belief are determined by her background knowledge. In applications of probability a high degree of epistemic objectivity is an important desideratum: disagreements as to probabilities can be attributed to differences in background knowledge; by agreeing on background knowledge, consensus can be reached on probabilities.
I am of course referring to the infamous "Problem of the Priors": Bayes theorem enables us to compute/update our prior probabilities in lieu of new evidence to form posterior probabilities: the more evidence you have, the greater the update in your posterior probabilities.
But how does one first decide on assigning a specific probability value to the initial prior probabilities?
Some Bayesians believe that one may choose any value, so long as it is chosen in accord with the Kolmogorov axioms, e.g. the value chosen lies in the real interval [0,1]; with increasing information acquired, the probabilities will gradually converge to a specific value.
But this view of explaining the priors presupposes a frequentist view of Probability, which is not without its problems.
Then there's the propensity theorists, etc, etc, which aren't really relevant at all to this post,but I feel the need to mention them, if only for the sake of thoroughness in addressing the more general problematic framework that probability and decision theory is shrouded in.
In the case of the envelope, this is a one-off event, so strictly speaking, Bayesianism is useless here: utterly useless.
So surely it follows that the only feasible recourse is to the safety of logical probability: there are two choices, envelope A or envelope B. So we have Pr(A) = 0.5 and Pr(B) = 0.5.
So far so good.
HOWEVER, then we reach the problem mentioned near the beginning of this post: the utility value for each envelope and the relation of the utility function to the probability function.
THIS is where it gets messy, very messy.
I confess that I am uncertain as to what the correct answer is at this point.
Admittedly this is probably due to the fact that it is 2:30am and I am tired. So watch this space... (;-)
R.
Thanks R, that's great. I want to make it clear that I am interested in the objective epistemic probability of events. This is the most rational degree of belief to have given a specified body of knowledge.
Also, it is as you say a one off case so we can only have a frequentist approach if we talk about counterfactuals and possibilities.
I agree about priors too. For any particular x such that either envelope contains x, the prior probability must be 0.
Lets name the envelopes E1 and E2.
Let us say E1 contains £x and E2 contains £y. We can also call the higher amount A and the lower amount B.
Now we have 4 epistemic states (ES)that are relevant:
ES 1. We know neither values x or y.
Prob (x = A) = prob (y = A) = 0.5.
Prob (A = B) = prob (x = y) = 0.
What ES 1 does not tell us is that prob (x = 2y) = prob (x = y/2) = 0.5. This is the mistaken assumption that drives the paradox.
ES 2. We know the value of x.
(having opened the first envelope)
prob (1/2x = y) = 2/3
prob (2x = y) = 1/3
ES 3. We know the value of y (but not x)
prob (x = 1/2y) = 2/3
prob (x = 2y) = 1/3
ES 4. We know the contents of both envelopes and all probability assignments are 1 or 0. Hypothetically X = A, y =B. X = £10, y = £5.
The paradox arises because the new probabilities in ES2 and ES3 seem to violate probability axioms. This is because it now seems that if you open E1 x becomes more likely to be A, but if you open E2 y seems more likely to be A, whereas the act of opening the envelope doesn't affect the likeliness of any envelope being A. It is still 0.5.
The difficulty can be seen if we imagine a 2 player game where player S2 gets E1 and player S3 gets E2.
According to my probability assignment, both S2 and S3 should be prepared to bet at odds up to 2/3 that their envelope contains the higher amount.
Here is where my radical addition to decision theory comes into play. Evidence only licenses bets with stakes up to the cost of the information. Different probability assignments at different stake sizes are permissible. The first 50/50 probability assignment was free, so a rational person shouldn't risk anything on either x = A or y = A. Player S2 got to be in ES2 at a cost of £10 (it must have cost somebody x; money doesn't grow on trees), so he is licensed to bet a stake of £10 that £10 = 2y at 2/3. Player S3 is only licensed to stake £5. (The word "only" here is only licensed from ES4.) Also she is betting that £5 = 2x, at 2/3, which is a different proposition. Take the proposition (x = £10 and y = £5). From ES4 this is true and known. From ES2 you'd consider a £10 bet at 2/3 fair. From ES3 you'd consider a £5 bet at 1/3 to be fair. THIS IS THE SAME BET!
Popper accuses Bayesianism of favouring the more probable hypothesis over the more informative one. From ES2
H1: p(E1 contains twice the amount as E2)= 1/2, is both less probable, AND less informative than:
H2: p(E1 contains £10 and E2 contains £5) = 2/3.
Best stop here.
I'll continue from where my last post abruptly stopped due to sleep deprivation:
We've already discussed the problem of interpreting the probability values, and concluded that (1) Bayesianism is useless here due to the simple fact that the envelope choice is a one-off, so no prior/posterior probabilities can be updated to converge on a more or less 'objective' value.
I concluded that the only feasible choice for probability lay in logical probability, viz., that as there are only two possible choices, envelope A or envelope B, it followed that there is a 50/50, or 0.5, probability of selecting the envelope with the greater amount.
So far so good.
The real problem lay in the interpretation of the utility function associated with each choice.
As the whole problem seems to hinge on the fact that the expected utility seems greater for the unchosen envelope each time, the agent must change his mind and select the other envelope. But then the same line of reasoning strikes him and he is forced to change yet again, ad infinitum, or more accurately, ad nauseam.
So, if you can find an appropriate utility function then you'll solve the problem.
Good luck!
R.
Bayes Postulate is that prior to information probability should be distributed evenly over the logical space. Bayes postulate is not essential to subjectivism in general and should be distinguished from Bayes theorem which tells one how to update beliefs given new information. Bayes postulate is useless in this case because the possibilities are unbounded: there could be any amount in the envelopes. Bayes theorem is consequently also useless since it requires a prior probability distribution.
R's suggestion of using "logical probability" is unmotivated and therefore has no prescriptive force if there is an alternative and more informative distribution. There is one: the probability that the other envelope contains 1/2X given the first envelope contains x is 2/3, and the probability that it contains 2x given the first envelope contains x is 1/3. This was done by recognising that indifference is the best attitude and using Ramsey's formula for measuring degrees of belief given indifference to extract the best degrees of belief.
There is no further need to explain why these probabilities are better than 50/50.
According to De Finetti, logical probability is based on arbitrary features of our symbolic language. Bayes postulate has no force. Here are some examples.
I roll a die. I describe the outcome as "D6" if it comes up a 6 and "~D6" otherwise. I don't know whether it will come up 6 or not, so according to logical probability p(D6) = 0.5 and p(~D6) = 0.5.
But I could just as easily have described the outcomes as either D5 v ~D5, which by logical probability yields p(D5) = 0.5. This would make p(D5 v D6) = 1. But we don't know whether (D5 v D6) or ~(D5 v D6), so p(D5 v D6) = 0.5: contradiction.
This problem is completely general and has nothing especially to do with dice.
The mistake is to infer from P(E1 is A) = 0.5 (where "A" signifies the envelope with the most money) and P(E1 contains x) = 1 that P(E2 contains 2x) = 1/2.
Here is something interesting. The higher the factor by which one envelope exceeds the other, the greater the probability that the second envelope will contain the lower amount. The formula is :
If one envelope contains n times the other envelope and the envelope you chose contains x, then the probability that the second envelope contains nx =
1/(n+1) and the probability that it contains x/n = n/(n+1).
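Here is a quick Python check that these assignments make SWAP break even whatever the factor n; the £100 figure is just an example.

```python
# Under the assignment P(other = n*x) = 1/(n+1) and P(other = x/n) = n/(n+1),
# the expected value of swapping is exactly x for every factor n.
def expected_swap(x, n):
    return (1.0 / (n + 1)) * (n * x) + (n / (n + 1.0)) * (x / n)

for n in (2, 10, 100):
    print(n, expected_swap(100.0, n))   # 100.0 each time (up to rounding)
```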
Just to try your intuitions out, let n = 100. You find £100 in the envelope (real money), You know that the other envelope contains £10 000 or £1 (one envelope contains 100 times the other envelope) Is the probability that the other envelope contains £10 000 1/2? Or 1/101?
I say 1/101.
Jonny
Regarding your comment about the Die,
This has already been discussed by Keynes in his 'Treatise on Probability'.
Such 'paradoxes' arise because we fail to enumerate all the possibilities.
You can of course say that the probability of rolling a six can be formalized as two possibilities: either I roll a six or I don't. So
Pr(Rolling a Six) = 0.5 and Pr(not rolling a six) = 0.5.
(I have already stated that Bayes is useless here.)
But as Keynes noted, we have failed to enumerate the other possibilities, viz., rolling a 1, 2, 3, 4 or 5.
So the logical probability of rolling a six is Pr(1/6).
You write that:
""R's suggestion of using "logical probability" is unmotivated and therefore has no prescriptive force if there is an alternative and more informative distribution.
What precisely do you mean here by 'unmotivated'?
It's motivated by the simple fact that there are only two choices. The extra money is contained in one of the two envelopes. So with two choices, there is, logically speaking, a 1/2 chance you'll pick the envelope containing the larger sum of money.
So please elaborate on how the above line of reasoning is 'unmotivated'........?
You mention Ramsey's Principle of Indifference, and this actually supports the logical probability view: as there are only two choices, each one is equally likely to contain the larger sum of money.
I fear that this discussion is going around in circles, so as much as I respect Jonny's argumentative style [this is a compliment to a philosopher (;-)], I feel I should bow out now.
As Jonny clearly feels confident in his proposal to the envelope paradox I strongly urge him to submit it for publication for a second opinion.
Very best,
R.
Thanks R, just to respond to your question "What do you mean by unmotivated", and to your interesting reference to Keynes: I look at arguments and reasons as a decision process by committee. In a state of ignorance about a fact P (say the existence of water on Pluto), the argument, or reason, that since we know neither p nor ~p we should assign probability 1/2 to either is weak. This isn't to say it is not an argument at all. If another member of the committee has an argument for a different probability assignment based on knowledge, then it should be enough to defeat the weak argument for even distribution over an unknown partition. To show the weakness of such arguments we can point out that there will always be r such that r entails p but ~r does not entail ~p, and q such that p entails q but ~p does not entail ~q. In this case p(r) < p(p) < p(q). But by hypothesis we don't know anything, so we don't know how to distribute the probabilities. (In fact I'm of the opinion that we do not properly know (p v ~p) in all cases.)
Keynes can't say that we have "failed to enumerate all the possibilities" because the number of possibilities is unbounded, and is essentially *creative*.
You and I are in a spaceship above Pluto and we need water. I ask you what the probability is that there is water on Pluto and you say confidently 0.5, since (p v ~p) and both are equally unknown. Keynes says you have failed to enumerate all the possibilities. But what does this mean? Keynes is presupposing that there is an objective fact as to a set of equally probable atomic possibilities. I say this is hubris. It's hard to express all this, so I will quote my new hero De Finetti, who is actually quoting someone else, Ulam:
"The indications are.. that there are no atoms of simplicity and, which is most strange, one would almost be tempted to say that in the physical world the set-theory axiom of regularity - that is to say, that every set contains a minimal element with respect to the relation of "belonging to a set" - *does not hold*."
So when Keynes says that one has failed to enumerate all the possibilities, that is because there is no such thing as enumerating *all* the possibilities. Even with a die, we could elaborate on "the die lands 6 face up" by dividing this possibility further depending on the direction of the four corners, the final position of the die, the deceleration on impact, etc, etc,
But aren't I using something like Bayes postulate by calculating the correct degree of belief from indifference using Ramsey's formula?
No, my adaptation of Ramsey's formula means that if you know nothing at all about p, then you won't bet anything on p at any odds, always preferring the p-neutral option of not betting. The notion that there is something special about 0.5 is an accident of logical notation.
I admit though that the probability of taking the envelope with the most money in first off is 0.5. But this is surely just part of the set up.
Anyway, thanks for all your comments and I promise to write it up for publication.
jonny, you repeatedly seem to miss 'R's point!
Your examples actually support his view whilst you seem to incorrectly believe they're in your favour! it's like shooting yourself in the foot and then smiling at the pain!
Your Pluto example falls into the same trap as the one expounded by Keynes, offered by 'R'.
Whether or not there is any water on Pluto can be judged against a large background of astronomical data.
e.g. Astro-spectroscopy has shown that the atmosphere of Pluto seems to exist as a gas only when Pluto is near its perihelion (closest to the Sun).
And its composition is probably a mixture of 80% rock and 10% water ice. Its surface seems to be covered with ices of methane, nitrogen and carbon dioxide.
So we have data for this judgment.
And for other planets further out, we can still make educated guesses with background knowledge consisting of spectroscopy readings regarding their atmospheric composition, and so on.
We can only use the Principle of Indifference when there is no reason to suppose P or ¬P. In such a case, both Keynes (and 'R') argue that one should accord each possibility an equal probability.
Later you write that "Keynes can't say that we have "failed to enumerate all the possibilities" because the number of possibilities is unbounded, and is essentially *creative*."
Again this shows that you have simply failed to grasp either 'R's or Keynes's point.
You speak of 'unbounded' possibilities, but this is not the case with either the envelopes or the die!
There are only six possibilities with rolling a die, two with tossing a coin, and two when faced with two envelopes!
So when you say unbounded I get the worrying feeling that you don't know much mathematics, certainly no calculus,which I find rather upsetting.
Your repeated attacks at Keynes and Bayes Theorem demonstrate that you are not familiar with mathematics, probability or decision theory.
you have failed to grasp the true meaning of Ramsey's PI, Keynes's Principle of Enumeration, the nature of Bayes Law, and, well, your posts speak for themselves.
I agree with 'R', we should stop all this and see what the journals say.
The fact that you have not submitted it and continue to go around in circles here demonstrates that you are not willing to submit it for publication.
I wonder why..... ;p
N.
i don't know why 'R' does this.
he's clearly the same 'D' blogger from another post given his writing style, but always vanishes.
i am inclined to agree with 'N' on this.
if you think you've genuinely found a solution then why are you wasting time on a blog arguing with people who think you're wrong?
why not put your money where your mouth is and submit it to a journal?????
you'll most likely receive a rejection letter with an appended collection of comments detailing where you've gone wrong.
then you can post those too!
You're right, N, I am nervous about submitting this solution for publication, because it appears to you (and therefore probably to editors) that I don't understand mathematics.
The number of possibilities is unbounded when considering a "random" event.
Take the roll of a die. You say there are 6 possibilities, one for each side. What of the possibility that the die doesn't land?
How about the possibility that the die lands with the rows of dots pointing due North? This might be less probable than any face, which would entail that the six equal chances are not the smallest unit of chance. We must remember that dice and lotteries and coin tosses are designed so that there is a partition of equal chances. The same is not true of propositions.
You say that "And its composition is probably a mixture of 80% rock and 10% water ice".
Take out "probably" and we have a proposition that obeys excluded middle. Either it is 80% rock and 10% ice or not. Here is where the "unbounded possibilities" come in. Were I to prove there was in fact 11% water, would this negate your statement? Not really, it would show that your informed guess was pretty close. Let us say that on our information we can't tell whether it is 11% or 10%. Should we say that there is a 1/2 chance that it is 11% and a 1/2 chance that is 10%. Well no because we are not enumerating all the possibilities. It could also be 12% or 9% or 10.5% or any number of proportions. There maybe a natural limit to the discrimination of our verification procedures, but we don't know what this limit is.
Suppose that you know the true proportion of ice is within the interval 5% - 15% but you don't know any more accurately than this. Could you divide up the space between 5% and 15% into a number of equally probable possibilities? The probability of any particular number (say 9.0001030557%) being correct would be 0, so you would have to select an interval. How small would the intervals be? Some width x with 0 < x < 10%. To enumerate the possibilities would be to decide the value for x. The possible values for x are infinite in number. This is all I mean by unbounded.
Will you tell me what the mathematical meaning is, or how I should express this thought using the correct terminology?
The thought is that when we have no information about what will happen next, we can divide up the possibilities however we like, and this can easily involve an infinite number of possibilities.
Jonny
I think we've just hit on your confusion re. the logically possible equipartition and Ramsey's Principle of Indifference.
You asked about the possibility that the die doesn't land.
Well the point here is that we're enumerating all the possibiltiies.
Suppose one is playing a game where a die is rolled, and for some unknown reason the die miraculously lands balanced on an edge, or it never lands at all, e.g. it falls down a miraculous bottomless pit.
Now, even if the die, for the sake of argument, did miraculously land on an edge or a corner, then we would simply re-roll the die, because that's not a possibility that's part of the game!
The same applies to tossing a coin. If a coin is tossed and lands on an edge then we would simply re-toss the coin.
so your confusion re. the non-enumerable, unbounded set of possible logical outcomes is based on this fallacy.
Consider the envelopes once again.
two envelopes: two choices.
Now, i see your confusion re. the die and the coin, but what about the envelope?
either you choose one or you choose the other.
you cannot choose both and you cannot choose neither!
So we are building into the problem only two possible outcomes from only two possible choices!
this is why the choices can be exhaustively enumerated as two,
and this is why the outcome is 1/2.
Your failure to even understand the rock and ice ratio demonstrates you do not understand chemistry either!
my point about spectroscopy wasn't about it either being 80% rock and 10% ice!
it was about the likelihood conferred upon this proposition from the atmospheric spectroscopy data!
the light from the gases in the atmosphere of Pluto can be decomposed using spectroscopy to indicate the composition of the atmosphere, and from the composition of the atmosphere, namely the chemicals in it, one may infer the density of the planet, and so on.
This was a specific example by me to demonstrate the difference in fallaciously applying the PI to such cases and to those such as the die or the envelopes or a coin.
in the case of the atmosphere of Pluto, one cannot simply say "either there is water on Pluto or there isn't", as there is scientific evidence bearing on this, according it a statistical likelihood of there being water or not.
but in the case of a die or a coin or the envelope, the logical possibilities are constructed by us!
there are no coins or envelopes or die in nature! they are created by us!
we accord the die six faces by choosing it to be a cube and assigning *, **, ***, ****, *****, ****** dots on each face respectively, representing the numbers, 1, 2, 3, 4, 5 and 6.
and in the case of a coin we assign it the values of 'heads' or 'tails'.
And in the case of the envelopes, we choose there to be only two envelopes.
that is all!
your utter failure to grasp this hideously simple fact i find worrying.
initially i thought 'R' was being lazy by 'bowing out', as he tactfully put it.
i now realize that he wasn't being lazy. he was being pragmatic.
Danny.
Thanks Danny,
There is no confusion on my part. I agree that the probability of selecting a particular envelope is 0.5 for the reasons you said, and I have said this in the original post and repeated it in the comments. There is no need to discuss this since it is part of the set up of the problem.
Now let "x" signify the amount of money in the envelope you have chosen. Let y signify any number. What is the probability that x = y? I state that the only sensible probability assignment is p(x=y) is 0. This is using the Keynes/Bayes principle that you favour. The possibilities are infinite, so if probability is evenly distributed over the possibilities then the probability that the envelope contains any particular amount is 1/infinity which is 0, or perhaps undefined. But this is just as much as to say we don't know how much money is in either envelope.
With me so far? Or do you disagree at this stage? Let's call this "STAGE 1", so if you think I've made a mistake, you can direct me to exactly where it is.
Now let us name the amount of money in the envelope with the most money "A" and the amount in the other envelope "B". We know that A = 2B. So the most we can end up with is 2B and the least is B.
Still using x to designate the amount in E1 what is the probability that x = 2B?
p(x=2B) = 0.5, p(x=B)= 0.5
I am sure you agree with this, but in case you don't, call it STAGE 2.
Now let us name the amount in the second envelope "y".
Now whether we open E1 or not, we know that p(y=B) = p(y=2B)= 0.5.
Agree? STAGE 3
This is where I think the mistaken reasoning that generates the paradox lies:
Given that E1 contains a specific amount x, and "y" signifies the amount in the second envelope, then
H1: p(y= 2x) = p(y= 1/2x) = 1/2.
Now I say that H1 is false. In conjunction with the statements before and including STAGE 3 it creates an incoherent set. Exactly why it is wrong is hard to say and I've made a few attempts that I won't repeat here. The fact that it leads to a paradox should be reason enough to reject it. I propose an alternative:
H2 p(y = 2x) = 1/3 & p(y = 1/2x) = 2/3.
H2 is well motivated and doesn't lead to incoherence or paradox. This is reason enough to prefer it to H1.
Lets call this STAGE 4.
Now I take it that what genuine disagreement there is is at STAGE 4, and R, N and the rest prefer H1 to H2, citing Bayes postulate, or Keynes who says we should distribute probabilities equally over correctly enumerated possibilities. Since we don't know whether y = 2x or y = 1/2x but we do know that this exhausts the possibilities then we should assign equal probabilities to each circumstance.
It is at this stage of the dialectic that I am challenging the strength of the Bayes postulate. I won't repeat myself. But denying H1 is NOT THE SAME as denying that the probability of choosing the envelope with the higher amount in it is 1/2. This latter probability assignment I AGREE WITH!!!! In fact I use it as a premise in my argument for H2.
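(As a quick arithmetic check of the two hypotheses, here is a minimal sketch in Python; it only spells out the sums, it adds nothing to the argument.)

# Expected payoff of KEEP and SWAP given that the first envelope contains x.
x = 100.0                                            # any positive amount will do
eu_keep = x
eu_swap_h1 = 0.5 * (2 * x) + 0.5 * (x / 2)           # H1: 50/50 double or halve
eu_swap_h2 = (1 / 3) * (2 * x) + (2 / 3) * (x / 2)   # H2: 1/3 double, 2/3 halve
print(eu_keep, eu_swap_h1, eu_swap_h2)               # 100.0 125.0 100.0
# Under H1 swapping looks 25% better, which is the paradox; under H2 SWAP and KEEP come out equal.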
oh for pete's sake,
you're not "challenging" Bayes at all, as he's got nothing to do with this problem because it's a one-off, strictly not repeatable experiment!!!
one person picks an envelope once only!
right. just submit to a journal and get back to us.
Danny.
Just for the record, I am not challenging Bayes. Bayes is dead and duels are illegal.
Neither am I challenging Bayesianism. To be frank I don't know what Bayesianism is.
Nor am I challenging Bayes theorem, which can be demonstrated deductively to be valid.
Nor am I challenging Bayes' postulate in general. Bayes Postulate assumes the uniform distribution as a representation of knowing nothing.
What I am challenging is R's use of this principle as an argument for H1. The principle is not strong enough to favour H1 over H2. When considering H1 in isolation this principle IS a good reason to distribute your credence evenly. But it is not an argument to adopt H1 over H2 when there are independent reasons to adopt H2, which I have shown there are.
Thanks Danny, it can be helpful to be forced to explain myself.
Dear Mr B.Wall,
I have read your original post and all the comments.
I can't help but feel that you have misunderstood, well, everything.
As 'R's, Danny's and Anon's criticisms have failed to convince you, clearly you need to submit it to a journal, as this is clearly futile.
Yours sincerely,
'U'.
Dear New U,(I know for a fact you're not the old U)
Your opinion is of no significance.
U
U the fictional-
fictional opinions are irrelevant.
So make a contribution to the debate, state your case, or remain silent.
U2. ;p
See what happens when you leave the kids alone for five minutes, they start squabbling in the playground.
I am of course referring to 'U' and 'U2'.
Clearly 'U' is upset with 'U2' for initially appropriating his/her choice of the alphabet in representing their fictional identities.
But it should be noted that 'U' has not made any comments in this blog post or in any others for a great deal of time, and presumably 'letter to a brick wall', aka 'U2', was not aware of 'U's previous appropriation of this letter, so he/she should not be attacked for this.
It is also important to note that 'U2's comments, despite being somewhat harsh, were at least concerned with the topic in question, unlike 'U's, which were non-philosophical and therefore have no place in this, or any, post.
So let's keep it clean people.
post propositions, not personal pedantic problems.
though this particular blog post is pretty much over now while we wait for the BJPS or whatever to get back to Jonny.
Danny.
'U' writes to 'U2' that "Your opinion is of no significance".
What a ridiculously childish thing to say.
This is supposed to be a philosophy blog, so even if we do sling mud at one another in the heat of debate, it's rather bizarre for someone to enter the blog solely with the purpose to inform someone else that their opinion is of no significance.
Very childish,.... and very sad.
So, does anyone have any more ideas Re. the envelope paradox or have we reached stalemate?
N.
U's comments were insignificant because they made no advance on the discussion. Philosophy is not decided democratically, and even if it were, anonymous opinions couldn't count.
U the first commenter said:
"I can't help but feel that you have misunderstood, well, everything"
All I meant in my comment was that this expression of feeling has no content, except as an emotional response to a particular person, from an anonymous one. I didn't intend to personally attack "U"; how could I when I don't even know who U is? And since I don't know who U is, why should I, or anyone else, be interested that U feels that Jonny has misunderstood, well, everything?
If Jonny has misunderstood something he should be told what it is. I will give an attempt. He has confused the subjective principle of indifference (which he is calling the Bayes postulate), that an even distribution is the best representation of ignorance, with the objective PI, which is the claim that two symmetrical possibilities have an equal objective probability. To explain this difference, imagine you are told that a coin is weighted in such a way that it has a 2/3 objective chance of landing on one face and 1/3 on the other. This is an objective feature of the coin. Now someone tosses the coin and asks you what odds you would consider fair on the outcome being heads. The subjective PI tells you that your credence should be 0.5. But the objective PI tells you that the probability of heads is either 2/3 or 1/3. As long as Jonny keeps this distinction clear there has been no objection raised to his solution. The interesting thing that emerges is that it seems rational to have a credence of 2/3 in a proposition that you believe to have an objective probability of 0.5.
I think what 'U2' meant, a tad bluntly, was that they were fed up with the circularity of the discussion.
'R' pretty much hit the nail on the head before leaving the discussion by pointing out that we can't really re-kick off the debate until we agree on the terms. As Jonny appears to disagree with me (Danny), 'R' et al. on the actual interpretation of the probabilities in question, you can't really debate with someone if you disagree on the interpretation of the probabilities.
This is all too common in philosophy: it doesn't matter how logical the reasoning and argumentation of both parties is - if you don't accept each other's basic terms, then there's really no opportunity to argue!
This is a shame, but must be acknowledged.
I suspect 'U2' realised this and sided with 'R', on the presupposition that everyone was following the debate and knew he was referring to this.
Well, that's my charitable interpretation of 'U2'. Though of course his 'letter to a brick wall' style was rather caustic.
But 'U's post was just childish and odd: simply asserting that someone else's opinions were of no significance.
This is both rude and childish and has no place in any philosophical debate.
We're supposed to be philosophers.
So perhaps it would be a good idea if people posting comments should act like it....
I would like to thank Jonny for sparking off the debate with an interesting discussion.
So, what's the next blog post going to be on.....?
Danny.
By the way 'U', it appears you've also completely misunderstood the PI:
you describe a biased (weighted) coin with unequal probabilities of landing on each face.
You are mistaken in writing that the subjective probability of the coin landing on either side is Pr(0.5), as you have failed to mention whether or not the agent tossing the coin is aware that the coin is biased.
IF the agent is aware of the coin being biased then they too will assert that the coin has PrA(1/3) and PrB(2/3) respectively.
This is the whole point.
IF the agent has no reason to suppose one outcome is more or less likely than the other then they may accord both events equal likelihood of occurrence.
If the coin is repeatedly tossed then the agent will realize that the probabilities will converge on the values PrA(1/3) and PrB(2/3) and update their prior probabilities to posterior probabilities which converge with the 2/3 and 1/3 values respectively.
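(A small simulation in Python, just to make the convergence point concrete; the 2/3 bias is the figure from the coin example being discussed.)

import random

random.seed(0)
bias = 2 / 3                      # objective chance of the favoured face
for n in (10, 100, 1000, 10000):
    favoured = sum(random.random() < bias for _ in range(n))
    print(n, favoured / n)        # the observed frequency approaches 2/3 as n grows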
The point with the envelope paradox is that as this is a one-off case, there is no possibility for the agent to see such a value convergence from PrA(0.5) and PrB(0.5) to PrA(1/3) and PrB(2/3).
So as this is a one-off case, the agent must choose 0.5 and 0.5 for each likelihood.
You write that there is a discrepancy between the objective and subjective probabilities of the biased coin, but have failed to take into account the disanalogy of the coin to the envelope: one-off case scenarios.
You write that "As long as JOnny keeps this distinction clear there has been no objection raised to his solution."
No, this distinction is not relevant to the debate for the reasons stated above by me and by 'R'.
IF the envelope choice is an iterative process for a single agent, or for multiple agents where the previous choices and results are recorded and available for the subsequent agents, then the subjective and objective probability values will converge to a single value, but as it's a one-off, this cannot arise.
There is no such analogy because THERE ARE NO SUBJECTIVE PROBABILITIES!
This is because there are no two separate values for the probabilities to converge to!
The subjective probabilities are identical to the objective probabilities because there is no extra information and no reason to suppose the extra money is contained in one envelope than another!
Imagine that one envelope is red and the other is blue.
Imagine that the individual who placed the money in both envelopes prefers the colour red to the colour blue, so perhaps this is sufficient grounds for thinking that perhaps the larger amount of money is contained in the red envelope.
Or imagine that the money is placed in a red and a blue envelope, as in our last example, only this time the person who put the money in them is Chinese and the agent who is forced to choose an envelope does this on Chinese New Year.
On Chinese New Year people give children "lucky money" in red envelopes, and the colour red symbolizes fire, which according to legend can drive away bad luck.
So, the agent would perhaps be more rational in choosing the red envelope.
However, as both R and myself have already iterated, there is no such information available to the agent in choosing one envelope over the other, so there is an equal likelihood that the greater sum of money will be in one envelope rather than the other.
It follows from the PI that these equiprobable choices must be accorded an equal likelihood, viz. PrA(0.5) and PrB(0.5).
The real problem, as R noted, is not really the probabilities at all: it is the utility values, for these are what give rise to the 'paradox'.
When 'U2' said that 'Jonny misunderstood, well, everything' he was referring to this, and the following issues:
1) jonny has failed to grasp the full meaning and significance of the PI.
2) Jonny has failed to grasp Keynes's discussion of the PI, where Jonny has misunderstood the meaning of enumerating all possible outcomes. His false dichotomy of the dice being rolled demonstrated this.
3) Jonny's discussion of the unboundedness of the possibilities demonstrated that not only had he failed to fully understand Keynes's discussion in his Treatise on Probability, but he also failed to fully understand basic calculus to do with bounded and unbounded intervals and its relation to continuous and discontinuous probability values.
4) Jonny had failed to fully understand the meaning of probability with his comments on Kolmogorov's axioms of probability when he stated that we should not "[...] assume that we are talking about Bayesian or Kolmogorov conditionalization here", and his further comments later that "So to say that there is nothing to be said about reasoning under uncertainty outside of Kolmogorov's axioms is not only narrow minded but false. Most of our judgements concerning probabilities are made on the basis of evidence where we have no conception of what an unconditional or prior probability could be." Jonny seems to think he's being creative by implicitly dismissing kolmogorov, but this is a mathematical absurdity.
5) the fact that jonny seemed to think that 'R's formalization of the algebraic values of 1/x and x were different to his choice of x and 2x demonstrated that his understanding of ratios was worryingly wrong.
6) Jonny also writes that "The possibilities are infinite, so if probability is evenly distributed over the possibilities then the probability that the envelope contains any particular amount is 1/infinity which is 0, or perhaps undefined. But this is just as much as to say we don't know how much money is in either envelope."
Again, he has failed to grasp calculus and the mathematical notion of limits, bounded and unbounded sets and series.
Very worrying.
And the fact that he asserts that the probability assignments for the envelopes can be 1/infinity demonstrates that he doesn't fully grasp the nature of the problem.
I think that 'U2' simply got fed up and got out like 'R' before him.
Both 'R' and I have suggested that Jonny simply submit his 'proof' to a journal.
Both the subjective and objective probabilities of it being accepted are PrA(0.0), because it is wrong. But it would nonetheless be worthwhile submitting, if only to end this rather futile argument, as an expert would be forced to reply to Jonny explaining why it is a fallacious proof, as he clearly isn't listening to us.
It is for these reasons that i feel this discussion is going nowhere, so I'm saying goodbye now too.
I'm just back from the BSPS conference where the paper before mine was on the Principle of Indifference. Karl Popper and Gillies (2000) have both expressed the view that PI has only a role in conjuring probabilistic hypotheses, not in justifying them. Many wish to eliminate PI because of Bertrand's paradoxes. This is pretty much what I said in response to R's invocation of the principle. I've asked Sorin Bangu to send me an abstract to post up, which hopefully he will do. So in accusing me of misunderstanding the nature of PI you are also accusing Popper and Gillies of the same.
Lewis in "debugging Humean supervenience" points out that symmetry is always trumped by frequency information, and I think Danny is inverting this and saying that in the absence of any frequeny information PI must hold, and since the envelopes are a one off case, there can be no frequency information. But I am not appealing to frequency information. I am appealing to the fact that "x" refers to two different quantities.
Danny says in point 5 that 2x and x is algebraically the same as x and 1/x. This is after my clear counterexample to show that these ratios are different. I will repeat it here. If the envelope you open contains £10, the other envelope CANNOT contain £1, so if x = £10, then the other envelope CANNOT contain 1/x. This ratio, if anything, is a more general formula to describe the type of ratio, which in this special case is x = 2. So e.g. if x = 3, then one envelope would contain three times the amount of the other envelope. If this was the case, then my solution generalises so that the credence you should have that E2 contains A is 1/(x+1) and that it contains B is x/(x+1). If this was what was meant, it was unnecessarily confusing, since "x" was already being used to signify the amount of money in the first envelope, and also the difference in value between the two envelopes. I tried to separate these two values out to explain the false paradoxical reasoning. It doesn't help when people anonymously call me a sub-infant for trying to restrict further confusion.
As for my use of "unbounded", I gave ample opportunity to have it explained to me why the unknown quantity of cash in either envelope is not unbounded, or why it is a mistake of some kind to call it unbounded. I got no reply, just further insults and the repetition of the already universally agreed point that the probability of choosing one envelope out of two IS bounded and is 1/2. I have agreed with this from the start, not just of the comments but of the post.
On the Kolmogorov contention, I am not challenging the probability calculus or anything like that. The reasoning used in the paradox leads to a contradiction, so in defending the reasoning you have a prima facie problem. My reasoning does not lead to contradiction, so is at an advantage. All I meant by saying that it wasn't Kolmogorov conditionalisation was that it wasn't worked out on the basis of prior probabilities, because this information is not available. In the same way, the initial reasoning that gets us to the conclusion that the probability that E1 contains the higher amount is 1/2 is also not Kolmogorov conditionalisation. It is something like this:
P(pick envelope containing x | pick from 2 identical envelopes, one of which contains A and one of which contains 1/2A). This could be any value between 0 and 1; to get the value 1/2 we need other assumptions. You are happy to conditionalise here without knowing the prior probability of (pick envelope containing x), so you are happy to have conditional beliefs without using Bayes' theorem or Kolmogorov conditionalisation. If you are licensed to do so, why can't I be?
I was quite hurt by the first U's comment. I anticipated some hostility when posting a unique solution to a paradox. To say that I don't understand anything is pretty well the worst insult I can imagine, since I cannot accept it without doubting my very relationship with the world. I've made an effort to answer all the comments with politeness and understanding. I admit that sometimes I don't express myself very well. I also admit that the false reasoning involved in the paradox is very seductive. "Probability" is hard to interpret and there are many different meanings, so discussions can become confused. But the kind of interpretation necessary for the envelope problem is straightforwardly subjective, since the probabilities are used to calculate expected utility. The value "x", such that the envelope that you hold contains x and the other envelope contains either 2x or 1/2x, is a strange value, so R is right that it is the utilities that are the problem. However, when we are clear that we are talking about subjective probability, we know that probabilities just are the adjustment made to the expected utility of an outcome due to its uncertainty.
As for the second U (the fictional one), I am interested in the example you gave. It seems that in the envelope case, the objective probability that the envelope contains the amount it contains is 1, not 1/2. It is determinate. So saying that the best credence is 2/3 that it contains 1/2x is another case where the credence doesn't equal the objective probability. Here's the real riddle. If an outsider were to bet on whether the second envelope contained more money or less, it seems that he should bet at odds of 1/2. And this is the case if he has access to EXACTLY THE SAME information as the person making the choice between KEEP and SWAP.
Oh come on, this isn’t school and we’re not in a playground: by simply saying that one of the big kids, (Gillies) has said one thing that you’ve interpreted as supporting your view, is rather weak by anyone’s standards.
Oh, and Karl Popper is dead, so I’d be very interested in how you met him at the BJPS.
I’m not sure what you mean by “conjuring probabilistic hypothesis”, as this is precisely what you’re doing with the envelope paradox: you’re assigning probability values via a hypothesis!!!
The frequency issue you’ve taken from Lewis is correct, but is concerned with a different symmetry to the one we’ve been discussing. We were discussing equiprobable logical possibilities, not symmetry at all. The fact that you seem to have treated these two terms as synonyms is rather worrying and further explains your earlier confusion.
‘R’s and my point was precisely that no frequency information was available in the first place!
I can’t help but feel that we’re both reading utterly different blogs that, due to some server html error, each of our respective posts appear on different blogs!
I utterly give up now as I can see this isn’t getting us anywhere.
If you’re so confident you’re right then why are you wasting time on a blog with a readership of about ten (maximum) when, if you have indeed ‘solved’ the envelope paradox then glories await you through the journals?!?
I think we all know the answer to that one don’t we……?
The truth of the matter is that you have been extremely polite and gentlemanly throughout this discussion, and for that I applaud you. However, it’s extremely exasperating for the rest of us when you fail to grasp the issues and repeatedly, well, repeat yourself without understanding where your faults are.
I wish you and ‘U’ the very best of luck.
Danny.
Danny
I find your aggressive tone completely inappropriate. And I am utterly amazed that you have the brass neck to lambast Jonny for 'schoolyard behaviour'!
Stick to philosophy, you child.
Hazel
I find this to be a very lengthy debate over a paradox that is generated by a simple algebraic error.
The one envelope problem
Let's start with the simple scenario of one envelope with either £5 or £20 in it, chosen at random with 50:50 probability. I hope we all agree that we would happily pay £10 for that envelope. In fact we should happily pay £12.49.
I only mention it because I think we all confuse the one envelope problem with the two envelope problem. Note that in the one envelope problem the absolute size of the win when you are lucky is £10; the absolute size of the loss if unlucky is £5. This disparity is not present in the two envelope problem
The two envelope problem.
Assume we have one envelope with £10 and one with £20.
You are holding one envelope, deciding whether to switch. There are two scenarios:
1. You are about to give away £10 and get £20, a profit of £10. This scenario has probability .5
2. You are about to give away £20 and get £10 a loss of £10. This scenario has probability .5 also. The loss is the same magnitude as the gain.
The overall utility of switching is therefore zero. Easy
It is the algebra which leads us astray.
We tend to express the utility of the other envelope as .5*2x+.5*.5x=1.25x
This is wrong. It's wrong because all the x's mean different things. Assuming the set-up is that the envelopes contain an amount of money £x and an amount of money £2x, there is no scenario in which you get £0.5x. The probability of this is zero. It's just sloppy algebra. The correct way to count scenarios is as I set it out above.
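(A quick simulation of the £10/£20 set-up above, in Python, for anyone who prefers to see it run; it is only a sketch of the point already made.)

import random

random.seed(1)
trials = 100_000
keep_total = swap_total = 0
for _ in range(trials):
    mine, other = random.choice([(10, 20), (20, 10)])  # grab one of the two envelopes at random
    keep_total += mine
    swap_total += other
print(keep_total / trials, swap_total / trials)  # both close to 15: switching gains nothing on average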
At last! Someone who agrees with the first part of the proof. Thank you, man from Heathrow airport. "x" is obviously a strange value, but the false reasoning does seem to be persuasive and pervasive. Try and explain where the reasoning goes wrong that if I find £10 in my envelope there is a 0.5 probability that the other envelope contains £20.
Dear Danny,
I deleted a couple of your comments and I want to explain. I wanted to give Man at Heathrow airport a chance to move the discussion on rather than be swamped by a squabble. Also you didn't really add anything that wasn't in your last undeleted comment. Your point was that you thought my reference to some current debate about the status of PI was argument by authority and therefore childish.
In general I agree that argument by authority is weak and should stand aside in favour of reasoned argument. But in this instance I was accused of "misunderstanding the nature of PI", so I thought that an indication that the line of argument I was offering was actually quite mainstream and had been put forward by Karl Popper in 1974, by Gillies in 2000, and reviewed this year at the BSPS conference was quite appropriate. I didn't mean to imply that I had met Popper or Gillies at the conference, or that the fact that these people had argued a similar stance somehow means that it is certain that I am right. It was just supposed to indicate that, contrary to your repeated suggestion, I know what I am talking about, and that my point didn't of itself demonstrate a misunderstanding. I promise not to delete any further relevant comments of yours and I recognise that your comments so far have been sincere and to the point. Believe it or not I have revised my views regarding the two envelope paradox due to some of your comments, so thank you for that. I intend to post the abstract of the talk on PI that I was referring to quite soon, if you are interested.
If you open your envelope and find £10, the probability that the other now contains £20 is .5. You are now playing the one envelope game and should switch. Opening the envelope changes the game.
I am afraid you are wrong about the 2/3 1/3 stuff
I think it is easy to see how this happens with finite sets of possible amounts of money
Lets imagine money only comes in £2, £4, £8 and £16 for the sake of argument.
In the two envelope game the possibilities when you grab an envelope are that you have
2 and the other is 4
4 and the other is 2
4 and the other is 8
8 and the other is 4
8 and the other is 16
16 and the other is 8
all equiprobable ex hypothesi and in matching pairs, so that the utility of switching is zero: 1/6(2-4) + 1/6(4-2) + 1/6(4-8) + 1/6(8-4) + 1/6(8-16) + 1/6(16-8) = 0
Once you open an envelope and find, say, £4, the universe of possibilities over which we compute is cut to two (the other envelope must contain £2 or £8) and the utility of switching is positive, because this is the one envelope game and the potential profit exceeds the potential loss.
I lack the math to show this works where the envelopes can contain any positive amount of money i.e. the set of scenarios is infinite. One can see intuitively that the infinite set pairs off the way the finite one does but I know my technical limitations.
Perhaps somebody else can fill in the last step. Then I think we really have solved the problem
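(Here is a sketch in Python of the finite case just described: it enumerates the six equiprobable scenarios and checks both claims, that switching is worth nothing overall but is worth something once you have seen £4. The infinite case is left alone.)

from fractions import Fraction

scenarios = [(2, 4), (4, 2), (4, 8), (8, 4), (8, 16), (16, 8)]  # (mine, other), all equiprobable

# Unconditional expected gain from switching.
print(sum(Fraction(other - mine, len(scenarios)) for mine, other in scenarios))  # 0

# Conditional on finding £4: only (4, 2) and (4, 8) remain, each with probability 1/2.
remaining = [other - mine for mine, other in scenarios if mine == 4]
print(sum(Fraction(d, len(remaining)) for d in remaining))  # 1, i.e. switching gains £1 on average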
I was thinking like that until a couple of days ago, when I realised that other possible pairs of envelopes are irrelevant. Here's why: imagine this 4 envelope game with four players. The players are split into two pairs, A1 and A2, and B1 and B2. Each pair is given a pair of envelopes where one contains twice the amount of the other. So we have 4 envelopes E1, E2, E3, and E4. None of the players are told how much is in the envelopes. But they know that each pair has one envelope with twice the money of the other, as before. They also know that the highest amount in one pair is equal to the lowest amount in the other. So the distribution of cash in the envelopes is 2x, x, x, and 1/2x. It's just that they don't know the value of x.
What is clear from this game is that if all the players SWAP, the total cash they end up with is the same as if they all KEEP. This is true even if they know the amount in their own envelope. So it can't be that by opening the envelope the expected utility of SWAP becomes higher than that of KEEP. But in this closed game, each player, when they open their envelope and find any amount in it, let's say £5, knows that the other envelope must contain £2.50 or £10. But they don't know whether £5 = 2x, x or 1/2x.
The solution to the paradox is to do with the way in which you count the utilities. The paradox is generated by ignoring the fact that if you unluckily chose the lower envelope, you are betting less money, but are bound to win.
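(A small check of the four-player game in Python: whatever x is, the grand total after everyone swaps equals the grand total after everyone keeps.)

x = 10.0                              # the unknown value; any positive number will do
envelopes = [2 * x, x, x, x / 2]      # the four envelopes as described above
pairs = [(0, 1), (2, 3)]              # each pair of players shares one pair of envelopes

total_keep = sum(envelopes)                                      # everyone keeps what they drew
total_swap = sum(envelopes[j] + envelopes[i] for i, j in pairs)  # within each pair the players exchange envelopes
print(total_keep, total_swap)         # 35.0 35.0 -- identical, whatever x is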
Hi Jonny,
Well, let's say that someone puts one gram of pure gold in one envelope, two grams in another and so on up to one kilogram (approx) of pure gold. Two adjacent envelopes from this sequence are chosen randomly and presented to you. You pick one of the envelopes at random and find that you got some amount of gold that is neither one gram nor one kilogram of gold. Now you get the opportunity to take the other envelope instead. It would be stupid not to switch to the other envelope, wouldn't it? Honestly, wouldn't you too take the other envelope instead?
Cheers,
Nick
Thanks Nick, the set up you describe is ambiguous. What does the envelope with the most gold in it contain? Is it 1kg or 2kg? If it is 1kg then the total possible gold in both envelopes is 1.5kg. Consequently we know that of all the values between 1g and 1kg we could find in the first envelope, only half can be doubled. (E.g., if we found 0.9kg in the envelope, we would know that the other envelope contained less, since if it contained more, it would contain 1.8kg, which is above the maximum set. This is true for all values above 0.5kg, which is half of all the cases.) Whereas ALL the values between 1g and 1kg can be halved. Therefore, if all we know is that the envelope contains some figure which is more than a gram and less than a kg, then we can show mathematically that the probability that the second envelope contains half is double the probability that it will contain double. Since these possibilities are mutually exclusive and jointly exhaustive, the probability of finding more gold in the second envelope is 1/3 and the probability of finding less is 2/3. Therefore I would be indifferent whether I swapped or kept.
But if you allow that there could be an envelope containing 2kg of gold, then it is clear that you should always swap for the reasons given in the original paradox. But this is because the set up doesn't allow for the possibility that when presented with two envelopes containing 1kg and 2kg, you picked the one containing 2kg. So this set up has already presupposed that you were more likely to choose the envelope with less gold in the first place. Consequently there is no paradox in the conclusion that it is advantageous to swap.
OK, I'll explain in more detail what scenario I have in mind.
Suppose I have 11 envelopes with 1, 2, 4, 8, 16, 32, 64, 128, 256, 512 and 1024 grams of gold respectively. I roll a 10-sided die and let the result decide what pair of envelopes to pick, i.e., if I get 'one' I take the first pair {1, 2}, if I get 'two' I take the second pair {2, 4} and so on.
I present this pair of envelopes to you. You pick one envelope at random and find that there are, say, 64 grams of pure gold in it. Since you know according to what stochastic rule I picked the envelope pair, you can now do some standard probability calculations.
It is totally uncontroversial that the conditional probability is 1/2 that the other envelope contains 32 grams of gold. Likewise, the probability for finding 128 grams of gold in the other envelope is also 1/2. The probability for any other amount of gold in the other envelope is zero.
Note that we are safely within Kolmogorov territory here, so we are not free to invent any probability assignments.
Faced to a situation like this all people I know would accept the offer to take the other envelope instead. What would you do? And why?
Yes Nick, in that situation I would choose to swap envelopes in every case, whatever I picked first. (except for 1024g, which is over 1kg)
Notice though that with this distribution, the chance of picking an envelope with more than x grams of gold falls away quickly as x grows. Furthermore, if we were to find the average amount of gold in all the envelopes, we could calculate that there are more envelopes with less than the average amount than there are with more than the average amount. This ratio will tend towards 2:1. In your scenario the average is 186.0909g, so the probability of choosing an envelope with a below-average amount is 8/11. This goes some way towards showing why swapping is advantageous in this case. You already start out with approximately a 2/3 chance of a below-average draw. Whatever value you pick first already eliminates many more worse outcomes than better outcomes for swapping, which shows why swapping is a better tactic. It also shows why it is not a good model for the original problem. In your set up it is impossible that the envelope you picked first contains 3g, 5g, 6g, 7g, 9g, 10g, 11g, 12g, 13g, 14g, 15g, 17g, 18g, 19g, 20g, 21g, ... etc.
But why should this be? The point is that you are not supposed to know the amounts in the envelopes.
I hope this answers your question
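(Checking the figures I used above, as a small Python sketch:)

amounts = [2 ** k for k in range(11)]            # 1, 2, 4, ..., 1024 grams
average = sum(amounts) / len(amounts)
print(average)                                   # 186.09...
print(sum(a < average for a in amounts), "of", len(amounts), "amounts are below the average")  # 8 of 11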
Incidentally, by choosing to exclude 1024 from the first choice it is clear where the shortfall of expected utility comes from.
If x = 2 EU swap - EU keep = 0.5
If x = 4 EU swap - EU keep = 1
if x = 8 EU swap - EU keep = 2
if x = 16 EU swap - EU keep = 4
if x = 32 EU swap - EU keep = 8
if x = 64 EU swap - EU keep = 16
if x = 128 EU swap - EU keep = 32
if x = 256 EU swap - EU keep = 64
if x = 512 EU swap - EU keep = 128
the sum of these utilities is 255.5 (256.5 if we also count x = 1, where swapping is a certain gain of 1)
But in the case where you pick the envelope with 1024, you can only lose, so if x = 1024, EU swap - EU keep = -512.
We can normalise this by multiplying each eventuality by 1/11, but that doesn't change the comparison: the single loss of 512 when you start with 1024 is roughly twice the sum of all the gains from swapping at every other value, so overall keeping does at least as well as swapping.
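(Here is a sketch in Python that reproduces this table, assuming, as in Nick's set-up, that one of the ten adjacent pairs is chosen uniformly and either envelope of the chosen pair is equally likely to be picked first; it only restates the arithmetic above.)

from fractions import Fraction

pairs = [(2 ** k, 2 ** (k + 1)) for k in range(10)]    # (1,2), (2,4), ..., (512,1024)

def swap_minus_keep(x):
    # Expected utility of SWAP minus KEEP, given the first envelope contains x.
    partners = [b if a == x else a for a, b in pairs if x in (a, b)]
    return sum(Fraction(p - x, len(partners)) for p in partners)

for x in [2 ** k for k in range(11)]:
    print(x, swap_minus_keep(x))                        # 2 -> 1/2, 4 -> 1, ..., 512 -> 128, 1024 -> -512
print(sum(swap_minus_keep(2 ** k) for k in range(10)))  # 513/2: the gains below 1024 total 256.5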
OK, good. So you admit that in this scenario the probability that the other envelope contains 32g or 128g of gold is 1/2 each, given that you found 64g in the first envelope you picked? And based on this calculation it's better to take the other envelope, right?
Yes. Just to make it clear what I am agreeing to: if you had a 1/20 probability of picking an envelope containing 64g of a pair containing 96g, and a 1/20 chance of picking an envelope containing 64g of a pair containing 192g, and 0 probability of picking an envelope containing 64g of any other pair, then conditional on picking an envelope containing 64g, you have a 1/2 probability of doubling your money on swapping and a 1/2 probability of halving your money. The reason this is not a good representation of the envelope paradox is that you have a 1/20 chance of picking an envelope containing 512g of a pair containing 1536g, but no possibility of picking an envelope containing 1024g of a pair containing 3072g. It may seem like screening off this one possibility shouldn't matter, but it does, because the amount lost in this one swap is more than the sum of the amounts gained by all the other swaps put together.
OK, good. Let's say we have another scenario where you don't know how the amounts in the envelopes in front of you are determined. However, your best friend saw it all. You pick one envelope and discover that it contains 64g of gold. Now your friend, whom you trust 100%, tells you that it is exactly a 50/50 chance that the other envelope contains 32g or 128g of gold. What will you do? And why?
Hmm, I'm a bit wary here. There is an element that I'm not sure about to do with third party onlookers. But if the probability is .5 that I double my money and .5 that I halve it, then my degree of belief should be .5 either way and I should swap.
Just to be clear, I would swap, since if I keep, my expected gold is 64g, whereas if I swap my expected utility is 128/2 + 32/2 = 80g.
However, since I've been obsessed by the 2 envelope paradox for so long now, I might not be inclined to give total credence to my friend's probability assignments without some justification.
In fact, given my broader picture, I would require that he guarantee his probability assignments to the degree of certainty appropriate to the risk. The best way to do this would be to sell him the option to swap for 80g of gold. If he was prepared to pay me 80g of gold in return for whatever I got for swapping, then I would believe his probability assignments, but then I wouldn't need to, since I could just take his 80 grams of gold and let him deal with the risk. If he had faith in his own probability assignments he should not have a problem with this. If he was reluctant, however, then I may be inclined to take the 64g and pass up the gamble. However, if I was a big investment banker, I'd probably take the gamble and swap, take a commission based on the expected utility, and pass the risk on to the investors, and ultimately the economy. If everything went wrong, I'd simply get a big redundancy settlement and retire.
OK, good. Let's say we have a third scenario where you, as before, don't know how the amounts in the two envelopes are determined. Only your best friend saw it all. So far exactly as before.
One good thing about philosophy is that we can invent people and situations that don't necessarily exist in the real world, just for the sake of the argument. That is the whole point of thought experiments.
Thus, we will assume once again that you totally trust your best friend that saw it all. If you actually have such a friend or not doesn't matter at all here. Furthermore, we will assume that you have a twin brother. Once again it's totally irrelevant if you actually have a twin brother or not.
As before you open the envelope you picked at random and you find 64g of gold in it. You look at your best friend and he tells you that it's exactly 50/50 chance that the other envelope is double or half of what you have. So far so good.
But now you see how your twin brother opens the other envelope. He also asks your mutual best friend if your envelope contains double or half of what he has in his with an equal probability. Your mutual best friend confirms that to him too.
You are now free to swap envelopes with each other, if you both agree to do that. What will you tell your twin brother is the best thing to do and why?
Good, thanks Nick, now we have a scenario where my solution really comes into its own. What my friend says is, bless him, incoherent. He is telling my twin that my envelope has only a 50/50 chance of containing 64g. But I know that it contains 64g, so I can dismiss these probability assignments as manifestly false from my perspective. Also, if his probability assignments are correct then I end up with an intransitive preference, since it seems that I must prefer swapping to keeping and keeping to swapping. So the probability assignments are false. You can't say that it is a feature of the thought experiment that they are true if they are incoherent. Either my friend can see what is in both envelopes or he can't; if he can, then the probability assignment should be 1; if he can't, then I have more information than he has about the chances, so I should ignore his advice.
I recommend to my twin that he gives me half his gold and I give him half mine. In this case we both end up with the same amount: 32g + 1/2 whatever he has in his envelope.
If this is not allowed I recommend being indifferent. This is because we are in an exchange situation where we have one opportunity to exchange 1:1 and thereafter we will be able to exchange at 2:1. So my 64g share is half the total now and will be either 2/3 of the total or 1/3 of the total after we swap or keep. Whether we keep or swap, the total will be the same, since swapping makes no difference to the total. If we keep or swap, one of us will lose 1/3 of the total and one of us will gain 1/3 of the total. But another way of looking at it is that the winner is twice as well off, and the loser is half as well off. If we look at it in this way it seems that swapping is an advantage. But the twice-as-well-off person has twice 1/3, while the half-as-rich person has half of 2/3, so they both only lose or gain the same amount: 1/3.
Wouldn't it be good to have a formula and a way to work out the expected utilities in decisions of this kind? Well, we have one: Ramsey's formula. It tells us to assign a degree of belief of 2/3 to halving our money if we swap, and 1/3 to doubling our money. Ramsey's formula isn't tailor-made for the 2 envelope paradox but is completely general, more general in fact than Bayes' theorem or Kolmogorov's axioms. The selection process of the envelopes and the way in which they were filled becomes irrelevant once it is certain that both envelopes contain the amounts that they contain. Probability in this situation is purely a tool for calculating expected utility. To make this stark in its contrast to lazy realist ways of looking at probability, imagine that I assigned inverse utilities to the amounts of gold, so that 32g would be twice as valuable to me as 64g, and 64g twice as valuable as 128g. In this case, I should assign probability 2/3 to getting 128g and 1/3 to getting 32g! Someone who is benightedly realist about probabilities might find this unpalatable. But the envelopes are sitting there; the second envelope has already got a certain amount of gold in it! It is not in some quantum state of undecidability. The only sense to be made of probability in this context is how much uncertainty we have, and how much we should be prepared to risk for what return.
Aha OK, no one wants to be incoherent, right?
So, I will change the last scenario just a little. Instead of one mutual best friend you have two different best friends there. They both see how the amounts in the envelopes are determined. Your best friend see what you have in your envelope (64g of gold) and tells you that it's really a 50/50 chance that the other envelope is half or double of that. This is so far exactly as scenario two.
But now your twin brother will experience basically the same thing. He will open his envelope (containing 32g or 128g of gold, you don't know which of course) and his best friend (not the same guy as your best friend now) will tell him that he has indeed a 50/50 chance to expect double or half of that in your envelope.
His best friend doesn't know what you have in your envelope, just as your best friend doesn't know what's in your brother's envelope. Now, will you recommend your brother to swap with you or not?
Hmm, I'm not sure that makes a difference. The point is I know how much is in my envelope, and my twin knows how much is in his. If we both know that a swap will involve a doubling for one of us and a halving for the other, then we should be indifferent. The trusted advisors are of no use.
Aha OK. To sum it all up: If you don't have a twin brother present you are prepared to pay some money for the opportunity to take the other envelope instead of the one you picked, which contains 64g of gold (scenario 2). But by simply imagining that you have a twin brother present, looking in the other envelope before you do, you are suddenly not prepared to pay anything at all for switching to that same envelope (scenario 3).
Please forgive me for pointing this out, but I think you have an incoherent set of preferences here.
That's hardly fair. My preferences aren't incoherent:
If the choice is
Option 1. 64g for certain
Option 2. 32g if p and 128g if not p, where my degree of belief that p is 0.5. In that case I would choose Option 2.
But in the 2 envelope problem this is not the model. You can't force me to have a degree of belief 0.5 and then accuse my preferences of being incoherent. My preferences are coherent and my degree of belief that p is 2/3. It is up to you to persuade me that my degree of belief should be 1/2. Pointing out that if my degree of belief was 1/2 then I would have incoherent preferences is playing into my hands.
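(The arithmetic behind that, as a Python sketch:)

def option2(p):
    # Expected gold from Option 2: 32g if p, 128g if not p.
    return p * 32 + (1 - p) * 128

print(option2(0.5))      # 80.0 -> with credence 0.5, Option 2 beats the certain 64g
print(option2(2 / 3))    # 64.0 -> with credence 2/3, I am exactly indifferent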
I went some way to explaining in a different way why 0.5 was the wrong degree of belief to have in the limited case, but this is hard to do on a case by case basis. I'll try again.
There are 4 pairs of envelopes: 1, 2, 3 and 4.
1 contains (1g,2g)
2 contains (2g,4g)
3 contains (4g, 8g)
4 contains (8g, 16g)
Now you may argue that if I chose an envelope at random and then discovered that I had 4g, then the correct degree of belief that swapping would get me 8g is 0.5. Maybe I would agree with you. But if we know all this, then we know that this was only a 1/4 chance. I had a 1/8 chance of choosing 16g and losing 8g on swapping. The only way I could get this back is via another 1/8 chance: that I chose 8g first and swapped up. The picture I'm trying to get across is that, because of the nature of the case, the payoffs of the swap are always counted half a rung higher than the payoffs of keep. Swapping appears psychologically to allow you the chance of jumping to the next pair of envelopes up. But if you were in the next pair up, then keeping would be just as profitable as swapping from the lower pair would have been.
Do you see how hard it is to explain on a case by case basis!
Okay, so me and my twin pick a pair of envelopes. My envelope contains 4g. Should I swap? Yes, since the average in an envelope picked at random is much higher than 4g, my first choice was unlucky. Should I advise my twin to swap? Well, yes if he has 2g, and no if he has 8g. But can I give him general advice, what degree of belief to have that swapping will do him good? Well yes: if he has 2g he will gain 2g by swapping, and if he has 8g he will lose 4g by swapping, so he should have twice the degree of belief that he will lose by swapping as that he will gain. Since he can only gain or lose, he should have a 2/3 degree of belief that he will lose by swapping and a 1/3 degree of belief that he will gain.
Do you see now?
This reasoning works for any finite set of pairs.
1. I never forced you into anything.
2. I think it's obvious that you are contradicting yourself.
3. Proposals for new ways to calculate conditional probabilities that obviously differ from the standard way have to be explained in a much more elaborate manner to be taken seriously at all.
1. You wanted to force me to accept that the probability assignments were 50/50 by asking me to trust the testimony of a friend who "knew" the probability assignments. I resisted this, but went along with you to see where you were going; the fact that it ended in a contradiction is merely a reductio showing that these probability assignments can't be correct.
2. I only contradict myself if I have a degree of belief 0.5 that the other envelope contains double the amount in my envelope. More grist to my mill.
3. This sounds interesting, but I don't quite know what you mean. What is difficult to explain is why anyone should have a degree of belief 0.5 that the second envelope contains 2x given that the first envelope contains x. This conditional probability certainly isn't using Kolmogorov conditionalisation, because there is no way to normalise the prior probabilities. The fact that it seems intuitively right to practically everyone doesn't mean that it conforms to the axioms of probability theory.
On the other hand it is very easy to explain why it should be 1/3 if you are indifferent, and it is also very easy to explain why you should be indifferent. This probability assignment conforms to an existing formula for degree of belief that many agree is the best expression of subjective probability yet published.
My innovation, if anything, is restricted to the observation that probability assignments attached to gains and losses vary with the stake size.
I fear losing you Nick, so I'll try to express the difficult problem you are posing for me and show my solution:
If I was offered a choice of 8 envelopes from 4 pairs,
A(1,2) B(2,4) C(4,8) D(8,16), then it seems perfectly reasonable to suggest that if I found 4 in my envelope then the probability that I had chosen from B = the probability that I had chosen from C, which is 1/2.
I can then reason that my twin will either be in a position where the probability that he chose from C = the probability he chose from D = 1/2. This is where he is looking at the 8 in his envelope. Or he will be in a position where Prob(in A) = prob (in B) = 1/2. This is where he is looking at the 2 in his envelope.
Therefore, both my twin and I have a 0.5 probability of swapping up, and an EU of swapping of 1.25 times the EU of keeping. So there seems to be a contradiction here, even though the various probability assignments seem incontrovertible.
Is this the problem you have in mind?
If so, the solution is to show that we are assuming, in giving the probability assignments to the twin, that the probability that we are in B or C is both 1 and 1/2. Therefore they are incoherent. If it is 1, then the EU for my twin of swapping is 4, since if he swaps he will get 4. The EU of keeping is 8 or 2, since he has by hypothesis already got 8 or 2. So over all he will get a total of 8 if he swaps in every eventuality, and a total of 10 if he keeps. On the other hand we will get 10 if we swap, and 8 if we keep. So we should swap and he should keep. There is no paradox there.
If, on the other hand, the probability that we are in B or C is only 1/2, then we have a half chance of not finding 4 in our envelope, which is by hypothesis false. So there is no paradox there either. Either my twin has a fifty-fifty chance of doubling his money or I do.
If we want a general answer as to what to do if we find ourselves in such a situation where we don't know what the unopened envelopes contain, then it is clear that even in this limited case we should be indifferent. We have 8 choices in the first place, all of which we are equally likely to pick. In 4 out of 8 of them we gain by swapping, and in 4 out of 8 of them we lose. Because of the relationships of the numbers, all but two envelopes have an expected utility of 1.25x for swapping, where x represents the number found in the envelope. The utility of swapping in all 6 of these eventualities is (1 + 2 + 4 + 4 + 8 + 16) = 35. The utility of keeping is (2 + 2 + 4 + 4 + 8 + 8) = 28. So swapping has 1.25 times the utility of keeping. But in the remaining 2 eventualities, the utility of swapping is (2 + 8) and the utility of keeping is (1 + 16). Add these to the other utilities and the total utility of keeping and swapping in every eventuality is the same at 45. So in general we should be indifferent, though if we pick 2, 4 or 8 then, if we don't mind risk, we should swap.
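For anyone who wants to check that arithmetic mechanically, here is a quick sketch (Python), just enumerating the eight equally likely draws:

pairs = [(1, 2), (2, 4), (4, 8), (8, 16)]
# (amount held, amount a swap would give) for each of the eight equally likely draws
draws = [(a, b) for a, b in pairs] + [(b, a) for a, b in pairs]

print(sum(h for h, s in draws), sum(s for h, s in draws))      # 45 45: keeping and swapping pay the same overall
middle = [(h, s) for h, s in draws if h not in (1, 16)]
print(sum(h for h, s in middle), sum(s for h, s in middle))    # 28 35: on the six middle draws swapping pays 1.25 times keeping
# And the twin of someone holding 4 either gains 2 (if he holds 2) or loses 4 (if he holds 8):
# the loss at stake is twice the gain, which is where the 2/3 : 1/3 weighting comes from.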
The problem you pose is that if me and a twin chose one pair of envelopes such that I got 4, it seems that I should both prefer to swap, and advise that my twin should swap. But this is not the case. I should really advise him to swap only if he had 2, and advise him to keep if he had 8. If I am not to impart such detailed information to him, I should advise him to consider the probability of getting less if he swaps to be 2/3 and the probability of getting more to be 1/3, since if he has 8, he will lose 4 and if he has 2 he will gain 2. So the expected utility of losing on a swap is twice that of winning.
But should I not advise myself to be indifferent also? Well, not if we consider the strategy of a long run where I always know the amounts in all the pairs. In this case, I should be better off if I always swapped when I got 4 and kept when I got 16. But if I applied the strategy generally without knowing the maximum, then I would swap when I got 16 as well and lose all my winnings.
Dear Jonny, this problem or paradox is really a very difficult problem to solve and even to fully grasp. But if it makes you (and others) interested in probability theory it's really great and it has then served its purpose, I think.
So I really recommend you to study some probability theory at university level and when you have done that come back to this problem again. You will then be able to really understand the problem and to think about it in a more structured way.
Think about this problem as a parallel to other paradoxes in other fields as for example the twins paradox of relativity theory or the EPR paradox of Quantum Physics. They are all excellent devices for making young people interested in an advanced field of study.
I will not go into your different reasonings in detail because that would only result in me teaching you probability theory in a tedious way. But when you have acquired the necessary skills in probability theory please read my contributions again; they should really contain all information needed to understand what the two envelopes paradox is all about.
Best wishes, Nick
hmmm, I think it would be more gracious if you would admit that by trying to lead me into a contradiction you in fact provided me with a further reductio argument for my case.
Also, the utility of studying probability theory is doubtful given that other than mine there is no solution to the two envelope paradox. It is really a decision theory problem, and it is well known that decision theory has a problem in that it tends to be insensitive to the magnitude of risk (see Allais). The usual solutions to this tend to try to push the risk onto people's preferences and realign the utility functions (Ramsey, Savage). But these solutions don't work, since you can simply restate the problem using the new utility scale (see Blackburn on tortoise raising).
Imagine this simple game of winner takes all. You and your twin select a card from a normal deck. On looking at the card you can either
1. Elect to play by putting a pound per point of card value (£2 for a 2, £11 for a Jack, etc.) in the pot.
Or 2. Not to play.
If both players elect to play, the player with the highest card takes all the money.
Suppose you got a Queen, would you play? Yes. Suppose you got a King, would you play? Yes. Suppose your twin got a King, would you advise him to play? Yes. But what if you got a Queen and he got a King? Surely you would not advise both him and yourself to play! This may seem like a contradiction, but of course it is no contradiction at all. You are giving different advice from different information states.
I don't see how your envelope game differs from this in an important way. You have 20 choices, 18 of which give you a 50/50 chance of doubling your money. But the money you have a chance of doubling also increases up the scale. So the benefit of keeping is absolutely better the higher the amount you picked first. Therefore if all you know is that you've got 64, you should swap, whereas if all you know is that you might have 32, or you might have 128, but by swapping you will get 64 for sure, then you should keep.
If by studying probability theory I will lose the ability to reason in this way, then perhaps I shouldn't bother. You should perhaps be executed for corrupting the youth with your sophistry that makes paradoxes out of straightforward decision problems.
Perhaps you should think long and hard about why people are rationally less inclined to gamble the more is at stake, then reread my responses to your comments and you shall see the wisdom of what I say.
Best Wishes,
Jonny
Dear Jonny,
Please understand that I only tried to lead you into a contradiction as a way to show you how the paradox appears. You have not yet understood this paradox and I just wanted to help you. I failed in your case but hopefully other readers here will understand what it's all about.
How you think it's possible to solve a problem in probability theory without even a basic knowledge of the subject matter is beyond me. A problem isn't solved just because you don't understand or see what the problem is.
Philosophy means the love of wisdom and knowledge. Your anti-intellectual attitude isn't becoming for anyone calling themselves a philosopher.
Cheers, Nick
You asked me to assume the probability that getting the higher amount was 0.5.
You then showed that this leads to a contradiction.
How is this not a reductio ad absurdum argument to show that 0.5 probability must be wrong?
You appear to suggest that if I don't accept that the "correct" or "objective" probability assignment is 0.5 then I have no understanding of probability theory. But your argument that this was the correct probability assignment ended up being that I had to accept it on trust from a reliable friend who could see everything. In refusing to accept this I am not being "anti intellectual". If a friend was reliable and could see everything, then he would know that I had 64g in my envelope.
Just to remind you, you wrote
"Your best friend see what you have in your envelope (64g of gold) and tells you that it's really a 50/50 chance that the other envelope is half or double of that. This is so far exactly as scenario two.
But now your twin brother will experience basically the same thing. He will open his envelope (containing 32g or 128g of gold, you don't know which of course) and his best friend (not the same guy as your best friend) will tell him that he indeed has a 50/50 chance of finding double or half of that in your envelope."
So, if you accept that both envelopes "really" have a 0.5 chance of being doubled on swapping, then you are led into an intransitive set of preferences.
Conclusion: They don't "really" have a 0.5 probability of being doubled on swapping.
I find myself getting a bit offended, but I am sure this is just a feature of blogging. You have been very polite, but I don't feel you have given my solution due consideration. Also, I'm writing this between taking phone calls, so it's bound to be a bit unstructured and I have probably contradicted myself somewhere. So sorry. I am certainly not failing to see the paradox, though. Have a look at the Allais paradox and see if you can see the connection I'm making.
Here's a way of looking at it:
MOVE 1: Solution to the two envelope paradox: the chance of doubling your money is 1/3, the chance of halving is 2/3. This isn't incoherent, because these probabilities depend on only knowing the amount in one envelope, and they are completely uniform.
OBJECTION. But hasn't Chalmers shown that you can have a tapered distribution where the probabilities are 50/50 for all but the end cases? And in these distributions doesn't the envelope paradox still arise? In which case changing the probability assignments won't work. E.g. imagine selecting from the pairs (1,2; 2,4; 4,8; 8,16; 16,32; 32,64; 64,128; 128,256; 256,512; 512,1024).
MOVE 2.
Either you know this in advance, in which case you know to swap whenever you get anything at or below 1/3 of the sum of the highest two envelopes, which in this case is everything but 1024. Or you don't know this in advance, in which case you may have chosen the highest amount, and swapping would result in a loss compared with keeping that was equal to all the expected gains of swapping the other envelopes put together. So if you don't know the distribution, then indifference is still the best attitude.
OBJECTION 2.
But what if you do know the distribution and you find an envelope safely in the mid range, say 64? Doesn't the paradox still arise, even though it seems that the probability of getting the higher envelope MUST be 0.5?
MOVE 3.
No, I have already said that in this case you should swap, since you have less than 1/3 of the total amount in the highest pair of envelopes.
OBJECTION 3.
But suppose you had a twin who had the other envelope of the pair that you had? Wouldn't you advise both yourself and your Twin to swap? In this case, wouldn't you be incoherent, and the paradox still arise?
MOVE 4.
No, the paradox doesn't arise in this case because in the case where I have 64g and he has 32gs, then he gains 32g by swapping, whereas in the case where he has 128g, he loses 64g by swapping. So he loses twice as much as he gains by swapping. In general, he will lose twice as much as he gains. So he has a 2/3 probability of losing if he swaps and a 1/3 probability of gaining. So he should not swap. Conclusion: I should swap, and he should not, no paradox there.
OBJECTION 4:
But you agreed that someone should swap if they got any amount less than 1/3 of the total. Now you admit that if you get 32, or 128, then you should keep. You have contradicted yourself.
MOVE 5. I only said that my twin should keep on condition that I had 64g in my envelope. If I didn't know this, then I would advise him to swap.
OBJECTION 6.
Suppose you had two advisors, one who told you that you had a 0.5 probability of getting double if you swapped, and one who told this to your twin. They worked this out by knowing the distribution of the envelopes.
MOVE 7. This is interesting and difficult. A flicker of the paradox does emerge here, it is true. But is it just a mirage? The advisors are working out the conditional probability of getting 2x on swapping from the initial set up. But they are ignoring the luck involved in the initial choice. Suppose I got 512g on my first draw. The conditional probability of getting 1024g on swapping is 1/2. But might I not reason like this: I was very lucky to pick the second highest amount on my first draw, so maybe I shouldn't push my luck? And doesn't this argument get weaker, the less I find in my envelope? If so, doesn't this counteract the rising utility of swapping?
OBJECTION 7:
Nonsense, this is like the gamblers fallacy. The luck in getting 512 on the first draw is independent of the probability of getting 1024 on swapping.
MOVE 8. So the chance of getting 512 first is irrelevant when it comes to assessing the chance of getting 1024 on the second? Well in that case, how can I be so sure that the chance of getting 1024 on swapping is the same as the probability of getting 1024 on swapping conditional on getting 512 on the first draw? Either the first probability is relevant to the second or it isn't. You can't have it both ways.
OBJECTION 8.
Bah! You don't understand anything about probability.
1. This is just crazy and need no comment.
2. This is what others say too quite often. But is it true? I don't think so.
3. Same as 2.
4. Here you always fail to see the symmetry of the situation. You say that you should swap but not your twin and you even claim that this advice won't lead to a paradox(!). Well, the only problem is that your identical twin will reason in the same way insisting that he should swap but not you... So your very own mathematics leads to your very own paradox here. May I call it Jonny's paradox in honor of you?
5. ----
6. ----
7. Of course we ignore "initial luck" when we calculate conditional probabilities! This habit of yours of inventing new crazy mathematics whenever you like, without any justification whatsoever, is very tiresome. And when, in addition to all that newly invented mathematics, you claim that you don't care about any mathematics other than your own, I can't help losing interest in your ideas. If you can't relate your ideas in any way to the body of knowledge mankind has gathered so far, no one will be interested in what you have to say. Ever. Period. I'm sorry to say that and I'm a bit shocked if this came as a surprise to you.
8. More nonsense.
Thanks for your continued interest Nick. I am not disregarding probability theory or making up mathematics when it suits me. (Though in this marathon blog I’ve probably made a few mistakes)
The issue at stake is whether the second axiom of probability theory is necessary in order to have justified degrees of belief. Let's call a justified degree of belief the "correct credence". I'll assume Papineau's definition of correct credence, which is the probability relative to the subject's knowledge. The second axiom requires that the possibilities be enumerated so that they can be normalised. In the two envelope paradox, the prior probabilities can't be normalised because
1. You don’t know what the maximum possible amount in an envelope is.
2. You know that there is a maximum possible amount.
I will assume 2 because if you don’t assume there is a maximum amount then you must assume that there is a possibility of finding an infinite amount of money in the first envelope, which doesn’t make sense to me. Because of this, the probability of finding 2x on swapping given finding x is undefined, which is not the same as 0.5. With me?
So, in the ordinary two envelope paradox, if we respect probability theory, there is no problem, because the 0.5 probabilities are just ill defined. My contribution is to say that we can actually have a correct credence if we assume that indifference is the correct attitude to have. There is prima facie appeal to indifference; there has to be, or there really is no paradox. There are also arguments for indifference which don't involve any probability assignments, just ordinary logic. It is also possible to show that for any finite sequence of pairs of envelopes, a swapping tactic will always have the same expected utility as a keeping tactic. We then use Ramsey's formula to discover that the correct credence is 2/3, 1/3, using straightforward arithmetic. This doesn't involve ignoring a body of work on probability, quite the reverse. Ramsey was a mathematician and a philosopher, and his work on probability is groundbreaking and has stood the tests of peer review and time.
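The finite-sequence claim is easy to test mechanically. Here is a little sketch (Python); the particular pairs are randomly chosen and nothing hangs on them:

import random
random.seed(0)

# Any finite collection of (x, 2x) pairs will do.
pairs = [(x, 2 * x) for x in random.sample(range(1, 5000), 50)]

draws = [(a, b) for a, b in pairs] + [(b, a) for a, b in pairs]   # (held, what a swap gives)
print(sum(h for h, s in draws) == sum(s for h, s in draws))       # True: the swapping tactic pays the same as the keeping tactic
# A winning swap gains the amount held; a losing swap loses half the amount held.
# Indifference over the whole set is then priced, on the Ramsey-style reading used here,
# as a 1/3 credence in gaining and a 2/3 credence in losing.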
Now to discuss the specific problem that you think I still have. If you did know the range of possibilities, and if on normalising the prior distribution it was the case that you had a correct credence of 0.5 for getting the higher amount on swapping, it is perfectly possible that the person with the other envelope should also have a correct credence of 0.5 for getting the higher amount on swapping, so isn’t there still a paradox?
The answer is simply no. Here is why. Let's say I find x in my envelope. The expected utility for me, given my correct credence, of swapping is 1.25x, so I should swap. Now you ask me whether I shouldn't also give the same advice to my twin, which appears to be paradoxical. I say no, because the expected utility for my twin of swapping is x relative to my knowledge, since I know that he will get x on swapping, because he will get what is in my envelope. But the expected utility of his keeping, given MY knowledge, is 1.25x, since he has a 0.5 chance of having 2x and a 0.5 chance of having ½x.
Now isn't that odd, since if I were in his position I would swap, given what I acknowledge to be the correct credence he should have given his knowledge? No, not really, since we are in different informational states. I know that if he has 2x he will have 0 probability of getting 4x, whereas he should assign a 0.5 probability to getting 4x, given what he knows. It is no more puzzling than the card game I described a few comments back. You seemed to ignore that, so I'll repeat it more simply. Suppose me and my twin cut from a pack of cards. The prior probability of me getting the highest card is ½ (let's suppose that spades beat hearts beat diamonds beat clubs). Now let's suppose I pick the queen of spades. I should now have a credence of 47/51 that I will have the highest card. Now, what advice should I give to my twin? Should I say that he should have a 4/51 credence in winning? No, because he doesn't know what card I have. The credence he should have in winning is the number of cards remaining that are lower than his card, divided by 51. This will be on average ½. So it may seem like I think that I have got a 47/51 chance of winning whereas he has only got a ½ chance of losing. But this really isn't a paradox. Or do you think it is?
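To put numbers on the card example, here is a sketch (Python). It assumes a strict ordering of all 52 cards with ace low and the suit order above as tie-break; those ordering details are my assumption, chosen so that exactly four cards beat the queen of spades:

# (rank, suit): ace low, king = 13; suit 0..3 = clubs, diamonds, hearts, spades as tie-break
deck = [(rank, suit) for rank in range(1, 14) for suit in range(4)]
order = {card: i for i, card in enumerate(sorted(deck))}

def credence_of_winning(my_card):
    # my credence that I hold the higher card, given only my own card (51 unseen cards)
    return sum(1 for c in deck if order[c] < order[my_card]) / 51

queen_of_spades = (12, 3)
print(credence_of_winning(queen_of_spades))             # 47/51, about 0.92
print(1 - credence_of_winning(queen_of_spades))         # 4/51: what I, knowing my card, think HIS chance is
print(sum(credence_of_winning(c) for c in deck) / 52)   # 0.5: his own credence, averaged over whatever he may hold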
You still, I'm sorry to say, haven't understood the two envelope paradox. Your account of it is not correct which means that your reasoning based on your account leads you astray.
That you don't know what the maximum possible amount is in an envelope is not essential at all to the paradox. I have tried to show you that with an example. So your "1." is wrong.
It is not the case that the possible amount in an envelope has to be bounded (have a maximum). It is perfectly possible that we pick envelopes from a probability distribution that is unbounded. If you had studied mathematics you would be familiar with a lot of unbounded probability distributions. So your "2." is wrong too.
And if you had studied mathematics just a little you would never say something like "if we don't assume a bounded distribution we would possibly find an infinite amount of money in the first envelope." Your reasoning is cute, but not correct.
You admit that you previously made "a few mistakes" when it comes to mathematics. I must say that I strongly doubt that you realise all the mistakes you made and still make. Could you please tell me what mistakes you've discovered that you made? And what would be the correct calculation in each case, according to your current understanding?
You then speak of Ramsey's formula and philosophy as if they were universally accepted. They aren't. You can in fact be wrong EVEN if you are a mathematician. Is this news to you?
Then you have a long explanation of why it isn't a paradox that you and your twin have opposing desires. This would be based on different knowledge. Here again you fail to understand what this paradox is all about. There are distributions such that, whatever you and your twin find in your envelopes, you will both want to switch envelopes, according to the same kind of calculation you do here. But in that case you know you want to switch even before you look in an envelope. Most people consider that to be a paradoxical situation.
Please show me how Ramsey's formula, or violating the second axiom of probability, solves this situation. You haven't yet done that. But please try to avoid mathematical nonsense, if possible...
You failed to understand my last comment at all. And it appears you don't really understand conditional probability.
Your point seems to be that if the prior distribution was in fact such and such, then the probability of getting 2x conditional on picking an envelope with x in it could be 1/2. I say, well then you should swap. You then say, but given the same distribution, the probability of getting 4x conditional on picking an envelope with 2x is also 1/2, so a twin who has the second envelope should also swap. But this just is not a paradox. Why? Because the probabilities are conditional on different events. What is the probability of your getting 4x on swapping conditional on your envelope containing 2x AND your twin's envelope containing 4x? Answer: 1.
You have made no new points that I haven't already answered. I have managed to explain my solution to professional mathematicians, and I have tried to answer all your queries. I feel I've done my duty by you now; that you still don't understand can no longer be laid at my door.
As for this business about unbounded probability distributions, this does interest me, and you clearly know more about it than I do. But it is of scant relevance to the two envelope paradox. All that is needed is that you don't know how much is in the second envelope. If you want to tell a story about how this is in some sense conditional on your prior expectations for the amount you pick in the first envelope, then you may want to talk about the prior probability distribution for amounts in the first envelope. But now I have you in a dilemma. Either you have a limit for the maximum amount in an envelope, in which case the expected utility of swapping is always going to be, in general, equal to the expected utility of keeping, and there is no need to worry that in particular cases it might be advisable to swap. Or there is no limit. What I want to say about the case where there is no limit is that there is no sense in the prior expected utility of keeping the first envelope. You seem to hint that sense can be made of this using well known mathematics. Since you have given no examples and won't explain what you mean, I can only guess. If the amounts in the first envelope were distributed normally, for example, there would be no amount that was the highest, but the probability of getting higher and higher amounts would dwindle rapidly. So you may have an extremely high probability of getting between £50 and £150, but an extremely low probability of getting above £1000. But the problem with this picture is that we are talking about expected utility. So the probability has to decrease faster than the utility increases, or the expected utility becomes a divergent, infinite sum. If you don't understand this, then look at the St Petersburg paradox and it should become clear.
My suggestion is that the probability distribution of the total in a pair of envelopes should be plotted using a function of 1/x. This makes sense if you think that for every £20 in the world there are two £10s. I don't know the mathematics to be able to normalise this distribution on a continuous scale; perhaps you could help me? Given this distribution, the probability of finding a particular amount in an envelope is also a function of 1/x, because each amount can appear in two totals. With this distribution it is clear that the conditional probability of there being 2x in one envelope, conditional on there being x in the other, is 1/3. Try it for yourself and see if you don't believe me. Show me how to normalise it if you want to help me. More likely you will just ignore this comment and reiterate some of your earlier points.
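A genuinely continuous 1/x density can't be normalised over the whole positive line (the integral diverges at both ends), so here is a bounded, discrete stand-in of my own (Python) just to check the conditional probability it gives:

# Bounded doubling grid of pair totals, weighted in proportion to 1/total.
totals = [3 * 2 ** k for k in range(12)]          # each total T splits into the pair (T/3, 2T/3)
weight = {t: 1.0 / t for t in totals}

def p_other_is_double(amount):
    # I hold `amount`: either the low half of the pair with total 3*amount,
    # or the high half of the pair with total 1.5*amount.
    w_low = weight.get(3 * amount, 0.0) * 0.5
    w_high = weight.get(3 * amount // 2, 0.0) * 0.5
    return w_low / (w_low + w_high)

for a in (2, 8, 64, 512):
    print(a, p_other_is_double(a))                # 1/3 each time, away from the ends of the grid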
This is in fact most interesting. You are apparently mentally blocked from seeing the paradox here. That you and your twin will happily swap with each other because each of you expects to gain on the swap doesn't bother you at all. Amazing.
I have encountered people with mental blocks before, but concerning totally different issues. When talking about natural selection with deeply religious people, for example, who believe that the world was created 6000 years ago: it doesn't matter how simple and precise you make your arguments and explanations of the subject, they are mentally blocked from seeing what you are talking about anyway. All evolutionary arguments are like water off a duck's back. The same with you and the two envelope paradox.
Another similarity you have with these people is your attitude towards knowledge in general. You have previously proudly declared that any knowledge of mathematics is totally useless for solving a problem in probability theory, indeed that knowledge can even be dangerous and damage your correct way of thinking(!). In your last post you reveal that not only are you ignorant when it comes to mathematics in general, you are also ignorant of the existing literature about the two envelope paradox. You are totally unaware of the contributions by others in the very subject in which you claim to have the only correct solution! This is truly amazing. Why you call yourself a philosopher while you have this anti-intellectual attitude is beyond me.
However, I agree with you on a couple of points. One is the observation that this discussion doesn't move forward at all and that it's time to stop. It has been a very interesting experience for me at least, so I have to thank you for your time. I wish you good luck in the future and it warms the cockles of my heart that you found a mathematician that is kind to you.
The second point where I agree with you is your first sentence in your post above. When we agree this much I think we have a nice opportunity to put an end here. Bye and good luck once more.
Best Wishes, Nick
I'll put another post up. But that is just daft. According to you any poker game is a paradox. Any theory of poker will advise two players with higher than average hands to bet. I invited you to display your probability expertise in a useful way and you declined. You have just simply not read my comments.
Just for anyone who has read through this, what Nick has done here is to use an "ad hominem" argument. Having failed to show that there can be any paradoxical two envelope situation where the probability of doubling the money in the first envelope is 0.5, he instead focusses on my personal credentials. These ad hominem arguments get progressively more absurd, ending with the claims that I have read no literature in the area, despise probability, am anti-intellectual and am a creationist. All these allegations are false and unjustified. The substance of these slanderous attacks seems to be that I do not see the paradox in the two envelope paradox. I do. The paradox is this: You are given a choice of two identical envelopes. You choose one at random. One contains twice the amount of money as the other. It is reasonable to suppose that the probability that you chose the envelope with the lower amount is 0.5. So if you open your envelope and the amount reveals itself to be X, it seems that you have a 0.5 probability of gaining X on swapping, and in addition a 0.5 probability of losing X/2 on swapping. This means that the overall expected utility of swapping is 1.25X, so you should swap. Paradoxically, the identical reasoning applies to a twin who picked the other envelope. This is a paradox. My solution is to challenge the probability assignments of 0.5 to doubling X and 0.5 to halving X on swapping. This is hard to do and involves keen attention to the interpretation of probability in use. But it is a start to recognise that the twin is not in an identical epistemic situation and is using a different conditional probability. Whatever the amount he finds in his envelope, it is not X. It must, as far as you know, be X/2 or 2X, but you don't know which. Let's call it Y. You expect to get 1.25X on swapping; your twin expects to get 1.25Y. These calculations are conditional on different events and have different conclusions. The superficial similarity confuses matters, as does not specifying the real quantities involved but replacing them with variables.
In a series of envelope pairs containing (1,2), (2,4), (4,8) and (8,16), the best strategy in the long run is to swap whenever you get 1, 2, 4 or 8 and keep whenever you get 16. This has the consequence that you may have 4 and know that from your twin's point of view it will seem advantageous to swap, while it is also advantageous from your own point of view. But this is not paradoxical, since it only seems advantageous from your twin's point of view: if your twin has 8, it will seem to him as though there is a probability of 1/2 that he will get 16 on swapping, but from your point of view this is impossible. So there is no paradox in saying that it is an advantage from each perspective to swap. This reasoning seems to be too difficult for Nick to understand, but I doubt this is due to a lack of intelligence; it is just down to the fixity of his ideas, and a presumption that he already knows all there is to know.
Wouldn't the ML principle suggest that P(X = ½x) > P(X = 2x)?
Yes, this is a kind of way of rationalising the solution I've played with. You find £10 in your envelope. You assume therefore that £10 was the most likely sum to find in your envelope (ML principle). You translate this into the assumption that the mean amount in envelopes is £10. Take any distribution you like of pairs of envelopes (x,2x) and plot them out with £10 as the mean and you find that P(y = £5)> P(y = £20). Further reasoning shows that this ratio tends to 2 : 1. However my clumsy grasp of mathematical conventions forces me to leave the intelligent reader to work this out for themselves.
Jonny
Actually the reasoning is quite straightforward. Only the {10,20} and the {10,5} pairs are relevant. To make the mean envelope equal to 10, there have to be two {10,5} pairs for every {10,20} pair.
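A tiny check of that (Python):

pairs = [(5, 10), (5, 10), (10, 20)]                 # two low pairs for every high pair
envelopes = [x for pair in pairs for x in pair]
print(sum(envelopes) / len(envelopes))               # 10.0: the mean envelope is 10
holders_of_10 = [pair for pair in pairs if 10 in pair]
print(sum(1 for p in holders_of_10 if p == (5, 10)) / len(holders_of_10))   # 2/3 of the envelopes containing 10 are paired with a 5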
Yes the roughest estimate:
If 10 is from a discrete uniform(5,10) then p(x=10) = 1/5, and if 10 is from a discrete uniform(10,20) then p(x=10) = 1/10, so the ratio is 1/2.
And this doesn't depend on how you got the value 10.
In Bayesian thinking the problem lies in how you get the value 10 (a priori), and then you throw a coin to go left or right.
It really is impossible to select a value (10 in this realisation) from R+ in a democratic way (for example, there is no uniform(0, infinity), because the density dF must decrease heavily going to infinity, otherwise F would diverge).
The Bayesian strategy supposes you have a fortune wheel on [0,1] which gives y from uniform(0,1), and then an increasing probability function F that gives the value (10 in this example) as F^-1(y).
In the Bayesian method you can choose F "almost freely" = subjectivity.
In ML you estimate the distribution which maximises the probability p(x|Theta), with Theta ranging over the possible distributions. Again the Theta family is quite free. Still, this approach is LOCAL.
Yes, you don't really care a damn where the 10 comes from => not Bayesian.
I'll try to explain why the probabilists fail in this case.
Think of the function that gives the probabilities at the point X = x (x = 10 in this case).
The left limit would be higher (2/3) and the right limit lower (1/3): totally discontinuous! Surely a contradiction in the Bayesian bible. This cannot be fulfilled in Bayesian theory, because here it is a question of different kinds of distributions (cf. Dirac's delta).
A Bayesian is bound to his mathematical frame and he can't jump outside it. It would be like discarding Intelligent Design...
Think of a probability distribution which would have different left and right derivatives everywhere. It will not happen in a Bayesian landscape.
In fact it does not depend on the values 2x and x/2. The only correct strategy is: you don't need to switch the envelopes. The expected value is always x. For everyone else (the population of the earth minus 2) it can be taken as an axiom.
Thank you, anonymous, whoever you are.
I'm glad you mention the Dirac delta. It is my belief that degrees of belief are two-dimensional, the second dimension being certainty, or strength or weight of knowledge, which can only be given a utility measure, not a probability measure. If we have made a series of n observations, all of which have been Fa and none of which have been F~a, then we get a Dirac delta distribution. The question is what the odds are on a specific F observation being an Fa observation, given these observations. The obvious answer is that it depends on n: the higher n, the greater p(Fa)/p(F~a). But why should p(F~a) > 0? The only answer is that otherwise the maths doesn't work. This is a kind of phobia of certainty that plagues Bayesians. But on a Dirac delta distribution of probability we can see that the idea that, as n increases, the probability gets higher for an arbitrarily small interval around the mean, collapses, since the probability is 1 for all values of n and for any interval however small. So our strength of belief in hypotheses based on uniform data cannot be measured as a second-order probability function. In my view, it can only be measured by a pure utility function, which is roughly the amount you would risk on Fa before you prefer to fold your money and put it back in your pocket. This will be a direct function of n: simply the cost of n observations of F(a v ~a) = the degree of certainty that Fa will happen once.
I must take back a little:
There is a flaw in the Wiki version, in the sense that "indistinguishable" balls, one with $1 billion inside and the other with $2 billion inside, are not indistinguishable balls in the proper sense of indistinguishable balls.
So there must be yet another ½ in the expectation calculation, and it becomes ½ × 5/4 A (Wiki) < A, which you are already holding in your hand. So you never swap.
The Wiki game is a game with two push buttons with tape over them: pressing a button drops the printed sum into your bucket.
Compare it to the game where there are two buttons, one without tape and with the text "$10 billion" on it, and the other with tape over the unknown text.
The original game can be constructed with a probability function F, its inverse F^-1 and a random generator uniform(0,1), as far as the value $a seen in the envelope goes, plus a lottery for ½ or 2. (The lottery) and (the generation of $a) are independent operations, and that's one reason why it is useless to go to Bayes.
$a won't have an effect on the lottery.
I have the same opinion as you, that the minimal lottery would be between ½, ½, 2 in order to be democratic and NATURAL (nature puts the double weight on ½; maybe there wouldn't be any gravity if not).
Still I think that some idiot would and could use a coin.
Isn't it funny that you can manipulate this game as you will, and then there are a few million Bayesians who have lost their balls?
h.m.
I want to make it clear that this maximum likelihood rationalisation is only an afterthought. You seem to be suggesting that if you randomly generated a number between 1 and N and then tossed a coin to determine whether to halve or double this amount, and created two envelopes this way, then someone who picked one envelope would have a 50/50 chance of doubling their money by swapping. But this is not the case; the probability would still be 1/3 for doubling your money. How can this be so? Think it through. There are 2N possible pairs, {½i, i} and {i, 2i} for all i from 1 to N. This makes a total of 4N envelopes. The amounts in the envelopes will vary from ½ to 2N. From N to 2N there will only be even numbers. From ½N to N there will be odd and even numbers, and from 0 to ½N there will be odd numbers, even numbers and half values. So you are more likely to draw under ½N and less likely to draw over N. Now should you swap on inspection? Well, clearly if you knew the distribution and the value of N you should swap in all cases under N and keep in all cases over N. But let us suppose you don't know the unit or the value of N. If you went through the whole circuit swapping every time, you would end up with exactly the same money as you would if you kept the envelope every time. So the chance of doubling your money on swapping must be 1/3 and the chance of halving your money must be 2/3. Because we are interested in utility rather than frequency, we must count the times when we get over N on our first draw as more significant than the times when we get less than ½N. So though the frequency of occasions on which we double our money on swapping is 1:1, the probability of doubling our money is 1/3. Once we've got this concept of the unit i and N, we can set N as small as we like. Let us set N at 1. There are then two pairs of envelopes, (½N, N) and (N, 2N). If we don't know what N is, but we can set a machine to keep or swap over a large number of trials, we know that we will get the same money whichever we set it to. The swapping strategy will win ½N a quarter of the time, N half of the time and 2N a quarter of the time. The keeping strategy will get the same. The expected utility of both is 1.125N. So although swapping is the better strategy for 2 out of the 3 amounts you might see, the keeping strategy performs equally well overall. We know this is true for any value of N such that we do not know in advance what the value of N is. Looked at in utility terms there is no distribution that does not fit the 1/3, 2/3 model, provided we have no information about how the envelopes are filled.
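Since this kind of thing is easy to get wrong in one's head, here is a quick simulation of that set-up (Python). The particular N is my choice, and the last figure reads the 1/3 as a stake-weighted frequency, i.e. how much of the money you are holding sits on the occasions where a swap wins; that reading of the 1/3 is my own:

import random
random.seed(1)

N, TRIALS = 100, 200_000
kept = swapped = stake_on_wins = 0.0
wins = 0
for _ in range(TRIALS):
    i = random.randint(1, N)
    pair = [i / 2, i] if random.random() < 0.5 else [i, 2 * i]
    random.shuffle(pair)
    mine, other = pair
    kept += mine
    swapped += other
    if other > mine:
        wins += 1
        stake_on_wins += mine

print(round(swapped / kept, 3))        # ~1.0: always-swap earns the same as always-keep
print(round(wins / TRIALS, 3))         # ~0.5: you swap up on half the occasions...
print(round(stake_on_wins / kept, 3))  # ~0.333: ...but only a third of the money you hold is held on those occasions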
What I really meant is that all we need to analyse this problem are the quantities ½, 1 and 2, so the way you got to this point is irrelevant.
The way the reals are distributed can be analysed locally without going Bayesian.
Utility is (in my opinion) quite the correct way, and also the way I chose: in this case it is solvable, but in the Bayesian case it is not.
So if in the long run you get something other than p(½) = 2/3 and p(2) = 1/3, you should shoot the cashier.
What a relief, someone who understands at last!
Also I'm interested in your reference to "Natural". I believe utility to be a measure of value, which like time, is an objective feature of the universe, but which is relative. If you drop the axiom of unit measure, the second axiom, then as the change in utility from a perspective increases, the probability goes down. Of course time is involved as well, since the greater the time period, the more the utility can change, so the probability of any specific utility change is greater the larger the time period.
If we insist on the second axiom this effect will disappear. The unit probability space is given by the maximum change in utility. If the change in utility itself is the variable, then the probability space is undefined. This kind of natural example is easier to find in biology than in physics, and easier still in economics. The value of a planted apple seed in 2002 will be a fraction of the value of an apple tree in 2008. The fraction will increase or decrease over time, terminating at 1 or 0 as 2008 approaches. This presupposes that apples have intrinsic value. The value of an apple is natural since it is food, a form of energy of particular value to human beings. Units of exchange are often indexed to food, oil and labour. The more foodstuffs available, the greater the population, the greater the working population, the greater the productivity; and the world economy is directly related to oil prices, since the use of energy increases the productivity of labour.
The relationship of value to time and energy makes it seem very close, very close indeed, to physics. Ramsey and Turing use the concepts of "weight of evidence" and "weight of knowledge". Economists since the sixties have used the concepts of variance and risk, a non-probabilistic quantification of uncertainty. Over-specialisation and the scarcity of the generalist, mixed with a kind of cowardice in philosophy, have prevented all this being tied together. But the threads are there: Einstein's general theory, Ramsey's utility measure and probability measure, the decision theory paradoxes, the heuristics and biases results, the swift rise of the mathematics of options and futures markets, especially Black and Scholes, the old problems of scepticism, Gettier, and induction in the analysis of knowledge in epistemology, and the recognition that "S knows p" is interest relative. It all points to what Plato told us in the Republic: the good is the light by which we understand the truth.
It seems that the two envelopes problem has something in common with quantum measurement.
After you have seen that the first envelope contains the value 'a', you get the possible "realisations" f½ and f2. The true realisation comes out when you open the second envelope. Now if you think that nature puts the "metric" on the system, as in our case ½ on the left side and 1 on the right side, you get the "right" or "fair" probabilities 2/3 and 1/3.
This might give a deeper understanding of the "Schrödinger's Cat" and "quantum versus classical measurement" type problems. A metric is not bound to [0,1] like probability is, and neither is utility, if I remember correctly... Something like:
u = k/d: the greater the utility, the shorter the distance.
This is the exciting part. Talk of "realisation" can be carried directly over into value. When we "realise" the second envelope by opening it, we realise the value, as in it becomes part of our holdings. Expected utility is unrealised, since expected utility is not a part of our holdings. But if you had a bond that you were bound to sell at time t, then the value of that bond would be a function of the distribution of prices at time t. You could see this as a probability curve. The value of the bond would be the mean of this curve minus the risk, which would be some function of the shape of the curve (e.g. if it were a normal distribution you could give the standard deviation). If you *knew* the price of the bond at time t, then this price would have probability 1 before time t. In this case the bond's value would be *realised* and could be treated as cash even though t is still in the future.
In the same way, (though my physics is rudimentary), when a probability wave function collapses, the position of the particle is *realised*, in other words becomes fact. If however, some advance in physics could predict the position of the particle before hand, the position would be realised as soon as its probability reached 1.
What does all this mean for Schrödinger's cat? If the theory predicts P(cat dead) = 1/2 and P(cat alive) = 1/2, then the utility of the situation relative to a market in which the theory was common knowledge would be U(Schrödinger's cat) = ½{U(cat dead) + U(cat alive)}. This might seem anti-realist, but only to those who are anti-realist about value. But how can anyone be anti-realist about the difference in value between a dead cat and a living cat?!?! The difference is interest relative, admittedly. But if we already accept that time is relative, this shouldn't worry us. The paradoxicality of Schrödinger's cat comes from the fact that we feel intuitively that U(cat alive) - U(cat dead) is very large for the cat. I guess I'm with Einstein on this one and just assert that the cat would know, therefore P(cat alive) = 1 (or 0 in the bad case for the cat).
If we had a shipment of N Schrödinger's cat boxes to be opened, or *realised*, at time t1, then the value of the shipment would be ½N·U(cat alive) + ½N·U(cat dead). I can't quite think it through at the moment, but it can't be too difficult to change this into an envelope paradox.
I've done some work on this and I'm wondering if anyone is still here and if it's worth still continuing this, or if this discussion is carried on somewhere else.
At the outset though I can say I am impressed with Beth's answers; I believe she has the right answer. I actually came up with the same solution on another forum a while ago and was shouted down too.
The fact that this might eventually lead on to solving the double-slit experiment is quite exciting.
Hello Mikey,
I am intrigued by your comment. I am not sure what you mean by the double-slit experiment, but I am very interested. I don't know much about physics but I did wonder about some physical counterpart to the two envelope paradox. I was thinking that the utility variable could be replaced by energy, distance, time, mass or velocity. If you would like to post a short explanation on Blogginthequestion, send about 500 words and your real name to jonnyblamey@yahoo.com. Then I'll post it up with your name and date at the top, which should suffice for copyright purposes. Essentially my mature view is that probability theory became "risk blind" in the 1930s. But magnitude is as important as ratio. Conditional probabilities should be given a mass, or utility, index. The probability across different magnitudes will be inversely proportional to the utilities: P(H | E) at stake U1 will be twice P(H | E) at stake U2 when U2 is twice U1.
Hi Jonny. Thanks for the reply. Wow, there is still someone here.
I'm not that far advanced as to actually put physical quantities to utilities.
The Young double-slit experiment is something amazing: basically, if you peek at which slit a particle went through (originally light, but now shown for any particles), the pattern on the back wall changes from an interference pattern to a simple scatter pattern. It's been reproduced many times. I just can't help thinking that maybe it's like peeking at the contents of the envelope, and all of a sudden the probabilities change.
http://www.youtube.com/watch?v=DfPeprQ7oGc&feature=related
http://www4.ncsu.edu/unity/lockers/users/f/felder/public/kenny/papers/quantum.html
Anyway, I'm just conjecturing at this early time, but it would be fun to explore it more. Probabilities are involved because they theorise that waves give a probability pattern to try and explain the interference pattern.
Also, I am a computer programmer and a gambler and I worked out the answer to the 2 envelopes problem via studying gambling problems and just got a feel for money over time. I got quite comfortable with the fact that you can't make money out of nothing and the probability of swapping to a larger envelope must be 1/3.
I will keep in touch.
OK, firstly I’m going to show that somebody actually put the effort in and read the first few pages…
ALF said…
“If I elect to KEEP, I will end up with x for certain, but if I SWAP I will end up with either 2x or 1/2 x. Therefore it is rational to SWAP since I’ve ½ chance of ending up with ½ x and a ½ chance of ending up with 2x, making the expected utility of SWAP 1¼x. Therefore I should SWAP, and what is more I should pay up to ¼ x to SWAP.
“
In response, BETH said…
“That’s absurd since then you should also pay up to ¼ x to swap back on the same reasoning. It is absurd because you end up with an intransitive preference. You have reasoned that you should prefer E1 to E2 and E2 to E1.Your mistake is in assigning a single value to x. E1 contains a sum of money and you have elected to call that sum x. If we call the higher sum “A” and the lower sum “B”, then p(x = A) = ½ and p(x = B) = ½ . So we can calculate the utility of KEEP as being ½ A + ½ B, which is ¾ A or 1 ½ B. We don’t know which x is, but we know that x is either A or B.
“
All ok up to here.
Next we have
“So this gives us 2 possibilities.
1. p (E1 = A and x = A). = ½
The Pay off for KEEP is (A = ½ B = x)
The Pay off for SWAP (½ A= B = 1/2x)
“
All ok here. I know that 1/2x means (1/2)x or x/2, but all is ok here.
Next we have
“2. p (E1 = B and x =B). = ½
The Pay off for KEEP is (½ A= B = x)
The Pay off for SWAP (A = ½ B = 2x)
“
However, this is not correct. A is never equal to ½ B, it’s always equal to 2B.
So it should say…
The Pay off for SWAP (A = 2B = 2x)
The mistake was transferred below…
“And the total pay off for SWAP is:
½ (½ A= B ) + ½ (A = ½ B ) = (3/4 A = 1 ½ B)
“
It should say…
And the total pay off for SWAP is:
½ (½ A= B ) + ½ (A = 2B ) = (3/4 A = 1 ½ B)
Strangely though, the result in that line, being (3/4 A = 1 ½ B), was correct.
ALF's next two words were "Very clever". Indeed it was very clever to have wrong working with the correct answer. Unfortunately for Alf, he didn't seem clever enough to actually read and check the work.
As I said before though, BETH’s overall solution is in my mind the correct answer; the probability of swapping to a higher amount must be 1/3.
So now you know I actually read this stuff. I only read where I think the writer was sane though. I didn’t read too much of that clown that thought that ½ of x is 1/x though, if they show they are on drugs I stop reading.
Good evening.
Thanks again Mikey, it's quite incredible that no one noticed that mistake before, considering how desperate some commenters seemed to be to put me down. I think it shows that there is a gap between what I think I am saying and what I actually am saying. I was trying to keep three variables separate: "the amount you find in your envelope" E1; "the amount in the highest envelope" A; and "one third of the total amount in both envelopes" x.
I think in the passage you spotted I got "A" temporarily mixed up with "E1".
Anyway, I've since abandoned trying to label variables since it is difficult to write and even more difficult to read. I very much appreciate you making the effort.
I've decided that the real issue is to do with whether you count transactions as discrete packets and take this to be the equiprobable unit, or take transactions to be quantities of loss or gain in some continuous utility medium and take a measure of utility to be the equiprobable unit.
So if you are interested in gambling you can get a feel for this by taking the difference between a million £1 bets on some half chance repeatable event (coin toss) and a single £1 000 000 bet on a single coin toss. The expected utility is the same in both cases, but the variance is much higher in the second case.
Likewise, if you placed a series of £1 bets and £2 bets (at even odds), and you lost half the bets and won half the bets, then your average win would be £1.50 per winning bet, assuming that the wins and losses were evenly distributed across the two sizes of bet.
But if you lost all the £2 bets and won all the £1 bets, then the average win would only be £1 per winning bet.
I find this intuitively obvious, but hard to state formally without going back to Ramsey's probability theory. The probability of winning any bet selected at random is 1/2 but the probability of winning in terms of continuous utility is 1/3. This is because for each £1, £2 pair you lose 2/3 of the money.
From this we can generalise: as long as the bets come in (x, 2x) pairs, where "x" is a completely free variable, the same probabilities will hold (1/2 for discrete wins : losses, 1/3 for continuous gains : losses).
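Here is a sketch of those two ways of counting (Python), for the unlucky case where every x bet wins and every 2x bet loses; the particular stakes are arbitrary:

# Even-odds bets in (x, 2x) pairs; suppose the x bet wins and the 2x bet loses every time.
bet_pairs = [(1, 2), (3, 6), (10, 20), (7, 14)]
results = [(x, +x) for x, _ in bet_pairs] + [(y, -y) for _, y in bet_pairs]   # (stake, net result)

wins = sum(1 for _, r in results if r > 0)
print(wins / len(results))                          # 0.5: counted bet by bet, half the bets are winners

staked = sum(stake for stake, _ in results)
staked_on_wins = sum(stake for stake, r in results if r > 0)
print(staked_on_wins / staked)                      # 1/3: counted pound by pound, a third of the money staked is on winners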
So to the two-slit experiment: the analogy is that we have two ways of looking at the quantities: 1. as discrete particles; 2. as a continuous substance.
Just as a grope in the dark, since I don't know what the quantities are: when just one slit is open, this is equivalent to KEEP, and the expected utility (position) is identical to the actual utility. But when two slits are open, although the expected utility remains the same, there is a degree of variance due to uncertainty, so the outcome is a wave rather than a discrete package.
"Anyway, I've since abandoned trying to label variables since it is difficult to write and even more difficult to read."
I think you did an excellent job creating the variables E1, E2, A, B and x. You need to define variables clearly to be able to use them and you did.
I've seen a lot of people who call themselves mathematicians who create flimsily defined variables and come out with outlandish solutions, but you haven't done that.
The two envelopes problem isn't anywhere near done yet; there are still many things to solve, and one of the most interesting is the side bet on whether one will swap to a higher or lower envelope.
Unfortunately I have a mentally taxing job, so I don't do much outside of that, and moving forward with me will take some time. That doesn't mean I won't do it; I will, just slowly, when I have some brain power left over from work.
I can tell a story though about when I sold papers at the races on Saturdays 30 years ago for pocket money.
At the end of the day we would go to a room and count our takings. I had counted my 1c pieces when Willy (the newsagent) came past and said "how much have you made today, roughly?" I said "well, I have $3 in 1c, or 300 1c pieces, so I guess I will probably have 300 2c pieces, 300 5c, 300 10c and 300 20c, which is $3 + $6 + $15 + $30 + $60, a total of roughly $114."
Willy said..."nah, if you have $3 in 1c then you will have $3 also in each of 2c, 5c, 10c and 20c coins, so you have roughly $15."
Willy had estimated that from experience: he had done the banking every week and could tell when it was someone's first time.
Being young, I had actually argued with him at first and told him it couldn't be; but when it was all added up it turned out that he was right.
It was only when the two envelopes problem was presented to me, and I too solved it as giving a 1/3 probability of getting the higher amount, that it hit me that the solution was actually the proof of why I had half as many 2c coins as 1c coins. The 2c coins are twice as valuable and therefore occur half as often.
And even today, now that the lowest coin in circulation is 5c, in my change jar there are twice as many 5c coins as there are 10c coins, and so on right up the line to 50c. I don't count the $1 coins and above because I raid the jar for them, so the count for those would be inaccurate.
Until this two envelopes problem showed up, I had no proof as to why this phenomenon is the way it is.
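Willy's rule of thumb can be put in a couple of lines of Python (the denominations and the $3-per-denomination figure are just the ones from the story above):

# Roughly equal dollar value in each denomination means the coin counts fall off
# in inverse proportion to face value: 300 x 1c, 150 x 2c, 60 x 5c, 30 x 10c, 15 x 20c.
cents_per_denomination = 300                     # "$3 in 1c" and, per Willy, in each of the others too
for face in (1, 2, 5, 10, 20):
    print(f"{face}c: about {cents_per_denomination // face} coins")
print(f"total: ${cents_per_denomination * 5 / 100:.0f}")   # roughly $15, as Willy said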
Good evening.
Well that's very pleasing, Mikey. When I started writing a paper with my solution for a journal, I was trying to find empirical examples of this principle that frequency is inversely proportional to utility. Your coins example is perfect. It is better, though, to use non-human examples, and I thought of biomass. You would need a third parameter, fitness. The fitness of a species would determine its total biomass. Then the frequency of the species would depend on the biomass of an individual. So there are many more humans than whales, many more rats than humans, many more cockroaches than rats, many more flies than cockroaches, many more bacteria than flies, and so on.
Once you accept this frequency principle, which is kind of a priori, then the fact that the probability of the two envelopes containing a sum of 3x is twice the probability of their containing a sum of 6x is straightforward. Essentially it is because there are two 3x units for every 6x unit.
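One way to write this down as a Bayes calculation (taking the frequency principle as a prior in which a pair (a, 2a) is half as probable as (a/2, a), i.e. P(pair) proportional to 1/a; that 1/a form is my gloss on the principle, not something argued for above): for any amount v you see,

\[
\frac{P(\text{pair}=(v,2v)\mid \text{see } v)}{P(\text{pair}=(v/2,\,v)\mid \text{see } v)}
= \frac{\tfrac12 \cdot \tfrac1v}{\tfrac12 \cdot \tfrac2v} = \frac12
\;\;\Rightarrow\;\;
P(\text{double}\mid v)=\tfrac13,\quad P(\text{halve}\mid v)=\tfrac23,
\]
\[
E[\text{SWAP}-\text{KEEP}\mid v] \;=\; \tfrac13(+v) \;+\; \tfrac23\Bigl(-\tfrac{v}{2}\Bigr) \;=\; 0 .
\]

So on that prior the 1/3 : 2/3 split and indifference to swapping sit together, which is just the result argued for here.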
The side bet issue is interesting. The side bet would do well at odds better than 1:1, unless the stake size was indexed to the amount in the first envelope, in which case the bettor should not bet at odds worse than 1:2. The problem is that since the 1930s, the stake size has disappeared from probability theory. If you accept the principle we have been talking about, this is a massive mistake, since it means that probability theory can't deal with real situations unless the stake is kept arbitrarily fixed.
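For what it's worth, here is one way to read the 1:1 vs 1:2 claim in a line each (s is a fixed stake, o the payout per unit staked, x the lower amount in the pair, and the 1/2 is the chance of picking the lower envelope first; all notation mine):

\[
\text{fixed stake: } \tfrac12\, s\,o - \tfrac12\, s = 0 \;\Rightarrow\; o = 1 ,
\qquad
\text{indexed stake: } \tfrac12\,(x\,o) - \tfrac12\,(2x) = 0 \;\Rightarrow\; o = 2 .
\]

So the fixed-stake side bettor is happy at anything better than evens, while a bettor whose stake is the amount found needs a win to pay twice the stake before the bet is fair.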
I think the Bio Mass example may be a bit complicated and you should try and keep it simple. You are just out to show real life examples of rarity vs value.
It shouldn't be too hard to find other examples of frequency vs value. Stamps and coins come to mind quickly, but out of season fruit or old magneto telephones could be used.
Money basically is everywhere so it's hard to bypass it.
You know, I suspect that if they only put the same number of 5c coins into circulation as 10c coins, the 5c coins would become worth 10c, as vendors would find them hard to come by and would keep running out of them when giving change. They would then hoard them and pay as much attention to keeping a 5c coin as they would a 10c coin.
Like I said the two envelopes problem is far from finished with.
Sometimes it still doesn't sound right that one is more likely to open the highest value the first time.
Side bets are also a function of who is offering the bet. If the game show host offers the odds of even money that you will open the highest envelope first, I suspect it's different to an audience member offering the bet.
Should you take slightly less than even money from the host, before you open the envelope, on the bet that you will open the highest envelope first? If you are likely to pick the higher envelope first at 2:1 then you would take the bet; however, I suspect you shouldn't take this bet.
Yet after you open the envelope, if someone from the audience offers the bet, then you should probably take it.
You'll need to explain a bit more about the bet stake size being indexed to the amount in the envelope. I sort of see what you are getting at, but it would need to be laid out more for me to understand.
"Sometimes it still doesn't sound right that one is more likely to open the highest value the first time."
Yes, this isn't right. This is where the side bet issue comes in too. You are equally likely to pick the A envelope as the B envelope. So the probability of picking the highest amount straight away is 1/2. Therefore the probability of getting the highest amount when you swap back is also 1/2. This is what makes the paradox seem unsolvable.
But the point is that when you pick the lower amount, you are betting at a lower stake than when you pick the higher amount. So the bets you lose are going to be at twice the stake of the bets you win. This is true whatever distribution of envelopes comes your way.
To make this clear, suppose you had two pairs of envelopes: (£1 000 000, £2 000 000) and (£1, £2). There were equal numbers of each pair, so the probability of getting (£1 000 000, £2 000 000) is equal to the probability of getting (£1, £2). In this case it is clear that you should be indifferent between swapping and keeping, even though the probability of getting the higher amount first is 1/2.
In order to make this game work, we have to do ballistic swapping: you have to say that you will swap before you look in the envelope, otherwise you will only swap when you get £1 000 000 or £1. In the ballistic case, you have to calculate the odds of doubling your money as 1/3. Why? Because for every £1 000 000 you win through doubling, you lose £1 000 000 through halving, and the losing swaps are made when you are holding the larger amount, so at twice the stake of the winning swaps. Basically, if you pay the amount you find in the first envelope as your bet, then you will lose twice as much as you will win at even odds. But someone placing a side bet can place the same stake on the winning bets as on the losing bets, so this magnitude effect disappears and the probability becomes 1/2 again.
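Here's a little Python sketch of those last two sentences, under the same (x, 2x) assumption as before (the amounts and trial count are arbitrary): one bettor stakes the amount found in the first envelope, the other places a fixed £1 side bet on the same outcome.

import random

indexed_won = indexed_lost = 0.0   # money won / lost by the bettor who stakes what he finds
fixed_wins = trials = 0            # the fixed-stake side bettor just counts wins
for _ in range(100_000):
    x = random.uniform(1, 100)             # arbitrary (x, 2x) pair
    first = random.choice((x, 2 * x))      # the amount found in the first envelope
    if first == x:                         # ballistic swap goes up
        indexed_won += first               # even-odds win pays the stake
        fixed_wins += 1
    else:                                  # ballistic swap goes down
        indexed_lost += first              # the 2x stake is lost
    trials += 1
print(indexed_lost / indexed_won)   # ~2.0 : he loses about twice as much as he wins
print(fixed_wins / trials)          # ~0.5 : the fixed-stake side bettor wins about half the time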
I'm afraid this is the clearest I can make it. I think it is intrinsically difficult, so it is just best to let Ramsey's formula do the work. Unfortunately, it is Kolmogorov's axioms that mathematicians use, not Ramsey's. So mathematicians can't solve it.
Thanks for the above discussion, including the interesting idea of the 2/3 probability. There is a similar discussion here:
http://schwitzsplinters.blogspot.com/2007/05/two-envelope-paradox.html
I don't think you are more likely to get the higher value the first time... rather, if you get a very high value, then the cost of switching down is very high and the chances of switching up are not so good. I don't see why you throw out the traditional notion of forming a nonzero prior distribution first...
One solution which is simple, and does rather heavily lean on a nonzero prior, is this: switching when you find a very low value is obviously a good plan. But if you see a value so large that it surprises you, then it might be a bad idea to switch. If the amount X in the envelope surprises you because it is very large, that's because you guess that the average envelope contains much less. The fact that you saw X is consistent with one of the following:
> The envelopes are better than I thought, and the prizes are X and X/2.
> The envelopes are much better than I thought, and the prizes are X and 2X.
If your "surprise curve" says that odds of those two conditions are, say 9:4 in favor of the first case, then you should not switch, as the costs of switching down exceed the benefits of switching up. Now, suppose you are surprised to see $3000, but you figure "I would be more surprised to see twice as much, but not twice as surprised," then you are assuming that the distribution of
"what the value could be" is very flat. But, in fact, that distribution is so flat that it doesn't have a finite integral. That is: the distribution of "what a value could be" has to be
rather tight in order to be integrable. You believe in integrability if you believe "the probability that the prize is between 1 and a million coins is greater than 0.00001" or any
similar statement, with the parameters "1", "a million" and "0.00001" replaced by anything.
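Just to check the 9:4 figure (writing p for the probability that the X you see is the lower prize; the notation is mine): switching pays only if

\[
p\,X \;-\; (1-p)\,\frac{X}{2} \;>\; 0 \quad\Longleftrightarrow\quad p > \frac13 ,
\]

and odds of 9:4 in favor of the (X, X/2) case give p = 4/13, which is below 1/3, so not switching is indeed the right call there.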
I think that this paradox is compelling because you should definitely switch from a median value. This is balanced in the expected-value calculation by the great sums you would lose by switching from a high value.
Thanks Odatafan,
I have carried on developing my stake size variation principle into a comprehensive thesis since this blog post. I believe my solution still holds.
The solution, to summarize, is that if you decide to swap, the probability of doubling your money is 1/3, whereas the probability of halving your money is 2/3.
You say that you don't think it is more likely that you get the higher figure first time. Presumably you think this is a consequence of the above. But it does not follow. The first time around you have, if anything, a virtually 0 probability of getting any rigidly designated sum of money. This is because you don't know how much money is in either envelope. However, getting "the higher sum" has a probability of 1/2. This is why you should remain indifferent after you open the envelope. Your discussion of how surprised you would be by the sum of money in the envelope is interesting, but I fear it just imports information from outside the original problem. The same paradox would arise if you used units that weren't translated into real money until after the decision had been made. This would screen out any background knowledge about what sums of money are likely to be packed into the envelopes.
Jonny Blamey
Hi John,
It's a nice idea to use "abstract numbers" in the envelopes; those numbers to be translated into real money later, after the decision is made. Touché.
You're right that I looked at "...a subject who is indifferent between the options KEEP and SWAP has a degree of belief 1/3 that E2 contains twice the amount in E1 and degree of belief 2/3 that E2 contains half the amount in E1," and imagined that subject saying: "I 1/3-believe that I have the lower value and 2/3-believe that I have the higher value" and reasoned to "I probably have the higher value." You're quite right that I think that one follows from the other: if the probability of doubling your money is 1/3, then the probability that you have the lower value is 1/3. Why would it not follow?
I am glad you asked, because it is easy to forget that we are talking about epistemic probabilities. The short answer is that the probabilities change when you open the envelope and view the money inside. We can see the truth in this if we imagine not opening the envelope. In this case, it is clear that there is no argument for swapping back since you are in no better position than you were before.
The confusion arises because the reasonable assignment of 1/2 to choosing either envelope turns into the different assignment that the remaining envelope has an equal chance of having twice the money as the envelope you chose and an equal chance of having half the amount. Now we have three values: 1/2x, x and 2x. When you made the original choice, of these 3 values it was necessary that you chose the middle value.
I don't feel I am being very clear. But it is not a paradox for nothing! A way to look at it is that we sometimes refer to the values in the envelopes as variables, and sometimes as specific amounts. So for example, suppose the envelopes contained £10 and £20. If you choose the envelope with £10 and swap, then the probability that you will get £20 is 1 and the probability that you will get £5 is 0.
OK... the idea that, before you look, the probabilities P-pre of switching up or down are equal, but after you look the probabilities P-post are not the same, seems plausible; but it seems to me to be an extra assumption, which might translate into a restriction on the distribution of the envelopes. Yet I would calculate P-pre as the integral over all x of the conditional probability P-post, given x. The expected value can be integrated the same way:
The expected value of switching before looking is the sum (or integral), over all values x that you could see in the envelope, of the expected value of switching after seeing x.
I wonder what assumption I'm making about the distribution of a single envelope.
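In symbols (my notation, just restating the sentence two paragraphs up, with Delta for the gain from switching and p(x) for the prior density of the amount seen):

\[
E[\Delta] \;=\; \int_0^{\infty} E[\Delta \mid X = x]\; p(x)\, dx .
\]

If the naive switching argument's E[Delta | X = x] = x/4 held for every x, the right-hand side would be strictly positive for any proper prior on positive amounts, while the blind symmetry argument makes the left-hand side zero (at least when the amounts have a finite mean); so the two can only be reconciled by giving up the "x/4 for every x" step, which I take to be roughly the assumption in question.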
Maybe I should expand on that. I judge it to be consistent that:
A0. you should be indifferent to switching blind (before seeing one envelope), and
A1. There is a notion of "small" and "large" for which you should switch from low values, and you should not switch from high values.
Your argument replaces A1 with
A2: No matter what x you see in the envelope, you should still be indifferent to switching.
I find A2 and A1 inconsistent, since A1 says "sometimes switch" and A2 says "be always indifferent." I believe that A0+A2 is consistent. And if it is consistent, then A0 is consistent. So you are right that this "solves" the paradox of A0.
But I think that in your post you have more or less assumed A2 and then you go about finding the implications of A0+A2. You are quite right to explore whether A0+A2 would lead to a contradiction.
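A toy case where A0 and A1 hold together (pairs and probabilities invented purely for illustration): two equally likely pairs, (1, 2) and (2, 4), with Delta again the gain from switching.

\[
E[\Delta_{\text{blind}}] = \tfrac14(+1) + \tfrac14(-1) + \tfrac14(+2) + \tfrac14(-2) = 0 \quad (\text{A0}),
\]
\[
E[\Delta \mid \text{see } 1] = +1, \qquad
E[\Delta \mid \text{see } 2] = \tfrac12(+2) + \tfrac12(-1) = +\tfrac12, \qquad
E[\Delta \mid \text{see } 4] = -2 ,
\]

so you switch from the low and middle values and keep the top one (A1), yet blind switching is worth nothing (A0).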
Reading your blog has led me to see this problem more abstractly, and I'm sure that this is the more correct way to see it; extra information can "resolve" it too easily.
One question I should ask: what are the values of the bills you allow to appear in the envelopes? Reals, natural numbers, 2^z for any integer z?
Hi Odatafan. Hmm yes, the envelope paradox is a bit of a whirlpool, each revolution sucking me further in.
I tried playing it with my nieces, but it doesn't really work using real money, because the only way to trick the chooser is to put in much more money than they expect.
The problem with your idea of swapping low and keeping high is that it is hard to say where the boundary is. However high the amount, double that amount is not really that much higher, whereas half that amount is not really that much lower. So for example, if you found £1 000 000, you might think this is a high number and keep. But is £500 000 more likely than £2 000 000? Not really! And if you don't swap you are losing a chance to get £2 000 000 at great odds (if you believe the chances are 50%).
So, on distributions in one envelope. Think of an infinitely divisible currency. Think of all the values between 0 and 1. Now, every point has a lower partner within the interval, meaning that you can halve every value in the interval and still remain in the interval. But only half the values have an upper partner. Therefore if you select a number at random from this interval and then, knowing only that the other value is half or double and is within the interval, assign a probability to double or half, then you should come out with 1/3 to double and 2/3 to half, since half the values will be fifty-fifty and the other half will be lower. Since any bounded sum will fit into some interval, this works for any range, apart from the case where you allow infinite wealth. A similar argument works in the discrete case.
Again poorly expressed. I wrote this in a paper, and it helps if you draw two lines, one twice the length of the other, and imagine all the possible pairs. The lower region is denser than the upper region.
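A quick Python sketch of the two-lines picture (the uniform choice of the lower value is just an assumption for the sake of the drawing): pool all the envelope values from many (y, 2y) pairs and the lower half of the range does come out denser than the upper half.

import random

# Pairs (y, 2y) with the lower value y drawn from (0, 0.5), so every value lies in (0, 1).
# Pool all the envelope values and compare the density of the lower half of the interval
# with the density of the upper half.
values = []
for _ in range(100_000):
    y = random.uniform(0, 0.5)     # assumed distribution of the lower value, for illustration only
    values.extend((y, 2 * y))
lower = sum(v < 0.5 for v in values)
upper = len(values) - lower
print(lower / upper)   # comes out around 3: the lower region is markedly denser than the upper one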