Researchers from Australia have taken a step toward resolving a seemingly simple yet unsolved paradox known as the "two-envelope" problem. They’ve worked out a new strategy that can enable a player to beat the game in terms of increasing their payoff. The strategy could have applications in optimizing gains in investments and other areas...The researchers explained that the strategy emerges from recent advances in two-state switching phenomena that are emerging in the fields of physics, engineering, and economics.
Hmmm. Everyone can be made better off in a two-person, zero-sum exchange? Please tell. I think the paradox comes from not looking at the true state space. It is funny that these guys think they figured out how to get blood out of a stone via the 'emerging fields of physics, engineering and economics'. Asserting the solution they do is like asserting a mixed strategy is dominant for the Monty Hall problem. I suspect they get their results via jerry-rigged simulations with particular a priori payoff distributions.
The problem is simple to state:
In the two-envelope paradox, a player must choose between two envelopes, one of which contains twice as much money as the other. The player can open the envelope they choose, and then they have the option of switching envelopes. The other envelope, of course, has either twice the money or half the money of the first envelope, but the player does not know which.
It may seem that, given an amount $X in your envelope, switching generates 0.5*2*X + 0.5*0.5*X = 1.25*X, which is 25% more than the $X you have. But if both players apply such logic, how can that be rational? They both have the same positive expected return from switching in a zero-sum game (one's gain is the other's loss).
There are many known solutions to this problem, some of which are inconsistent, but I think the simplest is as follows.
You might think you have $X in your envelope and can trade it for $2X or $0.5X, giving you an expected gain of New Envelope minus Current Envelope, or 0.5*($2X + $0.5X) - $X = +$0.25X. However, you must consider, symmetrically, that you may be the 'other guy', so your envelope contains the $2X or the $0.5X. In that case, your expected gain is the opposite: trading gives you a new envelope worth $X while giving up the random 0.5*($2X + $0.5X), generating an expected loss of $X - 0.5*($2X + $0.5X) = -$0.25X. As you are in the first or second case with equal probability, the expected value of switching is zero: 0.5*(+$0.25X) + 0.5*(-$0.25X) = 0.
The seeming paradox comes from thinking you have the envelope with the $X that is either doubled or halved, ignoring the chance that you could be the one who starts out with the $2X or the $0.5X. If one player has $X, you have a 50% chance of being that player, a 25% chance of having the $0.5X, and a 25% chance of having the $2X. When you look in your envelope and see $10, you can't assume X = $10: it could be that X = $20 or X = $5, conditional upon seeing $10 in your envelope. Weighting everything probabilistically leads to no increase in value from exchange.
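To make this concrete, here is a minimal sketch in Python (nothing from the paper, just the arithmetic above) that enumerates the two symmetric cases and shows the expected gain from switching nets to zero:

# Enumerate the two equally likely situations for a fixed X.
X = 10.0

# Case 1: your envelope holds X; the other holds 2X or 0.5X with equal odds.
gain_if_you_hold_X = 0.5 * (2 * X - X) + 0.5 * (0.5 * X - X)        # +0.25*X

# Case 2: you are the 'other guy': you hold 2X or 0.5X, switching gives you X.
gain_if_other_holds_X = 0.5 * (X - 2 * X) + 0.5 * (X - 0.5 * X)     # -0.25*X

# You are in either case with probability 0.5, so the expected gain is zero.
print(0.5 * gain_if_you_hold_X + 0.5 * gain_if_other_holds_X)       # 0.0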
12 comments:
interesting comment and an interesting find. as the original piece points out, this puzzle can be viewed as another version of "volatility pumping".
volatility pumping seems to be a favorite homework assignment for cover, luenberger, fernholz, etc... it's a focus on the difference between arithmetic returns and geometric returns, often augmented with some references to entropy to fancy up the argument. in this article the arithmetic return is 25% ((.5+2)/2 = 1.25) and the geometric return is 0% (sqrt(.5*2) = 1).
volatility pumping is also just another version of rebalancing. the pay-off to rebalancing is greatest when the expected returns of all the assets in an investment universe are the same, all the volatilities are high and equal, and all of the correlations are zero. the pay-off to rebalancing falls off rapidly when any of these three conditions is relaxed.
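for concreteness, a rough sketch of the volatility pumping arithmetic (my own toy python example, assuming the 50/50 double-or-halve asset from this puzzle plus a zero-return cash position, rebalanced to 50/50 every period):

import random

random.seed(0)
PERIODS = 2_000

buy_hold = 1.0      # 100% in the asset
rebalanced = 1.0    # 50% asset / 50% cash, rebalanced every period

for _ in range(PERIODS):
    r = 2.0 if random.random() < 0.5 else 0.5   # asset doubles or halves
    buy_hold *= r
    rebalanced *= 0.5 * 1.0 + 0.5 * r           # cash leg returns 0%

# per-period geometric growth factors
print("buy & hold:", buy_hold ** (1 / PERIODS))    # ~1.00, i.e. 0% geometric return
print("rebalanced:", rebalanced ** (1 / PERIODS))  # ~1.06, the 'pumping' gain

the arithmetic mean return of the asset is +25% per period but its geometric mean is 0%; the rebalanced mix picks up roughly 6% a period geometrically, which is the whole trick.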
I agree with Anon 10:01--this is a confusion of an arithmetic/geometric return type.
Let's say you open an envelope and it contains $100. That means the total amount of money involved is either $150 or $300. If you should win 50% of the money on average, that means that over many trials you should win either $75 (if the system started with $150) or $150 (if the system started with $300). The original $100 doesn't have anything to do with it.
And of course you should take the other envelope--because you have a chance of getting an extra hundred bucks at the cost of losing fifty. If someone gave me a 50-50 chance of winning $100 or losing $50, I would hope I would take him up on it. There's no "paradox" about it.
Anyone who thinks this problem has anything to do with the real world has spent too much time in academia.
I think you are failing to appreciate some of the subtlety here.
Remember that you are allowed to OPEN YOUR ENVELOPE before you decide. Let's say you find $100 in your envelope. If you then still believe that there is an even chance that you got the higher envelope (conditional on having seen $100), then by standard Bayesian decision theory you should switch.
But what SHOULD your posterior probability of having the higher envelope be, after you see $100? That depends on your priors. It turns out that there do exist proper priors that lead you to ALWAYS want to switch. The only twist is that these priors have infinite expected value.
See here for more:
http://gowers.wordpress.com/2008/02/03/probability-paradox-ii/
(I assume you accept Bayesian decision theory, despite its limitations...if you have some theory of decision making beyond this, I'd like to hear about it.)
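For concreteness, here is the standard example of such a prior (often attributed to Broome): the envelope pair is (2^n, 2^(n+1)) with probability (1/3)(2/3)^n for n = 0, 1, 2, ... This prior has infinite mean, and conditional on any amount you observe, Bayes says switching has positive expected gain. A quick sketch in Python:

from fractions import Fraction

def prior(n):
    # P(pair is (2**n, 2**(n+1))) = (1/3) * (2/3)**n, which has infinite mean
    return Fraction(1, 3) * Fraction(2, 3) ** n

def expected_gain_from_switching(k):
    # You observe 2**k in your envelope.
    if k == 0:
        return Fraction(1)    # you surely hold the smaller amount; gain 2 - 1
    # Observed 2**k is either the larger member of pair n = k-1 or the smaller
    # member of pair n = k; either envelope of a pair is handed to you with prob 1/2.
    w_larger = prior(k - 1) * Fraction(1, 2)
    w_smaller = prior(k) * Fraction(1, 2)
    p_smaller = w_smaller / (w_smaller + w_larger)   # works out to 2/5
    return p_smaller * 2 ** k - (1 - p_smaller) * 2 ** (k - 1)

for k in range(6):
    print(2 ** k, expected_gain_from_switching(k))   # strictly positive for every amount

So the 'always switch' conclusion is consistent Bayesian reasoning, but only because the prior's mean is infinite; with any prior of finite mean, switching cannot look strictly better at every observed amount.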
I guess it is more complicated than I thought (if Terence Tao and Peter Shor think it nontrivial, it is). But it still must be silly to switch. I guess I'm more comfortable with the 'no rational, zero-sum gain' argument than the mathematicians, who see this as fungible.
I think the key is that some 'finite' distribution generates a variable X, which is then doubled (2X). These are then randomly assigned to envelopes. The key is that the higher numbers will tend to be the doubled ones. For example, if X~U[0,1], then clearly anyone holding more than 1 is guaranteed to halve, not double, by switching, so such people would not trade. But then, knowing that those with more than 1 would not trade, I would not trade if I had between 0.5 and 1, because any counterparty still willing to trade must hold the smaller amount. So anyone with more than 0.5 won't trade. But now it doesn't make sense to trade if you have between 0.25 and 0.5, for a similar reason.
I think for any prior distribution with finite expected value one finds this infinite regress.
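A quick simulation of the U[0,1] example makes the point (my own sketch): conditional on seeing an amount above 1 you are certain to lose by switching, and unconditionally switching gains nothing.

import random

random.seed(1)
N = 200_000
low, high, overall = [], [], []

for _ in range(N):
    x = random.random()                  # X ~ U[0,1]; the envelopes hold X and 2X
    mine, other = (x, 2 * x) if random.random() < 0.5 else (2 * x, x)
    gain = other - mine                  # what switching would earn
    overall.append(gain)
    (high if mine > 1 else low).append(gain)

avg = lambda g: sum(g) / len(g)
print("saw <= 1:", round(avg(low), 3))      # positive on average
print("saw  > 1:", round(avg(high), 3))     # always negative: it must be the doubled amount
print("overall :", round(avg(overall), 3))  # ~0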
This is not a (Bayesian) probability problem. It can be restated without a probability component, & contrary to Ed's statement in his 11:09 post, switching is not obvious. It is also not necessary to open the envelopes - indeed the original problem did not refer to seeing the contents.
If by construction the envelopes contain x & 2x, this means if you stick you gain x (because you hold 2x) or you lose x (because you hold x). To swap means you gain x (because you hold x) or you lose x (because you hold 2x). Hence the potential losses & potential gains are not different but equal.
Ed's analysis is incorrect because if you hold y & you swap, you gain y only if you held the lesser amount. If you held the greater amount, your loss is not half y, because the amount you initially held is NOT y, since y is defined as the LESSER amount. The amount you held initially was 2y, so your loss is y. Hence your possible loss is y, which IS equal to your potential gain.
You have a 50% chance of winning the greater amount, regardless of whether it's your first choice or your second choice. Looking in the envelope is irrelevant. So is the payoff.
Suppose one envelope had 10 or even 100 times more money. Nothing changes, you still have a fifty-fifty chance at it.
Which is different than the Monty Hall problem. Monty isn't picking the booby prize door at random to expose, so he's offering to exchange TWO doors for the contestant's one door.
I agree this is not a Monty Hall equivalent as that "puzzle" relies on the additional data received in round 2 about where the booby prize is not situated. This puzzle does not rely on any further data & does not require knowledge of what's in the initial envelope.
As I said in my 4:12 post, the probabilities are irrelevant. Even without probabilities there is a contradiction viz:
You have either x or 2x. Stick & you could win x or lose x. Swap & you could win x or lose x.
You have y. Stick & you could lose half y or win y. Swap & you could win y or lose half y.
Under scenario x the potential gains & losses are equal. Under scenario y the potential losses are less than the gains.
Irrespective of the 50/50 or any other probabilities, the puzzle is logical rather than probabilistic: how do we explain the contradiction in potential outcomes between scenarios x & y?
My impression is that the "controversy" arises from the repeated character of the game. Opening one envelope, for a single-iteration game, gives you no information. Opening several does.
I see some parallels to this problem.
I am not sure this analysis holds for any distribution by which the $ amount per envelope is determined by the "game organizer", but my intuition is that it should hold for symmetric distributions. It would make for a fun Monte-Carlo simulation project to waste a weekend over, but does not strike me as the innovation of the year...
The above should probably, in the interest of clarity, read "Opening several, for a multiple-iteration game, does." instead of "Opening several does."
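In case anyone actually wants to waste that weekend, here is a bare-bones version of the Monte-Carlo (my sketch, with three arbitrary choices for how the organizer draws the smaller amount). Always-switch and never-switch come out the same in expectation whatever the generating distribution:

import random

random.seed(2)
N = 100_000

def run(draw_smaller):
    # Average winnings for never-switch vs always-switch under one generating rule.
    stick = swap = 0.0
    for _ in range(N):
        x = draw_smaller()               # the organizer draws the smaller amount
        mine, other = (x, 2 * x) if random.random() < 0.5 else (2 * x, x)
        stick += mine
        swap += other
    return stick / N, swap / N

for name, draw in [
    ("uniform(0,1)", lambda: random.random()),
    ("exponential ", lambda: random.expovariate(1.0)),
    ("lognormal   ", lambda: random.lognormvariate(0.0, 1.0)),
]:
    stick, swap = run(draw)
    print(name, "stick = %.3f  swap = %.3f" % (stick, swap))   # essentially equal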
I agree this is not a Monty Hall equivalent as that "puzzle" relies on the additional data received in round 2 about where the booby prize is not situated.
No, it doesn't. It's strictly that Monty is offering to swap two doors for one. Revealing the booby prize is merely designed to obscure that; otherwise there would be no 'suspense'.
Under scenario x the potential gains & losses are equal. Under scenario y the potential losses are less than the gains.
No, you're making it too complicated. Think of the larger amount as 'winning', and the lesser amount as the booby prize. It's just like flipping a coin. The amount of the payoff is irrelevant.
Suppose you have the choice of two envelopes, and start to reach for the one on your left. Do you enhance your chances by stopping and instead picking the one on your right?
It's just completely obvious that if you don't open the envelope, switching and sticking in this trivial two-envelope problem are the same.
If you open the envelope, you can only change the calculation if you think that the amount of money in the envelope carries some information about whether it's the bigger or smaller amount. Perhaps it does, but the question is psychological rather than mathematical. If you get $50 (or $5, or $500...), it's likely to be the smaller amount if you have a straightforward quizmaster. If you get $200, it's probably the larger amount.
It IS key that one booby prize is revealed in the Monty Hall problem.
This is because the first round probability of winning is 1/3 but the additional knowledge in round 2 improves the probability to 2/3 if you swap.
If the booby prize were not revealed, you would have a 50/50 choice between 2 new doors with a combined winning probability of 2/3, i.e. a 1/3 chance of picking the winning door, which is only what you already have with your first-round pick. Showing the booby prize behind 1 of the 2 remaining doors means all of that 2/3 winning chance sits in the one remaining door. This is twice as good as the 1/3 chance you had when you selected your original door in round 1, so you should swap.
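For anyone who wants to check the 1/3 vs 2/3 claim by brute force, here is a plain simulation of the standard Monty Hall setup (my sketch, assuming Monty always opens a random door that is neither your pick nor the prize):

import random

random.seed(3)
N = 100_000
stick_wins = swap_wins = 0

for _ in range(N):
    prize = random.randrange(3)
    pick = random.randrange(3)
    # Monty opens a door that is neither your pick nor the prize.
    monty = random.choice([d for d in range(3) if d != pick and d != prize])
    swap_to = next(d for d in range(3) if d != pick and d != monty)
    stick_wins += (pick == prize)
    swap_wins += (swap_to == prize)

print("stick wins:", stick_wins / N)   # ~1/3
print("swap wins :", swap_wins / N)    # ~2/3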