There are numerous web pages devoted to discussion and analysis of the Monty
Hall problem. I hope to add something new to the discussion with an analysis
of a more general problem.
For those of you living in a mathematical cave, I'll briefly explain the
problem and say very little about the controversy. In 1990, Parade magazine
published an "Ask Marilyn" column in which Marilyn vos Savant replied to
a reader's question:



"Suppose you're on a game show, and you're given the choice of three doors:
Behind one door is a car; behind the others, goats. You pick a door, say No. 1,
and the host, who knows what's behind the other doors, opens another door, say
No. 3, which has a goat. He then says to you, 'Do you want to pick door No. 2?'
Is it to your advantage to take the switch?"

Marilyn replied that the answer was yes. A big uproar followed, and
mathematicians from across the country attacked her. I'll spare
you the details; you can do a web search on that if you're interested.
I'm more interested in the mathematics.
First off, let's answer the original question. Should you switch? The correct answer
is that there is not enough information in the problem as stated to answer the question.
Marilyn assumed that the host always opens a door and shows the contestant a
goat, regardless of whether the contestant has already picked the "right" door
(the one with the car). So let's first answer the question based on that
assumption. Should you switch? The answer is yes. Some people find this counterintuitive,
especially if Marilyn's assumption is not stated explicitly. Even with the assumption
stated, why wouldn't it just be a 50/50 proposition?
It is pretty easy to get past the counterintuitiveness with the following simple explanation:
There are basically two cases: you've either picked the right door to start with, or you haven't.

Case 1: You've picked the right door to start with. If you switch, you'll lose.
Case 2: You've picked the wrong door to start with. The host shows you the other wrong
door, so if you switch, you'll clearly win.
Now, Case 1 only happens 1/3 of the time, and Case 2 happens 2/3 of the time. Therefore, if
you follow the strategy of always switching, you'll win 2/3 of the time. Whereas, if you
never switch, you'll of course get the right door only 1/3 of the time. If that's not
clear enough, take a look at the following table. In this and all tables in this
analysis, "EV" refers to "expected value."
| Case  | Original door choice | Probability | Don't Switch Outcome | Switch Outcome | Don't Switch EV | Switch EV |
|-------|----------------------|-------------|----------------------|----------------|-----------------|-----------|
| 1     | Right                | 1/3         | Win                  | Lose           | 1/3             | 0         |
| 2     | Wrong                | 2/3         | Lose                 | Win            | 0               | 2/3       |
| Total |                      | 1           |                      |                | 1/3             | 2/3       |
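The table above can also be checked empirically. Here is a minimal Monte Carlo sketch (the function name and structure are my own) of Marilyn's assumption, where the host always reveals a goat and always offers the switch:

```python
import random

def play(switch, trials=100_000, seed=0):
    """Win fraction under Marilyn's assumption: the host always opens a
    goat door and always offers the switch."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # The host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d not in (pick, car))
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials
```

Running `play(True)` hovers near 2/3 and `play(False)` near 1/3, matching the table.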
Now, you may be asking yourself: why would the host give you an option to switch if you've
picked the wrong door to begin with? An excellent question. Which leads to the crux of this
analysis. In general, I'd like to analyze the problem given a variety of host behaviors. In
particular, here are the questions I plan to answer:
If the host only gives you the switching option when you've picked the right door, should you switch?
Of course not!
If the host only gives you the switching option when you've picked the wrong door, should you switch?
Of course!
If the host only gives you the switching option 50% of the time, regardless of whether you've picked the right door, should you switch?
Yes.
This is really just a variation of Marilyn's original assumption. In fact, if the host makes the decision
without regard to whether you've picked the right door or not, then you should always switch, no matter what
the percentage is. But for clarity's sake, let's analyze the 50% question. We can break this down into 4 cases:
| Case  | Original door choice | Host gives switch option | Probability | Don't Switch Outcome | Switch Outcome      | Don't Switch EV | Switch EV |
|-------|----------------------|--------------------------|-------------|----------------------|---------------------|-----------------|-----------|
| 1a    | Right                | Yes                      | 1/6         | Win                  | Lose                | 1/6             | 0         |
| 1b    | Right                | No                       | 1/6         | Win                  | Win (can't switch)  | 1/6             | 1/6       |
| 2a    | Wrong                | Yes                      | 1/3         | Lose                 | Win                 | 0               | 1/3       |
| 2b    | Wrong                | No                       | 1/3         | Lose                 | Lose (can't switch) | 0               | 0         |
| Total |                      |                          | 1           |                      |                     | 1/3             | 1/2       |
Since the expected value for the switch strategy (1/2) is greater than the expected value for
the don't-switch strategy (1/3), it is to your benefit to use the switch strategy.
Now let's look at the more general case. Suppose the host, regardless of your pick, gives you the switch option X percent of the time (X is a number from 0 to 100). Then the table
looks like this:
| Case  | Original door choice | Host gives switch option | Probability      | Don't Switch Outcome | Switch Outcome      | Don't Switch EV | Switch EV       |
|-------|----------------------|--------------------------|------------------|----------------------|---------------------|-----------------|-----------------|
| 1a    | Right                | Yes                      | X / 300          | Win                  | Lose                | X / 300         | 0               |
| 1b    | Right                | No                       | (100 - X) / 300  | Win                  | Win (can't switch)  | (100 - X) / 300 | (100 - X) / 300 |
| 2a    | Wrong                | Yes                      | 2X / 300         | Lose                 | Win                 | 0               | 2X / 300        |
| 2b    | Wrong                | No                       | 2(100 - X) / 300 | Lose                 | Lose (can't switch) | 0               | 0               |
| Total |                      |                          | 1                |                      |                     | 1/3             | (X + 100) / 300 |
If you follow the don't switch strategy, you'll win 1/3 of the time. If you always switch, your
expected value is (X + 100) / 300, which is greater than 1/3 (unless X = 0, in which case you
are never given the option of switching). Therefore, you should follow the strategy of always
switching when given the opportunity.
If the host gives you the switching option 100% of the time when you pick the right door, but only 2/3 of the time when you've picked the wrong door, should you switch?
Yes.
Let's use the table approach again:
| Case  | Original door choice | Host gives switch option | Probability | Don't Switch Outcome | Switch Outcome      | Don't Switch EV | Switch EV |
|-------|----------------------|--------------------------|-------------|----------------------|---------------------|-----------------|-----------|
| 1a    | Right                | Yes                      | 1/3         | Win                  | Lose                | 1/3             | 0         |
| 1b    | Right                | No                       | 0           | N/A                  | N/A                 | 0               | 0         |
| 2a    | Wrong                | Yes                      | 4/9         | Lose                 | Win                 | 0               | 4/9       |
| 2b    | Wrong                | No                       | 2/9         | Lose                 | Lose (can't switch) | 0               | 0         |
| Total |                      |                          | 1           |                      |                     | 1/3             | 4/9       |
Since 4/9 > 1/3, the strategy of switching whenever given the opportunity is best in this case.
What strategy should the host employ, such that you do not know whether you should switch or not (switching would be a 50/50 proposition)?
This seems like a good question to ask, to keep the game show as entertaining as possible. For the host strategy to work like this,
we want to create a situation where the expected value for the don't-switch strategy is the same as the expected value for the switch strategy.
Look at the above table. If we change the 2/3 probability to something else, we can affect the switch-strategy expected value without affecting
the don't-switch-strategy EV. If we give the switch option X percent of the time when the contestant has picked the wrong door, we want
1/3 = (2/3) * (X / 100). Solving for X, we get X = 50. The following table bears that out:
| Case  | Original door choice | Host gives switch option | Probability | Don't Switch Outcome | Switch Outcome      | Don't Switch EV | Switch EV |
|-------|----------------------|--------------------------|-------------|----------------------|---------------------|-----------------|-----------|
| 1a    | Right                | Yes                      | 1/3         | Win                  | Lose                | 1/3             | 0         |
| 1b    | Right                | No                       | 0           | N/A                  | N/A                 | 0               | 0         |
| 2a    | Wrong                | Yes                      | 1/3         | Lose                 | Win                 | 0               | 1/3       |
| 2b    | Wrong                | No                       | 1/3         | Lose                 | Lose (can't switch) | 0               | 0         |
| Total |                      |                          | 1           |                      |                     | 1/3             | 1/3       |
Thus, if we always give the option of switching when the contestant picks the winning door, and give them the option 50% of
the time when they pick the wrong door, then they won't know whether to switch or not.
Are there other host strategies which have the same property? Suppose that if they pick the right door, we'll give them
the switching option X percent of the time; if they pick the wrong door, we'll give them the switching option Y percent of
the time. Then the table looks like this:
| Case  | Original door choice | Host gives switch option | Probability      | Don't Switch Outcome | Switch Outcome      | Don't Switch EV | Switch EV            |
|-------|----------------------|--------------------------|------------------|----------------------|---------------------|-----------------|----------------------|
| 1a    | Right                | Yes                      | X / 300          | Win                  | Lose                | X / 300         | 0                    |
| 1b    | Right                | No                       | (100 - X) / 300  | Win                  | Win (can't switch)  | (100 - X) / 300 | (100 - X) / 300      |
| 2a    | Wrong                | Yes                      | 2Y / 300         | Lose                 | Win                 | 0               | 2Y / 300             |
| 2b    | Wrong                | No                       | 2(100 - Y) / 300 | Lose                 | Lose (can't switch) | 0               | 0                    |
| Total |                      |                          | 1                |                      |                     | 1/3             | (100 + 2Y - X) / 300 |
We want 1/3 = (100 + 2Y - X) / 300. Solving, we find we want X = 2Y. X = 100, Y = 50 is the solution we found above, but any
similar host strategy will work (X = 50, Y = 25, etc.). To maximize the frequency of giving the switch option and keep the game
entertaining, we'll want the (X = 100, Y = 50) solution.
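A quick simulation bears out the indifference point. This sketch (the parameter names x and y are mine, matching the table) lets the host's offer rate depend on whether the initial pick is right:

```python
import random

def informed_host(switch, x=100, y=50, trials=100_000, seed=0):
    """Win fraction when the host offers the switch x% of the time after
    a right initial pick and y% of the time after a wrong one."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # The host's offer rate depends on whether the pick is right.
        offer_pct = x if pick == car else y
        if switch and rng.randrange(100) < offer_pct:
            opened = next(d for d in range(3) if d not in (pick, car))
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials
```

With the default x = 100, y = 50, both `informed_host(True)` and `informed_host(False)` come out near 1/3, so the contestant gains nothing by switching.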



If the host gives the switch option 2X percent of the time when the contestant originally picks the right door, and
X percent of the time when the contestant originally picks the wrong door, then the contestant will not know whether
or not to switch.

To round out the discussion of this problem, if you know X and Y (suppose you do a statistical analysis of
previous shows), then how do you tell whether to switch or not? Going back to the old table,
it only makes sense to switch if (100 + 2Y - X) / 300 > 1/3. Solving the inequality, we find
we only want to switch if X < 2Y. So, for example, if the host gives us the option of switching
60% of the time (X = 60) when we've picked the right door, and 25% of the time (Y = 25) when we've picked the
wrong door, then we shouldn't switch (since 60 is not less than 2 * 25 = 50).



If the host gives the switch option X percent of the time when the contestant originally picks the right door, and
Y percent of the time when the contestant originally picks the wrong door, then the contestant should only switch
if X < 2Y.
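The rule boxed above is one line of code. A sketch (the function name is mine):

```python
def should_switch(x, y):
    """Switch exactly when x < 2y, where x (resp. y) is the percentage of
    the time the host offers the switch after a right (resp. wrong) pick."""
    return x < 2 * y

assert not should_switch(60, 25)   # the example above: 60 is not < 50
assert should_switch(100, 100)     # Marilyn's assumption: always switch
assert not should_switch(100, 50)  # the indifference point: no advantage
```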

Let's extend this even further. Suppose that there are N doors instead of 3 (N >= 3),
still with just one car and a goat behind each of the other N - 1 doors. Then what
are the odds? In this case, if we pick the wrong door and switch, we have a 1 / (N - 2)
chance of picking the right door, since we can eliminate two of the doors (the one we picked,
and the one the host opened). Then the table looks like this:
| Case  | Original door choice | Host gives switch option | Probability              | Don't Switch Outcome | Switch Outcome      | Don't Switch EV   | Switch EV                               |
|-------|----------------------|--------------------------|--------------------------|----------------------|---------------------|-------------------|-----------------------------------------|
| 1a    | Right                | Yes                      | (X / N) / 100            | Win                  | Lose                | (X / N) / 100     | 0                                       |
| 1b    | Right                | No                       | ((100 - X) / N) / 100    | Win                  | Win (can't switch)  | ((100 - X) / N) / 100 | ((100 - X) / N) / 100               |
| 2a    | Wrong                | Yes                      | (Y(N - 1) / N) / 100     | Lose                 | Win                 | 0                 | (Y(N - 1) / (N(N - 2))) / 100           |
| 2b    | Wrong                | No                       | ((100 - Y)(N - 1) / N) / 100 | Lose             | Lose (can't switch) | 0                 | 0                                       |
| Total |                      |                          | 1                        |                      |                     | 1 / N             | ((100 - X) + Y(N - 1)/(N - 2)) / (100N) |
We should switch if ((100 - X) + Y(N - 1)/(N - 2)) / (100N) > 1/N. Simplifying the inequality,
we want to switch if Y(N - 1)/(N - 2) > X.
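The N-door totals row can be sanity-checked the same way as before, with exact arithmetic (the function name is mine):

```python
from fractions import Fraction

def switch_ev_n(n, x, y):
    """Always-switch EV for n doors and one car, where the host offers the
    switch x% of the time after a right pick and y% after a wrong pick.
    From the totals row: ((100 - x) + y(n - 1)/(n - 2)) / (100n)."""
    return (Fraction(100 - x) + Fraction(y * (n - 1), n - 2)) / (100 * n)

# n = 3 reduces to the earlier tables:
assert switch_ev_n(3, 100, 100) == Fraction(2, 3)  # Marilyn's assumption
assert switch_ev_n(3, 100, 50) == Fraction(1, 3)   # the indifference host
# Switching beats the 1/n stay EV exactly when y(n - 1)/(n - 2) > x:
assert (switch_ev_n(10, 80, 75) > Fraction(1, 10)) == (Fraction(75 * 9, 8) > 80)
```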
I leave it as an exercise to the reader to solve the case for N doors, with cars behind M
of the doors (0 < M < N - 1).
