Department of Mathematics, Statistics
and Computer Science
Wim Ruitenburg's Fall 2007 MATH025.1001
At several places in the book we find topics related to probability.
We choose some, and add further details below.
Three Doors
This problem is also referred to as the Monty Hall problem.
There is a situation sketch in the book, Section 1.1, pages 10-11, and some
further comments on page 518.
The game host shows us three doors, behind one of which there is a big prize.
We are asked to pick one of the three doors.
Once our choice is final, the chosen door is opened; if the prize is behind it, we have won.
Obviously, our chance of winning is one-third, or 1/3.
What happens with our chances when the game show host adapts the rules a bit?
Suppose we have picked a door, say door one.
Then the game host opens another door, say door two, and reveals that there is
no prize behind door two.
Now we are offered the options of sticking with door one, or switching to door
three.
Should we switch, or should we stay?
What are our chances for door one, or for door three?
- Here is the standard answer.
There is a 1/3 chance that the prize is behind door one.
There is a 2/3 chance that the prize is behind door two or behind door three.
So although the show host has ruled out door two, the prize is still behind
doors two or three with chance 2/3.
The difference is that we also know that door two covers no prize.
So the chance for the prize being behind door three is 2/3.
So switch to door three!
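The standard answer can be checked empirically. Here is a short simulation sketch in Python (the function name and trial count are ours, not from the book); it assumes the host always opens a no-prize door other than our pick.

```python
import random

def monty_hall(trials=100_000, switch=True):
    """Host always opens a no-prize door other than our pick;
    we then stay or switch to the remaining closed door."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a door that is neither our pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(monty_hall(switch=True))   # close to 2/3
print(monty_hall(switch=False))  # close to 1/3
```

The two printed fractions come out near 2/3 and 1/3, matching the argument above.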
- Many people have an intuitive resistance to the answer above.
This may be because of `hidden' assumptions.
For example, we assume that the three doors are not made of glass, for
otherwise we could see the prize through them.
We also assume that the show host is telling the truth.
This and similar restrictions are usually perceived as obvious from the
context.
Other assumptions may be less obvious.
- Suppose the show host always makes the observation about there
being another door without a prize behind it, for all participants.
Then the argument above applies.
- Suppose the show host decides at random, independently of whether our
choice is correct, whether to make the observation about there being another
door without a prize behind it.
Then the argument above still applies.
- Suppose the show host only makes the observation about there
being another door without a prize behind it, for participants whose original
choice was correct.
Then the above argument is obviously false: never switch.
- Suppose the show host makes the observation about there being another
door without a prize behind it, for the first time.
Then it depends on whether or not the show host is trying to trick us.
We may not know.
Let us change the problem in a seemingly irrelevant way as follows.
As before, we are asked to pick one of the three doors.
Once our choice is final, the chosen door is opened; if the prize is behind it, we have won.
Obviously, our chance of winning is 1/3.
Suppose we have picked door one.
Then a storm blows through the hall, and one of the doors is randomly blown
open.
Suppose that door two blew open.
Behind it there is no prize.
Now we are offered the options of sticking with door one, or switching to door
three.
Should we switch, or should we stay?
- Answer: It makes no difference.
The chance that the prize is behind door one is 1/2.
The chance that the prize is behind door three is 1/2.
A door flew open randomly, so the search has been reduced to door one and
door three, without preference.
It is only coincidence that the door with the prize did not fly open.
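The storm variant can also be simulated. The sketch below (our own, not from the book) keeps only the trials that match the story: a random door blew open, it happened to be door two, and it happened to hide no prize.

```python
import random

def storm(trials=300_000):
    """We picked door one (index 0); a storm opens one of the three
    doors at random.  Keep only trials matching the story: door two
    (index 1) blew open and revealed no prize."""
    stay = switch = kept = 0
    for _ in range(trials):
        prize = random.randrange(3)
        blown = random.randrange(3)
        if blown != 1 or prize == 1:
            continue
        kept += 1
        stay += (prize == 0)    # prize behind our door one
        switch += (prize == 2)  # prize behind door three
    return stay / kept, switch / kept

print(storm())  # both fractions close to 1/2
```

Unlike the host, the storm carries no information about where the prize is, so staying and switching each win about half the time.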
The Other Child
Section 7.1, pages 518-519, sketches the problem of whether or not the
other child is a girl.
Actually, the authors instead asked whether the other child is also a boy.
Suppose a parent has two children.
- If the oldest is a boy, what are the chances that the other is a girl?
- If, instead, we only know that one of the children is a boy, what are
the chances that the other is a girl?
Again there are hidden assumptions.
In this case, the `reasonable' hidden assumption is that each next child has an
equal and independent chance of being a boy or a girl.
Assuming this, the two chances are 1/2 and 2/3.
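Under that hidden assumption, both chances can be read off by enumerating the four equally likely families, as in this short sketch (ours, for illustration):

```python
from itertools import product

# Each child is independently a boy (B) or girl (G) with chance 1/2,
# so the four (oldest, youngest) combinations are equally likely.
families = list(product("BG", repeat=2))

# Case 1: the oldest is a boy.
oldest_boy = [f for f in families if f[0] == "B"]
p1 = sum(f[1] == "G" for f in oldest_boy) / len(oldest_boy)

# Case 2: we only know that at least one child is a boy.
some_boy = [f for f in families if "B" in f]
p2 = sum("G" in f for f in some_boy) / len(some_boy)

print(p1, p2)  # 1/2 and 2/3
```

The second case conditions on three families (BB, BG, GB), of which two contain a girl, hence 2/3.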
Spinning Wheels
Section 8.1, page 641, illustrates some spinning wheels.
We are interested in finding the expected values of such spinning wheels.
When is the game fair?
When is the game unfair in our favor?
When is the game unfair in favor of the wheel owner?
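The book's wheels are in its figure; as an illustration with a made-up wheel (the sector fractions, payoffs, and cost below are our assumptions, not the book's), the expected value is the probability-weighted sum of the payoffs:

```python
# A hypothetical wheel (not the one in the book): each entry is
# (fraction of the wheel, dollars paid out when the pointer lands there).
sectors = [(1/2, 0), (1/4, 2), (1/4, 6)]

expected = sum(p * payoff for p, payoff in sectors)
cost = 2  # assumed cost per spin

# expected > cost: unfair in our favor; expected < cost: favors the
# wheel owner; expected == cost: the game is fair.
print(expected, cost)  # 2.0 and 2, so this particular game is fair
```

Comparing the expected payout per spin with the cost per spin answers all three questions above.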
Winning a Losing Game
Suppose you play a color at roulette.
You can pick either black or red.
Your chance for doubling your money is 18/38, your chance for losing your money
is 20/38.
- If you play the game 3800 times by each time betting 1 dollar on a
color, you should expect a return of about 1800 times 2 dollars plus 2000 times
0 equals 3600 dollars.
So you lose about 200 dollars.
The casino wins 200 dollars.
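The arithmetic in this bullet can be written out as a few lines of Python (a sketch of the expected-value computation, not a simulation):

```python
plays, bet = 3800, 1

# In 3800 plays we expect about 18/38 of them, i.e. 1800, to win.
expected_wins = plays * 18 // 38       # 1800
expected_return = expected_wins * 2 * bet
print(expected_return)                 # 3600 dollars back
print(plays * bet - expected_return)   # 200 dollars expected loss
```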
- Instead of repeatedly playing for the same amount, let us adapt our
gambling strategy as follows.
- The first time you bet some amount, say 5 dollars, on a color.
If you win, you get 10 dollars, and you stop with an overall profit of 5
dollars.
Otherwise, your total loss is 5 dollars, and you continue as follows.
- The second time you double your bet to 10 dollars.
If you win, you get 20 dollars, and you stop with an overall profit of 20 minus
15 is 5 dollars.
Otherwise, your total loss is 15 dollars, and you continue as follows.
- The third time you double your bet to 20 dollars.
If you win, you get 40 dollars, and you stop with a profit of 40 minus 35 is 5
dollars.
Otherwise, your total loss is 35 dollars, and you continue as follows.
- The fourth time you double your bet to 40 dollars.
If you win, you get 80 dollars, and you stop with a profit of 80 minus 75 is 5
dollars.
Otherwise, your total loss is 75 dollars, and you continue as follows.
- The fifth time you double your bet to 80 dollars.
If you win, you get 160 dollars, and you stop with a profit of 160 minus 155 is
5 dollars.
Otherwise, your total loss is 155 dollars, and you continue as follows.
- The sixth time you double your bet to 160 dollars.
If you win, you get 320 dollars, and you stop with a profit of 320 minus 315 is
5 dollars.
Otherwise, ......
- The seventh time you double your bet to 320 dollars.
If you win, you get 640 dollars, and you stop with ......
- The eighth time you double your bet to 640 dollars. ......
The chance that you keep losing is very small, so there is a very big chance
that you will win 5 dollars.
The strategy is very dangerous, because there is a small risk of a gigantic
loss when there is no more opportunity to double your bet.
Mean and Median
Suppose there are 50 people in class, 48 registered students, 1 teacher aide,
and the teacher.
- Suppose that the 50 of us combined get (`earn') 50000 dollars each week.
Then our mean, or average, income is 1000 dollars each week.
- Suppose that, additionally, the teacher gets 49000 dollars each week,
and the teacher aide gets 1000 dollars each week.
Each of the 48 registered students gets 0 dollars.
Then the median, or middle, income is 0 dollars each week.
For other illustrations of the difference between average and middle, see the
book, pages 586-587.
The Birthday Paradox
The book discusses the birthday paradox on pages 530-534.
During the class we had about 50 people.
- What is the chance that some of us have the same birthday?
No, it is not about 0.15.
- As the book shows, it is relatively easier to first compute the chance
that all our birthdays are different.
The chance that we all have different birthdays is only about 0.03.
The chance that at least two of us share a birthday is therefore about 1 -
0.03 = 0.97.
- Easy exercise:
Suppose the class had 367 people.
What would be the chance that some of us have the same birthday?
What are the hidden assumptions about the distribution of birthdays of people?
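Under the hidden assumption that birthdays are uniform over 365 days and independent, the computation (and the easy exercise) can be checked with a short sketch:

```python
from math import prod

def p_all_different(n, days=365):
    """Chance that n people all have different birthdays,
    assuming birthdays are uniform and independent."""
    return prod((days - k) / days for k in range(n))

print(p_all_different(50))      # about 0.03
print(1 - p_all_different(50))  # about 0.97
print(p_all_different(367))     # 0.0: a shared birthday is certain
```

With 367 people one factor of the product is zero, so some shared birthday is guaranteed, by the pigeonhole principle.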
The Best of Three Dice
Suppose we have three fair but unusual dice, as follows.
Instead of having six sides with values 1, 2, 3, 4, 5, and 6, their sides are
allowed to have other values.
We have a red, a white, and a blue one.
The red one has numbers 2, 2, 2, 2, 6, and 6 on its six sides.
The white one has numbers 1, 1, 5, 5, 5, and 5 on its six sides.
The blue one has numbers 3, 3, 3, 3, 3, and 3 on its six sides.
When we list the six sides in tables, we have a situation like this:

    red:    2  2  2  2  6  6
    white:  1  1  5  5  5  5
    blue:   3  3  3  3  3  3
The following game is played between two players.
The first player picks one of the dice.
Then the second player picks another.
Next, both players throw their dice simultaneously.
Whoever throws the higher number, wins.
With the three dice above, draws are not possible.
Question: Which one of these three dice is best?
- Let us compare dice two at a time.
- Red versus white.
When we toss red versus white, there is a 1/3 chance for red to throw 6, and so
certainly win.
If red throws 2 (chance 2/3), there still is a 1/3 chance for white to throw 1,
and red still wins.
So the chances for red to beat white are (1/3) + (2/3) x (1/3) = 5/9.
So red is better than white.
- White versus blue.
Blue always throws a 3.
There is a 2/3 chance that white throws a 5, and wins.
Otherwise, white throws a 1 and loses.
So the chances for white to beat blue are 2/3.
So white is better than blue.
Are we done yet?
If red is better than white, and white is better than blue, shouldn't red be
the best?
Let us check to make sure.
- Blue versus red.
Blue always throws 3.
If red throws 2 (chance 2/3), then blue wins.
Otherwise red wins.
So the chances for blue to beat red are 2/3.
So blue is better than red.
- There is no `best' one among the three dice.
On average, red beats white, white beats blue, and blue beats red.
In the game above, it is good to be the second player.
No matter which color the first player picks, the second player can always
pick another color which is better than the first.
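All three pairwise comparisons can be checked exactly by enumerating the 36 equally likely outcomes of each pair of throws, as in this sketch (ours):

```python
from itertools import product

red   = [2, 2, 2, 2, 6, 6]
white = [1, 1, 5, 5, 5, 5]
blue  = [3, 3, 3, 3, 3, 3]

def p_beats(a, b):
    """Exact chance that die a throws a higher number than die b."""
    wins = sum(x > y for x, y in product(a, b))
    return wins / (len(a) * len(b))

print(p_beats(red, white))   # 5/9
print(p_beats(white, blue))  # 2/3
print(p_beats(blue, red))    # 2/3
```

Each die loses to the one before it in the cycle red, white, blue, red, which is why the second player can always pick a better die.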
Check Swapping
Suppose we have a stack of checks with denominations of 1 dollar, 2
dollars, 4 dollars, 8 dollars, 16 dollars, 32 dollars, 64 dollars, and 128
dollars.
(We rather arbitrarily stop at 128; you may change the problem by extending the
pile of possible checks with many more doublings.)
We also have two blank envelopes.
In each of the envelopes I put one check such that, with equal probability, I
either pick 1 and 2, or 2 and 4, or 4 and 8, or 8 and 16, or 16 and 32, or
32 and 64, or 64 and 128.
You can see the two envelopes.
You do not know which two consecutive checks I picked.
You know that one envelope contains twice the amount of the other, but you
don't know which one.
- Suppose we have the two blank envelopes in front of us.
Now pick one of the envelopes and open it, say we pick the left one.
Let us assume it contains a check for 4 dollars.
So now you know the amount in your envelope.
What is in the other envelope?
With chance 1/2 it contains a 2 dollar check, and with chance 1/2 it contains
an 8 dollar check.
So if you were allowed to switch checks, then your average expected return, the
expected value, is 1/2 times 2 plus 1/2 times 8 equals 5 dollars, which
is 25 percent above what you have with your original choice.
So switch if you are allowed to!
If your originally chosen check shows any amount other than 128 dollars then,
by a similar argument, switch and on average expect a profit of 25 percent, or
more.
- In the scenario above, if you had only opened the right envelope, with
its 2 dollar check or 8 dollar check, you would also have wanted to switch,
with an expected value 25 percent above what you have with your original
choice.
So if two parties each had picked up and opened different envelopes, then they
both want to switch with the other, and expect a 25 percent profit on average.
- Suppose we start all over, and again have two blank envelopes in front of
us as before, without knowing about the contents except that one of the checks
has twice the value of the other.
Now pick one of the envelopes, say you pick the left one.
Do not open it.
We mark the chosen check by replacing the ? by an x, where x stands for the
amount on the check, still unknown to you.
Would it benefit you to switch envelopes?
It is not likely that you picked the check with the maximal amount of 128
dollars.
Assuming that, you may expect that the other check has with probability 1/2 an
amount of at least x/2, and with probability 1/2 an amount of 2x.
So on average at least an amount of (1/2)(x/2) + (1/2)(2x) = 5x/4.
So it appears reasonable to switch.
Let us choose a switch.
We mark the change in choice by replacing x by ?, and by putting a y on the
newly chosen envelope, where y stands for the amount on the check, still
unknown to you.
You may expect an increase in return by about 25 percent.
However, would it benefit you to switch envelopes again, and return to your
first choice?
If we believe in the switch from the check with x dollars to the check with y
dollars, then by the same reasoning we ought to believe in switching back.
You end up where you started, with the check marked x.
How can this be?
The reason is that we carelessly assumed that we could ignore the possibility
that the check in your chosen envelope happens to be the one of 128 dollars.
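A simulation makes the resolution concrete: averaged over all seven pairs, blind switching gains nothing. The sketch below (ours) compares always staying with always switching, without looking inside either envelope.

```python
import random

# The seven equally likely pairs of checks: (1,2), (2,4), ..., (64,128).
pairs = [(2**k, 2**(k + 1)) for k in range(7)]

def play(trials=200_000):
    """Average amount received when always staying versus always
    switching, never looking inside the envelopes."""
    stay = switch = 0
    for _ in range(trials):
        a, b = random.choice(pairs)
        mine, other = random.sample([a, b], 2)
        stay += mine
        switch += other
    return stay / trials, switch / trials

print(play())  # the two averages agree: blind switching gains nothing
```

The apparent 25 percent gain for the middle amounts is exactly cancelled by the large loss suffered when the chosen check turns out to be the 128 dollar one.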
- The game can be made much more complicated.
Suppose we start again from the beginning with the two checks.
You pick an envelope, say the left one.
Assume it contains a check for 64 dollars.
So now you know the amount in your envelope.
Another student receives the other envelope, and looks inside.
Before you can say whether or not you want to switch envelopes, the other
student says that she would like to switch.
Would it be beneficial to agree with her to switch?
Your original argument for switching was that the other envelope has equal
chances to either contain a 32 dollar check, or a 128 dollar check, which
averages to 80 dollars.
However, if the owner of the other envelope wants to switch, then it stands to
reason that she did not see the maximal amount of 128 dollars in her envelope.
You probably should not switch.
Last updated: November 2007
Comments & suggestions:
wimr@mscs.mu.edu