Department of Mathematics, Statistics
and Computer Science
Wim Ruitenburg's Spring 2004 MATH025.1001
Probability
- Early in this class we studied the Three Doors problem, or
Monty Hall problem.
It combines unexpected probabilities with our hidden assumptions.
Below, each time we sketch a situation and are asked to find probabilities,
we hope to agree on which assumptions are reasonable and which are not.
- The book, on page 516, introduces the sibling problem.
Suppose a parent has two children.
- If the oldest is a boy, what are the chances that the other is a girl?
- If, instead, we only know that one of the children is a boy, what are
the chances that the other is a girl?
A `reasonable' hidden assumption is that each child, independently, has an
equal chance of being a boy or a girl.
Then the two chances are 1/2 and 2/3.
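Under that assumption, both answers can be checked with a short simulation.
The sketch below is our own illustration (the variable names are not from the
book): it generates many two-child families and conditions on each case.

```python
import random

random.seed(0)
N = 100_000

# Each family: two independent children, each 'B' or 'G' with chance 1/2.
families = [(random.choice("BG"), random.choice("BG")) for _ in range(N)]

# Case 1: the oldest child is a boy; how often is the other a girl?
oldest_boy = [f for f in families if f[0] == "B"]
p1 = sum(f[1] == "G" for f in oldest_boy) / len(oldest_boy)

# Case 2: at least one child is a boy; how often is the other a girl?
some_boy = [f for f in families if "B" in f]
p2 = sum("G" in f for f in some_boy) / len(some_boy)

print(round(p1, 2))  # close to 1/2
print(round(p2, 2))  # close to 2/3
```

The difference comes entirely from the conditioning: "the oldest is a boy"
rules out half the families, while "one of them is a boy" rules out only a
quarter of them.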
- The book, on pages 529-531, discusses the birthday paradox.
During the class there were 35 of us.
- What is the chance that some of us have the same birthday?
No, it is not about 1/10; it is about 8/10 (see the table on page 531).
The chance that all 35 of us have different birthdays is only about 2/10.
It happened: all our birthdays were different.
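Under the usual hidden assumption of 365 equally likely, independent
birthdays, the entry in the table can be computed exactly. This short
computation is our own sketch of where the 8/10 comes from:

```python
# Probability that all 35 birthdays differ: the first person can have any
# birthday, the second must avoid 1 day, the third 2 days, and so on.
p_all_different = 1.0
for k in range(35):
    p_all_different *= (365 - k) / 365

print(round(p_all_different, 3))      # about 0.186: all different
print(round(1 - p_all_different, 3))  # about 0.814: some shared birthday
```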
- Mary Lou had to leave the class early, and was not among the 35 people
mentioned above.
I pick up one of the pieces of paper with one of our 35 birthdays, and wish
there were someone else with that special birthday.
Megan suggests that Mary Lou has that birthday (although Megan doesn't know).
It happened.
What were the chances for that coincidence?
What was the hidden assumption about the distribution of birthdays of people?
- The book discusses spinning tops on page 579, in problem 17.
These are nice for getting a feel for probability and expected value.
- The book discusses and compares mean and median on page 593.
When we sample data such as individual salaries, we can only count on getting
a good idea of the median value.
The mean may be very different.
See the book on pages 594-595, section School Daze.
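A tiny example shows how far apart the two can be. The salary figures below
are made up by us purely for illustration; the point is the single large
outlier.

```python
salaries = [30_000, 32_000, 35_000, 38_000, 1_000_000]  # one huge outlier

n = len(salaries)
mean = sum(salaries) / n
median = sorted(salaries)[n // 2]  # middle value; n is odd here

print(median)  # 35000: the typical salary
print(mean)    # 227000: pulled far upward by the outlier
```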
- How to win a losing game.
Suppose you play a color at roulette.
You can pick either black or red.
Your chance for doubling your money is 18/38, your chance for losing your money
is 20/38.
- If you play the game 3800 times, each time betting 1 dollar on a
color, you should expect to win about 1800 times and lose about 2000 times,
for a return of about 1800 times 2 dollars plus 2000 times nothing, which is
3600 dollars.
So you lose about 200 dollars of the 3800 dollars you bet.
The casino wins 200 dollars.
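The same figures follow from the expected return of a single bet. This little
computation is our own sketch of the arithmetic above:

```python
# Expected return of one 1-dollar color bet at American roulette:
# 18 winning pockets pay back 2 dollars, 20 losing pockets pay nothing.
ev_per_bet = (18 / 38) * 2 + (20 / 38) * 0  # about 0.947 dollars back
loss_3800 = 3800 * (1 - ev_per_bet)         # expected total loss

print(round(ev_per_bet, 3))
print(round(loss_3800))  # about 200 dollars, as above
```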
- Instead of repeatedly playing for the same amount, let us adapt our
gambling strategy as follows.
- The first time you bet some amount, say 5 dollars, on a color.
If you win, you get 10 dollars, and you stop with an overall profit of 5
dollars.
Otherwise, your total loss is 5 dollars, and you continue as follows.
- The second time you double your bet to 10 dollars.
If you win, you get 20 dollars, and you stop with an overall profit of 20 minus
15 is 5 dollars.
Otherwise, your total loss is 15 dollars, and you continue as follows.
- The third time you double your bet to 20 dollars.
If you win, you get 40 dollars, and you stop with a profit of 40 minus 35 is 5
dollars.
Otherwise, your total loss is 35 dollars, and you continue as follows.
- The fourth time you double your bet to 40 dollars.
If you win, you get 80 dollars, and you stop with a profit of 80 minus 75 is 5
dollars.
Otherwise, your total loss is 75 dollars, and you continue as follows.
- The fifth time you double your bet to 80 dollars.
If you win, you get 160 dollars, and you stop with a profit of 160 minus 155 is
5 dollars.
Otherwise, your total loss is 155 dollars, and you continue as follows.
- The sixth time you double your bet to 160 dollars.
If you win, you get 320 dollars, and you stop with a profit of 320 minus 315 is
5 dollars.
Otherwise, ......
- The seventh time you double your bet to 320 dollars.
If you win, you get 640 dollars, and you stop with ......
- The eighth time you double your bet to 640 dollars. ......
The chance that you keep losing is very small, so there is a very big chance
that you will win 5 dollars.
The strategy is very dangerous, because there is a small risk of a gigantic
loss when there is no more opportunity to double your bet: after eight
straight losses, for instance, you have already lost 5 + 10 + ... + 640 = 1275
dollars.
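The doubling strategy can be simulated. The sketch below is our own (the cap
of 8 rounds is an assumption standing in for a table limit or an empty
wallet): most sessions do win 5 dollars, yet the average outcome is still a
loss.

```python
import random

random.seed(1)
P_WIN = 18 / 38  # chance of winning a single color bet
MAX_ROUNDS = 8   # assumed cap: no ninth doubling allowed

def play_session():
    """Bet 5 dollars, doubling after each loss; stop on a win or after
    MAX_ROUNDS losses. Returns the net profit of the session."""
    bet, total_lost = 5, 0
    for _ in range(MAX_ROUNDS):
        if random.random() < P_WIN:
            return 2 * bet - (total_lost + bet)  # always +5 on a win
        total_lost += bet
        bet *= 2
    return -total_lost  # 5 + 10 + ... + 640 = 1275 dollars gone

results = [play_session() for _ in range(100_000)]
wins = sum(r == 5 for r in results)
print(wins / len(results))          # well over 99% of sessions win 5 dollars
print(sum(results) / len(results))  # but the average profit is negative
```

The rare total loss of 1275 dollars outweighs the frequent 5-dollar wins,
so the casino still comes out ahead in the long run.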
- Suppose we have three fair but unusual dice, as follows.
Instead of having six sides with the numbers 1, 2, 3, 4, 5, and 6, their
sides are allowed to have other numbers.
We have a red, a white, and a blue one.
The red one has numbers 2, 2, 2, 2, 6, and 6 on its six sides.
The white one has numbers 1, 1, 5, 5, 5, and 5 on its six sides.
The blue one has numbers 3, 3, 3, 3, 3, and 3 on its six sides.
When we list the six sides in a table, we have:

    red:   2 2 2 2 6 6
    white: 1 1 5 5 5 5
    blue:  3 3 3 3 3 3
When two players each pick one of these dice, they can play the game of
simultaneously tossing their dice; whoever rolls the highest number wins.
Note that, with the three dice above, draws are not possible.
The question is which of these three dice is best.
So let us compare:
- Red versus white. When we repeatedly toss red versus white, then on
average the red one will beat the white one with probability 2/3 times 1/3
(red rolls a 2, white rolls a 1) plus 1/3 times 1 (red rolls a 6) equals 5/9.
So red is better than white.
- White versus blue. When we repeatedly toss white versus blue, then on
average the white one will beat the blue one with probability 1/3 times 0
(white rolls a 1) plus 2/3 times 1 (white rolls a 5) equals 2/3.
So white is better than blue.
- Blue versus red. When we repeatedly toss blue versus red, then on
average the blue one will beat the red one with probability 1 times 2/3
(red rolls a 2) equals 2/3.
So blue is better than red.
There is no `best' one among the three dice.
If someone challenges you to the game of dice described above, let the other
person pick one of the three dice first.
Then you can always pick one that is better.
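All three comparisons can be checked at once by counting the 36 equally
likely pairs of faces. This sketch is our own check of the fractions above:

```python
from itertools import product

red   = [2, 2, 2, 2, 6, 6]
white = [1, 1, 5, 5, 5, 5]
blue  = [3, 3, 3, 3, 3, 3]

def p_beats(a, b):
    """Probability that die a rolls higher than die b,
    counting all 36 equally likely face pairs (no ties possible here)."""
    wins = sum(x > y for x, y in product(a, b))
    return wins / 36

print(p_beats(red, white))   # 5/9: red beats white
print(p_beats(white, blue))  # 2/3: white beats blue
print(p_beats(blue, red))    # 2/3: blue beats red
```

Each die is beaten by one of the others, which is exactly why there is no
`best' die.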
- Suppose we have a stack of checks with denominations of 1 dollar, 2
dollars, 4 dollars, 8 dollars, 16 dollars, 32 dollars, 64 dollars, and 128
dollars.
(We rather arbitrarily stop at 128; you may change the problem by extending the
pile of possible checks with many more doublings.)
We also have two blank envelopes.
In each of the envelopes I put one check such that, with equal probability, I
either pick 1 and 2, or 2 and 4, or 4 and 8, or 8 and 16, or 16 and 32, or
32 and 64, or 64 and 128.
You can see the two envelopes, but you do not know which two consecutive
checks I picked, or which of the two envelopes contains the larger
denomination.
Here are some scenarios of how the situation may unfold.
- Suppose we have the two blank envelopes in front of us.
Now pick one of the envelopes and open it; say we pick the left one.
Let us assume it contains a check for 4 dollars.
So now you know what the left envelope contains.
Now what is in the other envelope?
With chance 1/2 it contains a 2 dollar check, and with chance 1/2 it contains
an 8 dollar check.
So if you were allowed to switch checks, then your average expected return
would be 1/2 times 2 plus 1/2 times 8 equals 5 dollars.
So switch if you are allowed to!
If your originally chosen check shows any amount other than 128 dollars then,
by the same argument, switch and on average expect a profit.
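The 5-dollar figure can be checked by simulation. The sketch below is our
own: it draws many envelope pairs, conditions on seeing 4 dollars in the
chosen envelope, and averages the other envelope's contents.

```python
import random

random.seed(2)
PAIRS = [(1, 2), (2, 4), (4, 8), (8, 16), (16, 32), (32, 64), (64, 128)]

others = []
for _ in range(200_000):
    lo, hi = random.choice(PAIRS)            # pick a consecutive pair
    mine, other = random.sample([lo, hi], 2)  # deal them out at random
    if mine == 4:                             # condition on seeing 4 dollars
        others.append(other)

print(sum(others) / len(others))  # close to 5: half 2, half 8
```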
- Suppose we have the two blank envelopes in front of us as in the
beginning, without knowing anything new about their contents.
Now pick one of the envelopes, say you pick the left one.
Do not open it.
Call the amount on the chosen check x, where x is still unknown to you.
Would it benefit you to switch envelopes?
It is not likely that you picked the check with the maximal amount of 128
dollars.
Assuming that, you may expect that the other check has with probability 1/2 an
amount of at least x/2, and with probability 1/2 an amount of 2x.
So on average at least an amount of 5x/4.
So it appears reasonable to switch.
So switch.
Call the amount on the newly chosen check y, where y is still unknown to you.
Would it benefit you to switch envelopes again?
By the argument above, yes.
You end up where you started, with the left check marked x.
How can this be?
The reason is that we carelessly assumed that we could ignore the possibility
that the check in your chosen envelope happens to be the one for 128 dollars.
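The flaw also shows up in a simulation: as long as you never look inside,
switching gains nothing. This sketch (ours, not the book's) compares always
keeping the chosen envelope with always switching blindly.

```python
import random

random.seed(3)
PAIRS = [(1, 2), (2, 4), (4, 8), (8, 16), (16, 32), (32, 64), (64, 128)]

keep_total = switch_total = 0
N = 200_000
for _ in range(N):
    lo, hi = random.choice(PAIRS)
    mine, other = random.sample([lo, hi], 2)  # random envelope each
    keep_total += mine     # strategy 1: never switch
    switch_total += other  # strategy 2: always switch, without looking

print(keep_total / N, switch_total / N)  # both close to the same average
```

Both strategies average the same amount, so the "expect 5x/4" argument
cannot be right when applied blindly; seeing the actual amount (and knowing
whether it is the maximum) is what made switching profitable before.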
- The game can be made much more complicated.
Suppose we start again from the beginning with the two checks in their
envelopes.
You pick an envelope, say the left one.
Assume it contains a check for 64 dollars.
So now you know what the left envelope contains.
Another student receives the other envelope, and looks inside.
Before you can say whether or not you want to switch envelopes, the other
student says that she would like to switch.
Would it be beneficial to agree with her to switch?
Your original argument for switching was that the other envelope has equal
chances to contain either a 32 dollar check or a 128 dollar check, which
averages to 80 dollars.
However, if the owner of the other envelope wants to switch, then it stands to
reason that she did not see the maximal amount of 128 dollars in her envelope.
You probably should not switch.
Last updated: April 2004
Comments & suggestions:
wimr@mscs.mu.edu