Department of Mathematical and Statistical Sciences
Wim Ruitenburg's Fall 2024 MATH 1300-101
Last updated: 1 December 2024
Comments and suggestions: Email wim.ruitenburg@marquette.edu
Book, chapter 16 on probability, plus extra notes
From the chapter we can learn:
- The sample space is the collection of basic possible outcomes.
Exactly one outcome must occur each time the experiment is performed.
- Each potential outcome has a probability value, which is a value of at
least 0 and at most 1.
- The sum of all probabilities of all outcomes equals 1.
- An event is a collection of possible outcomes.
For example, for a single die the collection of basic possible outcomes of a
single roll is {1,2,3,4,5,6}.
The event of rolling an even number is the collection {2,4,6}.
Some rules of computing the number of elements of a sample space are:
- The multiplication rule.
- The complement rule.
- Counting permutations.
- Counting combinations.
These formulas are particularly useful when the probability space is
equiprobable.
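These rules are easy to check with a computer. Here is a small Python sketch
(the particular numbers 6, 3, and 4 below are illustrative choices, not from
the book):

import math

# Multiplication rule: two rolls of a die give 6 * 6 = 36 equally likely outcomes.
print(6 * 6)            # 36

# Permutations: ordered arrangements of 3 items chosen out of 6.
print(math.perm(6, 3))  # 120

# Combinations: unordered selections of 3 items out of 6.
print(math.comb(6, 3))  # 20

# Complement rule: P(at least one six in 4 rolls) = 1 - P(no six in 4 rolls).
print(1 - (5/6)**4)     # approximately 0.52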
Repeated experiments:
- When we toss a fair coin 5 times, there is a chance of
(1/2)*(1/2)*(1/2)*(1/2)*(1/2) = 1/(2^5) = 1/32 of having 5 tails in a row.
- When we toss a fair coin 5 times, there is a chance of 5 *
(1/2)*(1/2)*(1/2)*(1/2)*(1/2) = 5/(2^5) = 5/32 of having 4 tails and 1 head in
some order, because this 1 head may occur at the 1st, the 2nd, the 3rd, the
4th, or the 5th toss.
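Both numbers are easy to confirm with exact fractions in Python, for instance:

from fractions import Fraction

p = Fraction(1, 2)   # probability of tails on a single fair toss
print(p**5)          # 1/32: five tails in a row
print(5 * p**5)      # 5/32: exactly four tails and one head, in some order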
- Expected value (or: expected payoff).
For example, suppose we have a fair coin with collection of equiprobable basic
outcomes {H,T}.
If H (heads) comes up in a toss, then you get 0 points.
If T (tails) comes up in a toss, then you get 1 point.
The expected value equals E = 0 * 0.5 + 1 * 0.5 = 0.5.
If you toss the coin 1000 times, expect to earn about 1000 * 0.5 = 500 points.
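A short Python sketch computes this expected value and simulates 1000 tosses
(the simulated total varies from run to run, but stays near 500):

import random

# Expected value of one toss: 0 points for heads, 1 point for tails.
expected = 0 * 0.5 + 1 * 0.5
print(expected)   # 0.5

# Simulate 1000 tosses and add up the points.
points = sum(random.choice([0, 1]) for _ in range(1000))
print(points)     # usually close to 500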
Three Doors
This problem is also referred to as the Monty Hall problem.
The game host shows us three doors, a red door, a white door, and a blue door.
Behind one of these doors there is a big prize.
We are asked to pick one of the three doors.
Once our choice is final, if the prize is behind our door, we have won.
Obviously, our chance of winning is one-third, or 1/3.
What happens with our chances when the game show host adapts the rules a bit?
Suppose we have picked a door, say the red door.
Then the game host opens another door, say the white door, and reveals that
there is no prize behind the white door.
Now we are offered the option of sticking with the red door or switching to the
blue door.
Should we switch, or should we stay?
What are our chances for the red door, what for the blue door?
- Here is the standard answer.
There is a 1/3 chance that the prize is behind the red door.
There is a 2/3 chance that the prize is behind the white door or behind the
blue door.
Although the show host has ruled out the white door, the prize is still behind
the white or the blue door with chance 2/3.
The difference is that we also know that the white door covers no prize.
So the chance for the prize being behind the blue door is 2/3.
So switch to the blue door!
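If the standard answer still feels wrong, it can be checked by simulation. Here
is a Python sketch, under the assumption that the host always opens a no-prize
door that we did not pick and then offers the switch:

import random

def monty_hall(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        doors = ['red', 'white', 'blue']
        prize = random.choice(doors)
        pick = random.choice(doors)
        # The host opens a door that is neither our pick nor the prize door.
        opened = random.choice([d for d in doors if d != pick and d != prize])
        if switch:
            pick = next(d for d in doors if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(monty_hall(switch=False))   # close to 1/3
print(monty_hall(switch=True))    # close to 2/3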
- Many people have an intuitive resistance to the answer above.
This may be because of 'hidden' assumptions.
For example, we assume that the three doors are not made of glass,
for otherwise we could see the prize through the door.
We also assume that the show host is telling the truth.
These and similar restrictions are usually perceived as obvious from the
context.
Other assumptions may be less obvious.
- Suppose the show host always makes the observation about there
being another door without a prize behind it, for all participants.
Then the argument above applies.
- Suppose the show host randomly decides whether to make the observation about
there being another door without a prize behind it, independently of whether
our first choice was correct.
Then, whenever the observation is made, the argument above still applies.
- Suppose the show host only makes the observation about there
being another door without a prize behind it, for participants whose original
choice was correct.
Then the above argument is obviously false: never switch.
- Suppose the show host makes the observation about there being another
door without a prize behind it, for the first time.
Then it depends on whether or not the show host is trying to trick us.
We may not know.
- A little extra information goes a long way.
For example, suppose we are in the standard situation of the three doors
problem, where the show host always makes the observation about there
being another door without a prize behind it.
Now before we make our first choice of one of the three doors, we discover that
there is no prize behind the blue door without anyone else realizing that we
know that there is no prize behind the blue door.
Question: Which door should we first pick, and why?
Remarkably, the answer is to deliberately pick the blue (bad) door.
Can you explain why we can thus guarantee winning the prize?
Let us change the problem in a seemingly irrelevant way as follows.
As before, we are asked to pick one of the three doors.
Once our choice is final, if the prize is behind our door, we have won.
Obviously, our chance of winning is 1/3.
Suppose we have picked the red door.
Then a storm blows through the hall, and one of the doors is randomly blown
open.
Suppose that the white door blew open.
Behind it there is no prize.
Now we are offered the options of sticking with the red door, or switch to the
blue door.
Should we switch, or should we stay?
- Answer: It makes no difference.
The chance that the prize is behind the red door is 1/2.
The chance that the prize is behind the blue door is 1/2.
A door flew open randomly, so the search has been reduced to the remaining two
doors, without preference.
It is only coincidence that the door with the prize did not fly open.
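This can also be checked by simulation. A Python sketch, assuming the storm
blows open one of the two doors we did not pick, chosen at random, and keeping
only the runs that match the story (no prize behind the blown-open door):

import random

def storm_variant(trials=100_000):
    stay_wins = switch_wins = kept = 0
    for _ in range(trials):
        prize = random.choice(['red', 'white', 'blue'])
        # We picked the red door; the storm opens white or blue at random.
        blown = random.choice(['white', 'blue'])
        if blown == prize:
            continue   # the prize was revealed; not the situation described
        kept += 1
        other = 'blue' if blown == 'white' else 'white'
        stay_wins += (prize == 'red')
        switch_wins += (prize == other)
    return stay_wins / kept, switch_wins / kept

print(storm_variant())   # both fractions come out close to 1/2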
The Best of Three Dice
Suppose we have three fair but unusual dice, as follows.
Instead of having six sides with values 1, 2, 3, 4, 5, and 6, their sides are
allowed to have other values.
We have a red, a white, and a blue one.
The red one has numbers 2, 2, 2, 2, 6, and 6 on its six sides.
The white one has numbers 1, 1, 5, 5, 5, and 5 on its six sides.
The blue one has numbers 3, 3, 3, 3, 3, and 3 on its six sides.
When we list the six sides in a table, we have a situation like this:
Red:   2 2 2 2 6 6
White: 1 1 5 5 5 5
Blue:  3 3 3 3 3 3
The following game is played between two players.
The first player picks one of the dice.
Then the second player picks another.
Next, both players throw their dice simultaneously.
Whoever throws the higher number wins.
With the three dice above, draws are not possible.
Question: Which one of these three dice is best?
- Let us compare dice two at a time.
- Red versus white.
When we toss red versus white, there is a 1/3 chance for red to throw 6, and so
certainly win.
If red throws 2 (chance 2/3), there still is a 1/3 chance for white to throw 1,
and red still wins.
So the chance for red to beat white is (1/3) + (2/3) * (1/3) = 3/9 + 2/9 = 5/9.
So red is better than white.
- White versus blue.
Blue always throws a 3.
There is a 2/3 chance that white throws a 5, and wins.
Otherwise, white throws a 1 and loses.
So the chances for white to beat blue are 2/3.
So white is better than blue.
Are we done yet?
If red is better than white, and white is better than blue, shouldn't the red
die be the overall winner?
Let us check to make sure.
- Blue versus red.
Blue always throws 3.
If red throws 2 (chance 2/3), then blue wins.
Otherwise red wins.
So the chances for blue to beat red are 2/3.
So blue is better than red.
- There is no 'best' one among the three dice.
On average, red beats white, white beats blue, and blue beats red.
In the game above, it is good to be the second player.
No matter which color die the first player picks, the second player can always
pick a die of another color which is better than the first one.
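All three pairwise comparisons can be verified exhaustively by counting the 36
equally likely face pairs for each matchup, for example in Python:

from itertools import product
from fractions import Fraction

red   = [2, 2, 2, 2, 6, 6]
white = [1, 1, 5, 5, 5, 5]
blue  = [3, 3, 3, 3, 3, 3]

def win_chance(a, b):
    # Probability that die a shows a higher number than die b.
    wins = sum(1 for x, y in product(a, b) if x > y)
    return Fraction(wins, len(a) * len(b))

print(win_chance(red, white))    # 5/9
print(win_chance(white, blue))   # 2/3
print(win_chance(blue, red))     # 2/3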
The birthday paradox
Let us suppose that the class has about 45 people.
- What is the chance that some of us have the same birthday?
No, it is not about 45/365, which is roughly 1/8.
- It is easier to first compute the chance that all our birthdays are
different.
Recall that we use a formula that looks like
(365/365) * (364/365) * (363/365) * ... * ((365 - 43)/365) * ((365 - 44)/365).
With a calculator:
The chance that we all have different birthdays is close to 6%.
The chance that at least two of us share a birthday is therefore close to
100% - 6% = 94%.
- When there are about 23 people, the chance of two of us sharing the
same birthday is about 50%.
What are the hidden assumptions about the distribution of birthdays of people?
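Under the usual assumptions (every day of a 365-day year equally likely,
birthdays independent), the figures above are easy to reproduce, for instance
in Python:

def all_different(n, days=365):
    # Probability that n people all have different birthdays.
    p = 1.0
    for k in range(n):
        p *= (days - k) / days
    return p

print(all_different(45))   # about 0.06, so a shared birthday has chance about 94%
print(all_different(23))   # about 0.49, so a shared birthday has chance about 51%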
Winning a Losing Game
Suppose you play a color at roulette.
You can pick either black or red.
Your chance for doubling your money is 18/38, your chance for losing your money
is 20/38.
- If you play the game 3800 times by each time betting 1 dollar on a
color, you should expect to win 1 dollar about 1800 times and lose 1 dollar
about 2000 times.
So on balance you lose about 200 dollars.
The casino wins 200 dollars.
- Instead of repeatedly playing for the same amount, let us adapt our
gambling strategy as follows.
- The first time you bet some amount, say 5 dollars, on a color.
If you win, you get 10 dollars, and you stop with an overall profit of 5
dollars.
Otherwise, your total loss is 5 dollars, and you continue as follows.
- The second time you double your bet to 10 dollars.
If you win, you get 20 dollars, and you stop with an overall profit of
20 - 15 = 5 dollars.
Otherwise, your total loss is 15 dollars, and you continue as follows.
- The third time you double your bet to 20 dollars.
If you win, you get 40 dollars, and you stop with a profit of 40 - 35 = 5
dollars.
Otherwise, your total loss is 35 dollars, and you continue as follows.
- The fourth time you double your bet to 40 dollars.
If you win, you get 80 dollars, and you stop with a profit of 80 - 75 = 5
dollars.
Otherwise, your total loss is 75 dollars, and you continue as follows.
- The fifth time you double your bet to 80 dollars.
If you win, you get 160 dollars, and you stop with a profit of 160 - 155 = 5
dollars.
Otherwise, your total loss is 155 dollars, and you continue as follows.
- The sixth time you double your bet to 160 dollars.
If you win, you get 320 dollars, and you stop with a profit of 320 - 315 = 5
dollars.
Otherwise, ......
- The seventh time you double your bet to 320 dollars.
If you win, you get 640 dollars, and you stop with ......
- The eighth time you double your bet to 640 dollars. ......
The chance that you keep losing is very small, so there is a very big chance
that you will win 5 dollars.
The strategy is very dangerous, because there is a small risk of a gigantic
loss, when there is no more opportunity to double your bet.
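A simulation shows both effects at once: the chance of walking away with the 5
dollars is large, yet the average result is still a loss. Here is a Python
sketch, assuming a first bet of 5 dollars and at most 8 bets (the last one 640
dollars) as described above, with the 18/38 winning chance of a color bet:

import random

def doubling_strategy(first_bet=5, max_rounds=8, p_win=18/38):
    # Keep doubling until we win once or run out of rounds; return net dollars.
    bet, total_lost = first_bet, 0
    for _ in range(max_rounds):
        if random.random() < p_win:
            return bet - total_lost   # a win always nets the first bet: 5 dollars
        total_lost += bet
        bet *= 2
    return -total_lost                # we lost every round: a gigantic loss

results = [doubling_strategy() for _ in range(100_000)]
print(sum(r > 0 for r in results) / len(results))   # chance of a 5 dollar profit: over 99%
print(sum(results) / len(results))                   # average per session: still negative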
Example Problems
- We roll a red, a white, and a blue die in a single throw.
All dice are fair.
What is the probability that the red one comes up even, the white one comes up
greater than or equal to 5, and the blue one comes up less than 6 all at the
same time?
- When we roll 3 fair dice, what is the probability that none of the 3
dice shows a 1? (Use the product rule.)
- When we roll 3 fair dice, what is the probability that at least one of
the 3 dice shows a 1? (Use the previous result.)
- Suppose that in the Monty Hall problem we have 5 doors named A, B, C, D,
and E.
Behind one of the doors is the big prize we want.
We must guess, and we guess door D.
Then the game show host does the usual thing by opening 3 doors, in this case
doors A, C, and E, and shows that there is no prize behind those 3 doors.
What is the probability of the prize being behind door B, and what is the
probability for the prize being behind door D?
- In the game of three dice, let us add another one, say an orange die,
with numbers 3, 3, 3, 4, 4, 4.
Otherwise the rules of the game don't change.
Is the first player better off, or has nothing really changed and is the second
player still better off?
Justify your answer.
- Suppose the class had 367 people.
What is the probability that some of us have the same birthday?
- When you bet 1 dollar on a number at roulette, say on number 17 or
whatever, your chance of winning is 1 in 38, and if you win you get 36 dollars
back (including your 1 dollar bet).
If you lose, your 1 dollar is gone.
What is your expected value for this game?