Monty Hall Problem Code to Proof


So if you want to follow the strategy ten times, you need $4,096 on hand for your guaranteed win of $1. If it required 15 spins, then you'd need $65,536.

I did not say that deep pockets were not required :)
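
For anyone who wants to check the arithmetic, here's a minimal sketch of the bankroll a doubling strategy needs. The $1 base stake and the decision to count the final winning bet are my assumptions, so the totals will shift if the strategy above counts the spins or the stake differently.

#include <cstdint>
#include <iostream>

// Sketch: bankroll needed to keep doubling through `losses` consecutive
// losing spins and still place the final (winning) bet.
// `baseStake` = 1 is an assumption; adjust it for the strategy you have in mind.
std::uint64_t bankrollNeeded(int losses, std::uint64_t baseStake = 1)
{
    std::uint64_t stake = baseStake, total = 0;
    for (int i = 0; i < losses; ++i)
    {
        total += stake; // this spin is lost
        stake *= 2;     // double up for the next spin
    }
    return total + stake; // plus the bet that finally wins the base stake back
}

int main()
{
    std::cout << bankrollNeeded(10) << "\n"; // ten losing spins in a row
    std::cout << bankrollNeeded(15) << "\n"; // fifteen losing spins in a row
}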


For the classic H/T coin toss there is NO difference in probability between 100 H in a row and an alternating H, T, H, T, ... sequence. Each toss is a new instance, so what has gone before has no effect; you might be recording what has gone before, but that does not affect the future.

Yes, each toss is a new instance, but if you are looking at multiple random events, even if they are independent, the multiplication law still applies, which means that your chance of seeing not a single H in X tosses is still lower. This can be easily visualized by looking at the possible outcomes of, say, 3 coinflips:

HHH
HHT
HTT
HTH
THH
THT
TTH
TTT

All of those outcomes are equally likely, but there is only one here that doesn't have a single head. So your chance of flipping three coins and having no H is 1/8, and so on.
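
If it helps, here's a quick brute-force version of that enumeration (the variable names are mine, and `tosses` can be raised to any number of flips); it walks every equally likely sequence and counts the ones with no heads:

#include <iostream>

int main()
{
    const int tosses = 3;             // the 3-flip example above; raise as you like
    const int outcomes = 1 << tosses; // 2^tosses equally likely sequences
    int noHeads = 0;
    for (int seq = 0; seq < outcomes; ++seq)
    {
        bool sawHead = false;
        for (int i = 0; i < tosses; ++i)
            if (seq & (1 << i))       // bit i set = flip i came up heads
                sawHead = true;
        if (!sawHead)
            ++noHeads;
    }
    std::cout << noHeads << " of " << outcomes
              << " equally likely sequences contain no heads\n"; // 1 of 8 here
}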

Now if you are at 3 tails and want to know what the probability of another tail is, it is 50/50, yes, and that's why doubling doesn't work in every case. But I was mainly talking about the multiplication of multiple independent events, which I demonstrated here.

So you are right that after 100 T you are equally likely to get another T, but you are not likely to even get to 100 Ts in a row in the first place.

Multiple independent events are different before and after they have happened though :) The chance of HH before you flip any coins is 1 in 4, but after you have flipped one coin and got 1 H it becomes a 50/50 flip. This was the point I was trying to make; it depends on when you make the observation and the question you ask. And I am correct that 100 H has the same chance as an exact HTHTHT... sequence. The chance of getting say 90 H and 10 T in any order, though, is a very different number :)
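
A quick numerical illustration of that last point (the variable names are mine); the loop just computes C(100, 10), the number of orderings with exactly 10 tails:

#include <cmath>
#include <iostream>

// Sketch: compare the chance of one exact 100-flip sequence with the chance
// of getting 90 heads and 10 tails in any order.
int main()
{
    const int flips = 100, tails = 10;

    // C(100, 10): how many orderings contain exactly 10 tails
    double orderings = 1.0;
    for (int i = 1; i <= tails; ++i)
        orderings *= static_cast<double>(flips - tails + i) / i;

    const double pExactSequence = std::pow(0.5, flips); // any one exact sequence
    std::cout << "one exact sequence:   " << pExactSequence << "\n";
    std::cout << "90H/10T in any order: " << orderings * pExactSequence << "\n";
}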

I think we are actually on the same page in our view. What the OP of the topic failed to see with Monty Hall is that it changes the rules part way through based on previous selections, and that affects the probability.

As others have said, the two decisions are not independent, so the "intuitive" 50/50 answer doesn't work - the dependency prevents the simple reduction to a 2-door position. Remember, if probability were intuitive, casinos would all go broke overnight.

I like Juliean's first reply and exhausting every possibility, but let's try to understand it by a different approach and change the rules slightly. In my new game, you still make your first choice of 3 doors, but this time Monty won't open a wrong door of the other two. Instead, you can choose to open your first door, or 'switch' and open both of the others at once. If you switch and the car is behind either of the other doors, you win. Clearly you expect to win 66% of the time if you always switch. But at least one of the 2 other doors will always have a goat behind it, since there's still only one car. If Monty opens that door for you (making it like the original game), your odds don't change.
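
A quick Monte Carlo sketch of that modified game (door numbers 0..2, the RNG, and the trial count are my own choices); 'switching' here means opening both other doors, so it wins whenever the car isn't behind the first pick:

#include <iostream>
#include <random>

int main()
{
    std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<int> door(0, 2);

    const int trials = 1000000;
    int switchWins = 0;
    for (int t = 0; t < trials; ++t)
    {
        const int car = door(rng);
        const int firstPick = door(rng);
        if (car != firstPick)   // switching opens both remaining doors,
            ++switchWins;       // so it wins unless the first pick was the car
    }
    std::cout << "switch wins: " << 100.0 * switchWins / trials << "%\n"; // ~66.7%
}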

I'm currently taking some graduate courses, one of which is on computer performance modeling. In some of the review questions for probability, we did this one, which reminded me very much of the Monty Hall problem.

In genetics, some traits are dominant and others recessive. For example, for eye color, brown is dominant over blue. Because two genes determine eye color, as long as one of them is brown, the person will have brown eyes. Only if both are blue will they have blue eyes. A child inherits one gene from each parent with equal likelihood.

Suppose John has brown eyes and his parents have brown eyes. But his sister has blue eyes. Further, suppose John has a child with his blue-eyed wife. That child has brown eyes. What is the probability the next child will have brown eyes?

For Monty Hall, simple enumeration of all equally likely possibilities gives a straightforward explanation:

If P marks the door you initially pick, G = goat, and C = car, here's the full table, with the outcome in the right column, for the strategy of always switching:


// Door 1     Door 2      Door 3      Outcome
// PG         G           C           Win
// G          PG          C           Win
// G          G           PC          Lose
// PG         C           G           Win
// G          PC          G           Lose
// G          C           PG          Win
// PC         G           G           Lose
// C          PG          G           Win
// C          G           PG          Win

Nine possibilities of which six are winners, hence 2/3.
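
The same count can be checked mechanically; here's a minimal sketch (the loop order is arbitrary) that enumerates the nine equally likely car/pick combinations from the table:

#include <iostream>

// Exhaustive check of the table above: 3 car positions x 3 initial picks give
// 9 equally likely cases, and the always-switch player wins whenever the
// initial pick is not the car (Monty removes the only other goat door).
int main()
{
    int wins = 0, cases = 0;
    for (int car = 0; car < 3; ++car)
        for (int pick = 0; pick < 3; ++pick)
        {
            ++cases;
            if (pick != car)
                ++wins; // switching lands on the car
        }
    std::cout << wins << " wins out of " << cases << " cases\n"; // 6 out of 9
}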

Is the answer to the genetics question 75%? If it's not, don't give me the answer. I can try to do it again more carefully.


You got it. Someone else mentioned the intuition behind these problems: imagine if they had 100 kids, all with brown eyes. It would be highly unlikely that the next one would be blue. So with each new brown-eyed kid, the chance of brown for the next one increases while the chance of blue decreases.
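
Since the answer is out now, here's a sketch of the Bayesian update behind that intuition. It assumes both of John's parents are Bb (they must be, being brown-eyed with a blue-eyed daughter), so brown-eyed John is BB with probability 1/3 and Bb with probability 2/3, and his wife is bb.

#include <cmath>
#include <iostream>

// Sketch: probability that the NEXT child has brown eyes, given that the
// first k children all had brown eyes. Prior on John: P(BB) = 1/3, P(Bb) = 2/3.
// A BB father always passes brown; a Bb father passes brown with probability 1/2
// (the mother is bb, so she always passes blue).
int main()
{
    const int kidsSoFar[] = { 0, 1, 2, 10, 100 };
    for (int k : kidsSoFar)
    {
        const double likBB = 1.0;              // k brown-eyed kids if John is BB
        const double likBb = std::pow(0.5, k); // k brown-eyed kids if John is Bb
        const double pBB = (1.0 / 3.0) * likBB /
                           ((1.0 / 3.0) * likBB + (2.0 / 3.0) * likBb);
        const double pNextBrown = pBB * 1.0 + (1.0 - pBB) * 0.5;
        std::cout << k << " brown-eyed kids so far -> P(next is brown) = "
                  << pNextBrown << "\n"; // k = 1 gives 0.75, climbing toward 1
    }
}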

I usually find it's more intuitive when you scale up the number of doors:

Imagine 100 doors, 99 of them are duds and 1 of them wins the game. You pick one, the host then opens 98 of the remaining doors that he knows are duds, and you're asked whether you want to switch to the single door he left closed.

This is indeed the most intuitive way to think about it. Although, playing Devil's advocate, I do have to concede that the original problem is usually presented ambiguously, in that if you scale it up, the assumption is that only one door out of a hundred is the winning one. Making this assumption, IMHO, is the thing that tends to be unintuitive to most people. Looking at the Wikipedia postulation:

Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?

This states that one out of three doors is the correct one. Based on this information alone, scaling the problem to 100 doors does not implicitly scale the distribution from the original 33% to 1%, but rather it is also valid (and more intuitive) to assume that now ~66 doors will have goats behind them.

It would be an easy fix if the original problem said "behind only one door is a car".

</rant>

