Are you, in fact, a pregnant lady who lives in the apartment next door to Superdeath's parents? - Commodore

[signup thread, closed] WW49: Wizards and Werewolves

You might as well change your username to Omar at this point.
Reply

The first child having brown eyes absolutely does give some indication that the wife is BB and not BG. Take it to an extreme to illustrate: suppose they already had 100 children who all had brown eyes. You'd intuitively guess the wife is near-certainly BB. One child is less of an indicator than 100, but it's more of an indicator than none.
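To put a rough number on that intuition (borrowing the P(BB) = 1/3, P(BG) = 2/3 prior that gets worked out later in the thread): after 100 brown-eyed children, P(BB) = (1/3) / [1/3 + (2/3)(1/2)^100] = 1 / [1 + (1/2)^99], which is within about 1.6 * 10^-30 of 1. Near-certainly BB indeed.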
Reply

Haven't gotten out the pen and paper yet, but I don't think T-Hawk's argument is correct. That argument would be correct if we assumed we knew nothing about the parameter p, or if p were a random variable. But p isn't a random variable. From deduction and from the rules of inheritance, p = 1/3. From a Bayesian standpoint, the prior distribution of p is a degenerate distribution, so it equals the posterior distribution, regardless of what else we condition on. I think.

EDIT: Though I'm not sure if this depends on whether we take a frequentist interpretation of probability, or a Bayesian interpretation.

EDIT 2: At this rate, it might be faster to just do Monte Carlo simulations. Hmm. Is it just me or are we all stuck in the Monty Hall?
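Something like this minimal sketch would do it (my own code; it assumes the P(BB) = 1/3, P(BG) = 2/3 prior and the per-child brown-eye probabilities of 1 and 1/2 that get written down later in the thread):

import random

# Monte Carlo sketch: draw the mother's genes, condition on the first
# child having brown eyes by rejection, then see how often the second
# child has brown eyes too.
def estimate(trials=1_000_000):
    kept = 0          # runs where the first child has brown eyes
    second_brown = 0  # of those, runs where the second child does too
    for _ in range(trials):
        p_brown = 1.0 if random.random() < 1 / 3 else 0.5
        if random.random() >= p_brown:
            continue  # first child not brown: contradicts the given info
        kept += 1
        if random.random() < p_brown:
            second_brown += 1
    return second_brown / kept

print(estimate())  # lands near 0.75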

EDIT 3: Alright, I'm spinning in circles. T-Hawk and thrawn are right that I have to do Bayesianism, since the parameter that determines the child's genes, the mother's genes, is random itself. Hang on.
More people have been to Berlin than I have.
Reply

Curses. Omar, thrawn, and T-Hawk are right. It is 3/4. Stupid me, for forgetting that M, the mother's genes, is a random variable, and thus its distribution has to be updated as we observe draws from it. I should've learned not to argue against T-Hawk.
Let M be the mother's genes, the only parameter in the model. The prior distribution is 
P(M = BB) = 1/3
P(M = BG) = 2/3

Let X1 and X2 be the eye colors of the first and second child, respectively.

P(X1 = brown|M = BB) = 1
P(X1 = brown|M = BG) = 1/2

Now we need to condition on the information that X1 = brown.

P(M = BB|X1 = brown) = P(M = BB, X1 = brown) / [P(X1 = brown|M = BB)P(M = BB) + P(X1 = brown|M = BG)P(M = BG)] = (1/3) / [1 * 1/3 + 1/2 * 2/3] = 1/2

Dropping the X1 = brown conditioning from the notation, even though it's still there in the background:

P(X2 = brown) = P(X2 = brown|M = BB)P(M=BB) + P(X2 = brown|M = BG)P(M=BG) = 1*1/2 + 1/2*1/2 = 3/4.
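As a sanity check on that arithmetic, here's a quick snippet of my own using exact fractions, so there's no floating-point noise:

from fractions import Fraction

# Exact Bayesian update for the model above: prior on the mother's
# genes, and P(brown child | M) for each case.
prior = {"BB": Fraction(1, 3), "BG": Fraction(2, 3)}
like = {"BB": Fraction(1), "BG": Fraction(1, 2)}

# Posterior after observing X1 = brown, then the predictive for X2.
evidence = sum(like[m] * prior[m] for m in prior)
post = {m: like[m] * prior[m] / evidence for m in prior}
print(post["BB"])                            # 1/2
print(sum(like[m] * post[m] for m in post))  # 3/4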


EDIT: And in the general case, where you have n children with brown eyes, the probability that your next child will also have brown eyes should be the following:


[1 + (1/2)^n] / [1 + (1/2)^(n-1)]

which tends to 1 as n goes to infinity.
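That closed form is just the same update repeated: P(M = BB | n brown children) = (1/3) / [1/3 + (2/3)(1/2)^n], so the predictive probability for child n+1 is [1/3 + (2/3)(1/2)^(n+1)] / [1/3 + (2/3)(1/2)^n], which simplifies to the expression above. A quick check with exact fractions (again my own code, same model as before) agrees for small n:

from fractions import Fraction

def predictive(n):
    # Exact P(child n+1 brown | first n children all brown).
    half = Fraction(1, 2)
    num = Fraction(1, 3) + Fraction(2, 3) * half ** (n + 1)
    den = Fraction(1, 3) + Fraction(2, 3) * half ** n
    return num / den

def closed_form(n):
    half = Fraction(1, 2)
    return (1 + half ** n) / (1 + half ** (n - 1))

for n in range(1, 11):
    assert predictive(n) == closed_form(n)
print(predictive(1))  # 3/4, matching the two-child case above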
More people have been to Berlin than I have.
Reply

Now that we have that little diversion solved, let's get back to our regular programming.

With pindi, I think we only need 1 more? That would make 11, and Bob will make it 12 if we can't fill it otherwise (which seems likely). Then we can randomize the remaining classes for those who don't want to choose, and leave the rest to Comm/Brick.
Reply

Of course, the first probability was for the first child to have brown eyes; I'm an idiot! For the second one, you guys are right.
I'll blame it on having done very little probability in my studies.
Reply

Relevant XKCD
Reply

(October 22nd, 2020, 02:27)sunrise089 Wrote: Relevant XKCD

When has XKCD failed to provide something relevant?
Reply

Okay I’ll play, on the condition that we lynch Pindicator tomorrow.
I have to run.
Reply

(October 22nd, 2020, 06:02)novice Wrote: Okay I’ll play, on the condition that we lynch Pindicator tomorrow.

lol
Reply


