\( \newcommand{\matr}[1] {\mathbf{#1}} \newcommand{\vertbar} {\rule[-1ex]{0.5pt}{2.5ex}} \newcommand{\horzbar} {\rule[.5ex]{2.5ex}{0.5pt}} \newcommand{\E} {\mathrm{E}} \)
\( \newcommand{\cat}[1] {\mathrm{#1}} \newcommand{\catobj}[1] {\operatorname{Obj}(\mathrm{#1})} \newcommand{\cathom}[1] {\operatorname{Hom}_{\cat{#1}}} \newcommand{\multiBetaReduction}[0] {\twoheadrightarrow_{\beta}} \newcommand{\betaReduction}[0] {\rightarrow_{\beta}} \newcommand{\betaEq}[0] {=_{\beta}} \newcommand{\string}[1] {\texttt{"}\mathtt{#1}\texttt{"}} \newcommand{\symbolq}[1] {\texttt{`}\mathtt{#1}\texttt{'}} \newcommand{\groupMul}[1] { \cdot_{\small{#1}}} \newcommand{\groupAdd}[1] { +_{\small{#1}}} \newcommand{\inv}[1] {#1^{-1} } \newcommand{\bm}[1] { \boldsymbol{#1} } \require{physics} \require{ams} \require{mathtools} \)
Math and science::INF ML AI

The urns

Problem statement

There are 11 urns labeled by \( u \in \{0, 1, 2, \ldots, 10\} \), each containing ten balls. Urn \( u \) contains \( u \) black balls and \( 10 - u \) white balls. Fred selects an urn \( u \) uniformly at random and draws \( N \) times with replacement from that urn, obtaining \( n_B \) blacks and \( N - n_B \) whites. Fred's friend, Bill, looks on. If after \( N = 10 \) draws \( n_B = 3 \) blacks have been drawn, what is, from Bill's point of view, the probability that the urn Fred is using is urn \( u \)?

Solution

First, we know how to express the probability distribution of the number of blacks given the urn and the number of draws \( N \):

\[ P(n_B | u, N) = \left(\begin{array}{c}N\\ n_B\end{array}\right) f_u^{n_B}(1-f_u)^{N-n_B} \quad \text{(1)}\]

where we define \( f_u := \frac{u}{10} \), the fraction of black balls in urn \( u \).
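
As a quick numerical sketch (Python; the helper name likelihood is illustrative, not from the source), equation (1) can be evaluated directly:

```python
from math import comb

def likelihood(n_B: int, u: int, N: int = 10) -> float:
    """P(n_B | u, N): binomial probability of drawing n_B black balls
    in N draws with replacement from urn u, where f_u = u / 10."""
    f_u = u / 10
    return comb(N, n_B) * f_u**n_B * (1 - f_u)**(N - n_B)
```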

Adding the \( u \) dimension gives the joint probability distribution of the random variables \( u \) and \( n_B \), \( P(u, n_B | N) \). It can be written as:

\[ P(u, n_B | N) = P(n_B | u, N) P(u) \]

Notice how \( P(u) \) appears: the probability that Fred selects a given urn, which is uniform at \( \frac{1}{11} \) per the problem statement.

From here, we need Bayes' theorem, which is:

\[ P(A|B) = \frac{P(A , B)}{P(B)} = \frac{P(B|A)P(A)}{P(B)} \]

From the joint probability of \( u \) and \( n_B \) we can obtain the conditional distribution of \( u \) given \( n_B \) using Bayes' theorem:

\[ \begin{align*} P(u|n_B, N) &= \frac{P(u, n_B |N)}{P(n_B|N)} \\ &= \frac{P(n_B | u, N)P(u)}{P(n_B|N)} \end{align*} \]

So far we know two of the three expressions in that quotient. The third, the denominator \( P(n_B|N) \), is the marginal probability of \( n_B \), which we can obtain using the sum rule:

\[ P(n_B | N) = \sum_u P(u, n_B | N) = \sum_u P(u)P(n_B | u, N) \]
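
Continuing the sketch above, the sum rule is a prior-weighted sum over the 11 urns (again, marginal is an illustrative name, building on the likelihood helper):

```python
def marginal(n_B: int, N: int = 10) -> float:
    """P(n_B | N): sum rule over all 11 urns, with uniform prior P(u) = 1/11."""
    return sum(likelihood(n_B, u, N) * (1 / 11) for u in range(11))
```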

So the conditional probability of \( u \) given \( n_B \) is:

\[ \begin{align*} P(u|n_B, N) &= \frac{P(n_B | u, N) P(u)}{\sum_{u'} P(n_B | u', N)P(u')} \\ &= \frac{P(n_B | u, N) P(u)}{P(n_B| N)} \\ &= \frac{1}{P(n_B|N)} \frac{1}{11} \left(\begin{array}{c}N\\ n_B\end{array}\right) f_u^{n_B}(1-f_u)^{N-n_B} \\ \end{align*} \]

To evaluate this, we calculate the probabilities of the form (1) for every urn, weight each by \( P(u) = \frac{1}{11} \), and sum them to obtain \( P(n_B|N) \); the posterior for a given urn is then its weighted probability divided by this sum.

For reference, the marginal when \( n_B = 3 \) is \( P(n_B = 3 | N = 10) = 0.083 \).
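
Putting the pieces together, a minimal sketch of the full posterior, assuming the hypothetical likelihood and marginal helpers above; it reproduces the 0.083 figure:

```python
def posterior(u: int, n_B: int, N: int = 10) -> float:
    """P(u | n_B, N) via Bayes' theorem with uniform prior P(u) = 1/11."""
    return likelihood(n_B, u, N) * (1 / 11) / marginal(n_B, N)

print(f"P(n_B = 3 | N = 10) = {marginal(3):.3f}")  # 0.083
for u in range(11):
    print(f"P(u = {u:2d} | n_B = 3, N = 10) = {posterior(u, 3):.3f}")
```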

Todo: compare this to the bent coin.



Source

David MacKay