A Markov chain has transition probability matrix
Find
(3.1.1 from K and P) A simplified model for the spread of a disease (Covid-19) goes this way: The total population size is $N=5$, of which some are diseased (have the disease) and some are healthy. Encounters are such that any one pair of individuals is just as likely to meet as any other pair. At any time instant, assume that exactly one pair of individuals (selected uniformly at random) may meet. If one of these individuals is diseased and the other is not, then the disease is transmitted from the diseased to the healthy with probability $\alpha = 0.1$. Otherwise, no disease transmission takes place. Let $X_n$ denote the number of diseased persons in the population at the end of the $n$th period. Specify the transition probability of this Markov chain.
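A quick way to check the answer: with $i$ diseased, $i(N-i)$ of the $\binom{N}{2}$ equally likely pairs are mixed, so $P_{i,i+1} = \alpha\, i(N-i)/\binom{N}{2}$ and otherwise the chain stays put. A short Python sketch (the function name is ours):

```python
from math import comb

def disease_transition_matrix(N=5, alpha=0.1):
    """Transition matrix for X_n, the number of diseased individuals."""
    pairs = comb(N, 2)  # number of equally likely pairs that can meet
    P = [[0.0] * (N + 1) for _ in range(N + 1)]
    for i in range(N + 1):
        # A new infection needs a (diseased, healthy) pair: i*(N-i) of them.
        p_up = alpha * i * (N - i) / pairs
        if i < N:
            P[i][i + 1] = p_up
        P[i][i] = 1.0 - p_up
    return P

P = disease_transition_matrix()
```

Note that states $0$ and $N$ come out absorbing automatically, since $i(N-i) = 0$ there.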
(3.4.4 from K and P) A coin is tossed repeatedly until two successive heads appear. Find the mean number of tosses required.
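First-step analysis gives the answer; a sketch in exact rational arithmetic (the state names $e_0$, $e_1$ are ours):

```python
from fractions import Fraction

# e0: expected remaining tosses with no progress yet;
# e1: expected remaining tosses when the last toss was a head.
#   e0 = 1 + (1/2) e1 + (1/2) e0
#   e1 = 1 + (1/2) * 0 + (1/2) e0
# Substituting e1 into the e0 equation: (1/4) e0 = 3/2.
e0 = Fraction(3, 2) / Fraction(1, 4)
e1 = 1 + Fraction(1, 2) * e0
```

The mean number of tosses for a fair coin comes out to $e_0 = 6$.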
A red die is rolled a single time. A green die is rolled repeatedly. The game stops at the first roll where the sum of the two dice is either 4 or 7. What is the probability that the game stops with a sum of 4?
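One way to check the answer: condition on the red die and note that, given red $= r$, only green rolls landing on $4-r$ or $7-r$ matter, so the game stops at 4 with probability $P(g = 4-r)/(P(g = 4-r) + P(g = 7-r))$. A sketch:

```python
from fractions import Fraction

def p_stop_at_4():
    """P(game stops with sum 4), conditioning on the red die r."""
    total = Fraction(0)
    for r in range(1, 7):
        # Probabilities that a single green roll makes the sum 4 or 7.
        p4 = Fraction(1, 6) if 1 <= 4 - r <= 6 else Fraction(0)
        p7 = Fraction(1, 6) if 1 <= 7 - r <= 6 else Fraction(0)
        total += Fraction(1, 6) * (p4 / (p4 + p7))  # p7 > 0 for every r
    return total
```

Note the answer is not $3/9 = 1/3$ (the unconditional ratio): for $r \geq 4$ a sum of 4 is impossible while 7 remains reachable, which pulls the probability down to $1/4$.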
Suppose the craps dice are shaved on faces 1-6 such that the pmf of a single die is
where . Find the win probability using the method described in class (see also KP, section 2.2).
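Since the shaved pmf is not reproduced here, the sketch below implements the conditional-stopping method for a generic die pmf and checks it on fair dice (the function name is ours; a shaved pmf can be passed in the same way):

```python
from fractions import Fraction

def craps_win_probability(die_pmf):
    """Win probability at craps for two iid dice with P(face) = die_pmf[face]."""
    # pmf of the sum of the two dice
    sum_pmf = {}
    for a, pa in die_pmf.items():
        for b, pb in die_pmf.items():
            sum_pmf[a + b] = sum_pmf.get(a + b, Fraction(0)) + pa * pb
    p = lambda s: sum_pmf.get(s, Fraction(0))
    win = p(7) + p(11)  # immediate win on the come-out roll
    for point in (4, 5, 6, 8, 9, 10):
        # Once a point is set, only rolls of `point` (win) or 7 (loss) matter.
        win += p(point) * p(point) / (p(point) + p(7))
    return win

fair = {face: Fraction(1, 6) for face in range(1, 7)}
```

For fair dice this recovers the familiar $244/495 \approx 0.4929$.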
Let $N$ have a Poisson distribution with parameter $\lambda$. Suppose $\lambda$ itself is random, following an exponential density with parameter $\theta$.
Recall Polya’s Urn. An urn contains $a$ red balls and $b$ green balls to start with. A ball is drawn at random and it is returned to the urn along with another ball of the same color. Let $X_n$ be the fraction of red balls after the $n$th draw.
Let be iid rvs for . Let and . Show that $X_n$ defines a martingale, and determine the limit of $X_n$ as $n \to \infty$.
Suppose $X$, $Y$ and $Z$ are discrete random variables with a joint pmf . Show that
Hint: Start with the definition where
If is a martingale, compute in terms of . Hint: Start with
A Markov chain on states has the transition probability matrix
and initial distribution
Determine .
A Markov chain $X_0, X_1, X_2, \ldots$ has the transition probability matrix
Determine the conditional probabilities
Suppose $N$ is a geometric random variable taking values in $\{0,1,\ldots\}$ with parameter $p$. (By inspecting the range of $N$, we see that $N$ is, in fact, a shifted geometric random variable representing the number of failures before the first success.) Let
where $\{\xi_i\}$ are iid random variables independent of $N$, and $X = 0$ if $N = 0$. Find the cdf of $X$. Hint: be mindful of the $N=0$ case. Assume that $\xi_i$ are iid $\Exponential(1)$. Simplify your answers as much as possible.
Solution:
Let $f$ be the density function of $\xi_{i}$. Then
We know that a sum of independent exponentials produces a Gamma distribution (see page 30, section 1.4.4 in Karlin and Pinsky 4th edition). Then, the density of
is
Let us find the cdf of $X$.
Now we must treat the cases $t \geq 0$ and $t < 0$ separately, since $X = 0$ when $N = 0$.
Thus, the only interesting case is when $t \geq 0$; we have
Putting this together, we get
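The computation leads to the closed form $F(t) = 1-(1-p)e^{-pt}$ for $t \geq 0$: an atom of size $p$ at zero plus an exponential tail. A Monte Carlo sketch, assuming this closed form, to sanity-check it (the parameter values are arbitrary):

```python
import math
import random

random.seed(0)

p = 0.3  # arbitrary test value of the geometric parameter

def sample_X():
    """One draw of X = xi_1 + ... + xi_N, N = number of failures before success."""
    total = 0.0
    while random.random() >= p:           # a failure occurs w.p. 1 - p
        total += random.expovariate(1.0)  # one Exp(1) summand per failure
    return total

t = 1.5
trials = 200_000
empirical = sum(sample_X() <= t for _ in range(trials)) / trials
closed_form = 1 - (1 - p) * math.exp(-p * t)  # F(t) = 1 - (1-p) e^{-pt}, t >= 0
```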
Suppose that three contestants on a quiz show are each given the same question and that each answers it correctly, independently of the others, with probability $p$. But the difficulty of the question is itself a random variable, so suppose that $p$ is $\Uniform(0,1)$. What is the probability that exactly two of the contestants answer correctly?
Solution:
Let $X$ be the number of contestants answering the question correctly.
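Integrating $\binom{3}{2}p^2(1-p)$ over $p \in [0,1]$ gives the answer; the Beta integral $\int_0^1 p^k(1-p)^{n-k}\,dp = k!(n-k)!/(n+1)!$ makes the computation exact. A sketch:

```python
from fractions import Fraction
from math import comb, factorial

def p_exactly(k, n=3):
    """P(exactly k of n answer correctly), averaging over p ~ Uniform(0,1).

    Integral_0^1 C(n,k) p^k (1-p)^(n-k) dp = C(n,k) * k!(n-k)!/(n+1)!.
    """
    return comb(n, k) * Fraction(factorial(k) * factorial(n - k),
                                 factorial(n + 1))
```

Each value $k = 0,1,2,3$ turns out equally likely, so the answer is $1/4$.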
Let $X_0,X_1,\ldots$ be iid nonnegative random variables having a pdf. Let $N$ be the first index $k$ for which $X_k > X_0$. $N = 1$ if $X_1 > X_0$, $N=2$ if $X_1 \leq X_0, X_2 > X_0$ and so on. Find the pmf for $N$ and the mean $\E[N]$. (Interpretation: $X_0$, $X_1$ are bids on the car you are trying to sell. $N$ is the index of the first bid that is better than the initial bid.)
Solution:
Let $f$ and $F$ be the pdf and cdf of $X_{i}$, respectively.
Before we tackle the general $k \geq 2$ case, we define the tail of $X_0$ as $T(t) = 1 - F(t)$, and hence $dT = T'(t)dt = -f(t)dt$.
where we have used the last line to define the integral $I_k$. Now we evaluate the integral $I_k$ for $k \geq 2$ using integration by parts (when in doubt…).
and therefore, moving the $(k-1)I_k$ to the LHS of the above equation,
where we have used $I_1 = 1$. Thus
Then,
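By symmetry, $P(N > k)$ is the probability that $X_0$ is the largest of $k+1$ iid draws, namely $1/(k+1)$, so $P(N=k) = 1/k - 1/(k+1) = 1/(k(k+1))$ and $\E[N] = \sum_k 1/(k+1) = \infty$. A simulation sketch with $\Uniform(0,1)$ bids (the distribution choice is arbitrary, since the pmf of $N$ is distribution-free):

```python
import random

random.seed(1)

def first_better_bid():
    """Index N of the first Uniform(0,1) draw exceeding the initial draw X_0."""
    x0 = random.random()
    k = 1
    while random.random() <= x0:  # bid k failed to beat x0; try again
        k += 1
    return k

trials = 100_000
counts = {}
for _ in range(trials):
    k = first_better_bid()
    counts[k] = counts.get(k, 0) + 1

# Compare to P(N=1) = 1/2, P(N=2) = 1/6, P(N=3) = 1/12, ...
```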
Suppose $X_1,X_2,X_3$ are discrete random variables. Show that
Solution:
We could also do this the following way for discrete $X_1,X_2$ and $X_3$. The conditional expectation is a \emph{function} of the values of $X_2,X_3$. So
using the law of total probability in the last equation.
Conditional dependence and independence.
Draw uniformly at random from and then draw two independent samples and with distribution. Compute the (unconditional) correlation coefficient of and .
Let $N$ have a Poisson distribution with parameter $\lambda$. Suppose that, conditioned on $N = n$, we draw a random variable $M \sim \Binomial(n, p)$. Let $L = N - M$. Show that $M$ and $L$ have Poisson distributions with parameters $\lambda p$ and $\lambda(1-p)$, and that $M$ and $L$ are independent.
Let $X$ and $Y$ be jointly distributed random variables whose joint probability mass function is given in the following table:
Show that the covariance between $X$ and $Y$ is zero even though $X$ and $Y$ are not independent.
Let $A_0, A_1, \ldots, A_n$ be events of non-zero probability.
Show that
Solution: The equation holds when $n=0$. Suppose it holds for all $n\leq m$. Then for $n=m+1$,
Suppose $U$ has uniform distribution in $[0,1]$. Derive the density function for the random variables
Refer to section 1.2.6 in your textbook.
Solution: 1. As $U\sim \Uniform[0,1]$, the range of $Y$ is $(-\infty,0]$. Given $t\leq 0$,
So $f_{Y}(t)=e^{t}$ for $t\in(-\infty,0]$ and zero elsewhere.
2.
Range of $W_{n}$ is $[0,1]$. Given $t\in[0,1]$,
\begin{equation}
\begin{split}
P(W_{n}\leq t) &= P(U^{n}\leq t)\\
&= P(U\leq t^{1/n})\\
&= t^{\frac{1}{n}}
\end{split}
\end{equation}
So $f_{W_{n}}(t)=\frac{1}{n}t^{\frac{1}{n}-1}$ for $t\in[0,1]$ and zero elsewhere.
If $X \sim \Exponential(2)$ and $Y \sim \Exponential(3)$ are independent, find $\Prob(X < Y)$.
Solution:
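A sketch checking the standard formula $\Prob(X<Y) = \lambda_X/(\lambda_X+\lambda_Y)$ against a direct Riemann sum of $\int_0^\infty f_X(x)\Prob(Y>x)\,dx$:

```python
import math

lam_x, lam_y = 2.0, 3.0

# P(X < Y) = Integral_0^inf  lam_x e^{-lam_x x} * e^{-lam_y x} dx
#          = lam_x / (lam_x + lam_y)
exact = lam_x / (lam_x + lam_y)

# Crude left Riemann sum of the same integral as a sanity check.
dx = 1e-4
numeric = sum(lam_x * math.exp(-(lam_x + lam_y) * i * dx) * dx
              for i in range(int(20 / dx)))
```

With the given rates the answer is $2/(2+3) = 2/5$.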
A quarter is tossed repeatedly until a head appears. Let $N$ be the trial number when this head appears. Then, a dollar is tossed $N$ times. Let $X$ count the number of times the dollar comes up tails. Determine $P(X = 0)$, $P(X = 1)$ and $\E[X]$.
Solution:
Let $S=P(X=1)$, then
So $S=P(X=1)=\frac{4}{9}$. The way to see this in general is by defining
and then substituting $p=1/4$.
since $N$ is geometric taking values in $\{1,2,\ldots\}$, for which $E[N] = 1/p$.
In the above we used that if $Z \sim \Binomial(n,p)$ then
with $p = 1/2$.
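The three answers ($P(X=0)=1/3$, $P(X=1)=4/9$, $\E[X]=1$) can be checked numerically by truncating the conditioning series over $N$ (a sketch):

```python
from math import comb

# N ~ Geometric(1/2) on {1,2,...} (fair quarter), and X | N=n ~ Binomial(n, 1/2).
def p_x(k, n_max=200):
    """P(X = k) = sum over n >= max(k,1) of (1/2)^n C(n,k) (1/2)^n, truncated."""
    return sum(0.5 ** n * comb(n, k) * 0.5 ** n
               for n in range(max(k, 1), n_max))

p0 = p_x(0)                                         # series sums to 1/3
p1 = p_x(1)                                         # series sums to 4/9
ex = sum(0.5 ** n * n / 2 for n in range(1, 200))   # E[X] = E[N]/2 = 1
```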
Let $N$ have Poisson distribution with parameter $\lambda = 1$. Conditioned on $N = n$, let $X$ have a uniform distribution over the integers $0,1,\ldots,n+1$. What is the marginal distribution for $X$?
Solution
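A numerical sketch of the marginal: conditioned on $N=n$, $X$ is uniform over the $n+2$ points $0,\ldots,n+1$, so $P(X=k) = \sum_{n \geq \max(k-1,0)} \frac{e^{-1}}{n!}\cdot\frac{1}{n+2}$.

```python
from math import exp, factorial

def marginal(k, n_max=60):
    """P(X = k) = sum over n >= max(k-1, 0) of  e^{-1}/n!  *  1/(n+2)."""
    return sum(exp(-1) / factorial(n) / (n + 2)
               for n in range(max(k - 1, 0), n_max))

pmf = [marginal(k) for k in range(62)]
```

As a spot check, the $k=0$ series telescopes ($1/(n!(n+2)) = 1/(n+1)! - 1/(n+2)!$), giving $P(X=0) = e^{-1}$.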
Do men have more sisters than women have? In a certain society, all married couples use the following strategy to determine the number of children that they will have: If the first child is a girl, they have no more children. If the first child is a boy, they have a second child. If the second child is a girl, they have no more children. If the second child is a boy, they have exactly one additional child. (We ignore twins, assume sexes are equally likely, and the sex of distinct children are independent random variables, etc.)
Solution
The probabilities for the four family types are (1) G with probability $1/2$, (2) BG with probability $1/4$, (3) BBG with probability $1/8$, and (4) BBB with probability $1/8$. We can use this to answer all the questions.
Let $N$ be the number of kids.
Let $G$ be the number of girls
In the total population, type 2 families contribute 1 boy, type 3 contribute 2 and type 4 contribute 3. So if you choose a male child at random, the probabilities that he is from a type 2, type 3 or type 4 family are respectively
where we have weighted each family by the number of boys they have produced.
Maybe this doesn’t convince the more rigorously minded (as it shouldn’t). Think of the next generation as being determined by the families that are currently reproducing. Each child they produce is labeled with their gender and their type. Let be the number of individuals of type being produced by family . By the law of large numbers
If $B_i$ is the number of boys being produced by family $i$, then
The fraction of boys in the population satisfies
Then, for example, if you choose a boy at random, the probability that the boy comes from a type 2 family is
as we obtained above.
Let be the number of sisters, and clearly . Then
and
Let be the number of brothers, so depending on whether it is a type (2), (3), or (4) family. Similar to what we did for part 3,
As a bonus question, choose a random individual from the population. What is the probability that they are male?
Let be the random child selected from the population. We want to find the probability that this child is a boy. There are a few ways to understand it.
a. In each family, we’ve assumed that both sexes are equally likely. So each time a child is added to the population, there is an equal chance it is a boy or a girl. So the long-run proportion of boys (and of girls) must be one half, and hence .
b. Think of the next generation as being determined by $M$ families that are currently reproducing. Let $B_i$ be the number of boys family $i$ produces and let $G_i$ be the number of girls. Then $B_i + G_i$ has the same distribution as $N$ listed above. The total number of boys produced by the $M$ families satisfies
And similarly,
Therefore, the proportion of boys, as $M \to \infty$, is
Interesting, huh?
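The bonus answer can also be verified exactly from the four family types (a small sketch; the dictionary encoding is ours):

```python
from fractions import Fraction

# Family types from the stopping rule: G, BG, BBG, BBB
# with probabilities 1/2, 1/4, 1/8, 1/8.
families = {"G": Fraction(1, 2), "BG": Fraction(1, 4),
            "BBG": Fraction(1, 8), "BBB": Fraction(1, 8)}

e_boys = sum(p * kids.count("B") for kids, p in families.items())
e_girls = sum(p * kids.count("G") for kids, p in families.items())
boy_fraction = e_boys / (e_boys + e_girls)
```

Both expectations come out to $7/8$, so the proportion of boys is exactly one half even though the stopping rule "favors" girls.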
Let $N$ cards carry the distinct numbers $x_1,\ldots,x_N$. If two cards are drawn at random without replacement, show that the correlation coefficient $\rho$ between the numbers appearing on the two cards is $-1/(N-1)$.
Solution
Let $X,Y$ be the numbers on the first and second cards, respectively.
So we know $E[X]=E[Y]$ and $E[X^{2}]=E[Y^{2}]$. Let
Then
The correlation coefficient $\rho$ of $X$ and $Y$ is given by
Also we know
There is only $\E[XY]$ left to compute.
Substituting these back into the formula for $\rho$, we have
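A concrete check: enumerate all ordered pairs drawn without replacement from one particular set of $N=5$ distinct numbers (the numbers themselves are arbitrary):

```python
from itertools import permutations
from statistics import mean

xs = [3.0, 7.0, 8.0, 15.0, 26.0]   # any N = 5 distinct numbers work
pairs = list(permutations(xs, 2))  # 20 equally likely ordered draws w/o replacement

mu = mean(xs)                      # E[X] = E[Y] by symmetry
cov = mean((a - mu) * (b - mu) for a, b in pairs)
var = mean((a - mu) ** 2 for a, b in pairs)  # Var(X) = Var(Y) by symmetry
rho = cov / var                    # should equal -1/(N-1) = -1/4
```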
Let $X$ and $Y$ be independent random variables having distribution $F_X$ and $F_Y$ respectively.
Solution
Since $X,Y$ are independent,
since if $\max(X,Y) \leq t$ then both $X \leq t$ and $Y \leq t$.
Let $U$ be Poisson distributed with parameter $\lambda$ and let $V = 1/(1+U)$. Find the expected value of $V$.
Solution
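The defining series telescopes to $\E[V] = (1-e^{-\lambda})/\lambda$; a quick numerical check (with $\lambda = 2$ as an arbitrary choice):

```python
from math import exp, factorial

lam = 2.0

# E[V] = sum_u  e^{-lam} lam^u/u! * 1/(u+1)
#      = (1/lam) sum_u e^{-lam} lam^{u+1}/(u+1)!  = (1 - e^{-lam})/lam
series = sum(exp(-lam) * lam ** u / factorial(u) / (u + 1) for u in range(80))
closed_form = (1 - exp(-lam)) / lam
```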
Let $U,V,W$ be independent random variables with equal variances $\sigma^2$. Let $X = U + V$ and let $Y = V - W$. Find the covariance of $X$ and $Y$.
Solution
As $U,V,W$ are independent,
Another way to do this is to write
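Either way, the answer is $\operatorname{Cov}(X,Y) = \operatorname{Var}(V) = \sigma^2$. A Monte Carlo sketch with standard normal $U,V,W$ (an arbitrary choice giving $\sigma^2 = 1$):

```python
import random

random.seed(2)
sigma = 1.0
n = 200_000

# Cov(X, Y) = Cov(U+V, V-W) = Var(V) = sigma^2 when U, V, W are independent.
acc = 0.0
for _ in range(n):
    u, v, w = (random.gauss(0.0, sigma) for _ in range(3))
    acc += (u + v) * (v - w)   # E[X] = E[Y] = 0 here, so Cov = E[XY]
cov_hat = acc / n
```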
Homework will be a little long this week, since it is mostly review.
Which topics about stochastic processes most interest you? (E.g. Brownian motion, stock market modeling, statistical physics, etc.)
Given that you end up working as hard as you think you can work this semester, what grade do you think you can achieve in this class? How important is the letter grade to you?
Would you prefer a challenging class at a good pace so you can learn a lot, or do you prefer an easy-paced class so you can learn the material well?
Of the three general principles for modeling probabilities stated in K&P chapter 1, in your opinion, which one applies best to the modeling of share prices in the stock market?
Plot the distribution function
Solution: The density function is
Determine the distribution function, mean and variance corresponding to the triangular density
Solution
Let $1_{A}$ be the indicator random variable associated with an event $A$, defined to be one if $A$ occurs, and zero otherwise. Show
Solution:
If $\omega \in A^{c}$, then
If $\omega \in A$, then
So $1_{A^{c}}=1-1_{A}$
If $\omega\in A\cap B$, then
If $\omega \notin A\cap B$, then $\omega$ lies in $(A\cup B)^{c}$, $A-B$ or $B-A$, and we have
So
Note that $\min(1_{A},1_{B})$ has only two possible values $\{0,1\}$.
So
Note that $\max(1_{A},1_{B})$ has only two possible values $\{0,1\}$.
A pair of dice is tossed. If the two outcomes are equal, the dice are tossed again, and the process is repeated. If the outcomes are unequal, their sum is recorded. Determine the probability mass function for the sum.
Solution:
There are only 30 possible outcomes. Among these, 2 outcomes each give the sums 3, 4, 10 and 11; 4 outcomes each give the sums 5, 6, 8 and 9; and 6 outcomes give the sum 7.
Define $X =$ sum of the two dice. Then the possible values of $X$ are $\{3,4,5,6,7,8,9,10,11\}$
And we have the following probability
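The tally can be reproduced by direct enumeration (a quick sketch):

```python
from fractions import Fraction

# Enumerate the 30 equally likely unequal outcomes and tally the sums.
pmf = {}
for a in range(1, 7):
    for b in range(1, 7):
        if a != b:  # equal outcomes are rerolled, so they never count
            s = a + b
            pmf[s] = pmf.get(s, Fraction(0)) + Fraction(1, 30)
```

Note that the sums 2 and 12 never appear, since they require doubles.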