MTH 202 Homework


Homework

HW 12 due Apr 24

Problem 1

For each value of , let have a Poisson distribution with parameter . Let for Verify that

Here stands for any remainder term of order less than as .

Problem 2

Shocks occur to a system according to a Poisson process of rate . Suppose that the system survives each shock with probability $\alpha$, independently of other shocks, so that its probability of surviving shocks is . What is the probability that the system is surviving at time $t$?
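A numerical sanity check of the standard computation: conditioning on the number of shocks $N(t)$ and summing $P(N(t)=k)\,\alpha^k$ should reproduce $e^{-\lambda(1-\alpha)t}$. The values of `lam`, `alpha`, and `t` below are assumed for illustration, not the problem's data.

```python
import math

# Conditioning on the number of shocks by time t and summing
# P(N(t) = k) * alpha**k should match the thinned rate exp(-lam*(1-alpha)*t).
# lam, alpha, t are assumed illustrative values.
lam, alpha, t = 2.0, 0.8, 3.0

series = sum(math.exp(-lam * t) * (lam * t) ** k / math.factorial(k) * alpha ** k
             for k in range(100))
closed_form = math.exp(-lam * (1 - alpha) * t)
print(series, closed_form)  # both approximately 0.30119
```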

Problem 3

Determine numerical values to three decimal places for , , when

(a) has a binomial distribution with parameters and .

(b) has a binomial distribution with parameters and .

(c) has a Poisson distribution with parameter .

Problem 4

Suppose that 100 tags, numbered , are placed into an urn, and 10 tags are drawn successively, with replacement. Let A be the event that no tag is drawn twice. Show that

Use the approximation for to get

The law of rare events says that if

where , then let where , then

Find

  1. The appropriate values for the parameters , that correctly represent our problem.
  2. Identify $Y$ and approximate the probability of the event $A$ you were asked to compute above using $Y$.
  3. Identify the worst-case error in the law of rare events for this particular problem.
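As a cross-check, here is the exact probability of $A$ next to a Poisson-style approximation $e^{-\sum_{i=1}^{9} i/100}$; reading the approximation this way is my interpretation of the setup.

```python
import math

# Exact probability that 10 tags drawn with replacement from 100 are all
# distinct, versus the law-of-rare-events style approximation exp(-45/100).
exact = 1.0
for i in range(10):
    exact *= (100 - i) / 100

approx = math.exp(-sum(i / 100 for i in range(10)))  # e^{-45/100}
print(exact, approx)  # approximately 0.6282 and 0.6376
```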

HW 11 due Apr 17

Problem 1

Consider the Markov chain on whose transition probability matrix is

  1. Verify that is a stationary distribution.

  2. Show that the first return distribution to state is given by and for .

  3. Calculate the mean return time and verify that .

Problem 2

Let be a probability distribution, and consider the Markov chain whose transition probability matrix is

What condition on the probability distribution is necessary and sufficient in order that a limiting distribution exist, and what is this limiting distribution? Assume and so that the chain is aperiodic.

Problem 3

Given the transition matrix

  1. Draw a picture of the Markov chain, and determine the communication classes.

  2. Determine the period(s) of the communication class(es).

  3. Determine the limits as of for .

Hint: Read section 4.5 of the textbook

Problem 4

A Markov chain on states $0, 1, \ldots$ has transition probabilities

Find the stationary distribution.

Problem 5

  1. Consider the reflected random walk on (if you’re at state , then you go to state with probability one), with probability of going forward, and probability of going backwards. If , determine the probability that you return to the origin.

  2. Now with , determine the stationary distribution. What is the mean return time to given that you start at state ? If you start in state , let be the first hitting time of . Determine .
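Since the walk's parameters were not preserved above, here is a simulation sketch under assumed dynamics: from state $0$ move to $1$ with probability one, and from $k \geq 1$ move up with probability $p$ and down with probability $q = 1 - p$. For $p < 1/2$, detailed balance for this birth-death chain gives $\pi_0 = (q-p)/(1+q-p)$, so the mean return time to $0$ is $1/\pi_0$. The value $p = 0.3$ is an assumption.

```python
import random

# Simulate a reflected walk on {0,1,2,...}: 0 -> 1 w.p. 1; from k >= 1,
# k -> k+1 w.p. p and k -> k-1 w.p. q = 1-p.  Dynamics and p are assumed.
random.seed(1)
p = 0.3
q = 1 - p

state, visits_to_0, steps = 0, 0, 500_000
for _ in range(steps):
    if state == 0:
        state = 1
    elif random.random() < p:
        state += 1
    else:
        state -= 1
    if state == 0:
        visits_to_0 += 1

# long-run fraction of time at 0 versus the detailed-balance prediction
empirical_pi0 = visits_to_0 / steps
print(empirical_pi0, (q - p) / (1 + q - p))
```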

HW 10 due Apr 10

Problem 1

Consider the Markov Chain given by the transition matrix

on the state space $0,1,2,3$. Recall the first return pmf $f_{ii}^{(n)}$ defined in the textbook and in our notes. Using the equation

determine for .

Problem 2

Let be a random variable taking values in the nonnegative integers. Show that

Problem 3

An individual either drives their car or walks in going from their home to their office in the morning, and from their office to their home in the afternoon. They use the following strategy: If it is raining in the morning, then they drive the car, provided it is at home to be taken. Similarly, if it is raining in the afternoon and their car is at the office, then they drive the car home. They walk on any morning or afternoon that it is not raining or if the car is not where they are. Assume that, independent of the past, it rains during successive mornings and afternoons with constant probability p. In the long run, on what fraction of days does our person walk in the rain? What if they own two cars?
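The long-run fraction for the one-car case can be checked by simulation. The sketch below tracks whether the car is at the person's current location when each trip starts, and compares the result with $p(1-p)/(2-p)$, which is what first-step analysis of that two-state chain gives; the rain probability $p = 0.3$ is an assumed illustrative value.

```python
import random

# Monte Carlo sketch of the one-car strategy.  State: whether the car is at
# the person's current location when a trip starts.  p = 0.3 is assumed.
random.seed(0)
p = 0.3

car_here = True          # car starts wherever the person starts
walks_in_rain = 0
trips = 200_000
for _ in range(trips):
    rain = random.random() < p
    if car_here:
        if rain:
            pass                  # drive; car arrives with them -> present next trip
        else:
            car_here = False      # walk; car left behind -> absent next trip
    else:
        if rain:
            walks_in_rain += 1    # forced to walk in the rain
        car_here = True           # car was at the destination -> present next trip

print(walks_in_rain / trips, p * (1 - p) / (2 - p))
```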

Problem 4

  1. Show that if $i$ and $j$ communicate, then $d(i) = d(j)$.

  2. Show that a finite, irreducible, aperiodic Markov chain is recurrent.

Problem 5

Problem of the points.

Blaise Pascal and Pierre Fermat discussed the problem of the points in 1654. Suppose you and your friend have contributed a sum $S$ to a pot of money, and are playing until the first person wins $n$ rounds of a game. Assume that it is a game of chance and skill, but you and your friend are equally likely to win each round. Suppose the game stops after $r$ rounds due to an earthquake. Then, you must divide the pot $S$ fairly based on the fact that you have won $a < n$ rounds, and your friend has won $r - a < n$. Give a formula for the amount of money you should get in terms of $S$, $r$, $n$ and $a$.
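The classic Pascal and Fermat resolution is to give each player the pot times the probability that they would have gone on to win. That probability satisfies a simple recursion, sketched below; the helper `f` and the example values are illustrative, not a prescribed solution.

```python
from functools import lru_cache

# f(i, j): probability you win the pot when you still need i rounds and your
# friend still needs j, each round being a fair coin flip.  Your fair share
# of the pot would then be S * f(n - a, n - (r - a)).
@lru_cache(maxsize=None)
def f(i, j):
    if i == 0:
        return 1.0
    if j == 0:
        return 0.0
    return 0.5 * f(i - 1, j) + 0.5 * f(i, j - 1)

# Example: you need 1 more win, your friend needs 2 more.
print(f(1, 2))  # 0.75
```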

HW 9 due Apr 03

Problem 1

Suppose $\{a_i\}$ is a sequence such that $a_i \to a$ as $i \to \infty$. Show that the Cesaro mean converges to $a$ as well:

I expect an $\epsilon$-$N$ proof.

Solution Since , for every , there is a such that for all , we have

So fix this and let . Then,

As , the first term goes to in the above since is fixed. So there is an such that for all , we have

Combining these two estimates, we get for ,

Since was arbitrary, we are done.
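A numerical illustration of the statement just proved, using the assumed demo sequence $a_i = 5 + 1/i$ (which converges to $5$):

```python
# Cesaro means of a_i = 5 + 1/i should also converge to 5, just more slowly.
a = 5.0
terms = [a + 1 / i for i in range(1, 100_001)]

cesaro = []
running = 0.0
for n, x in enumerate(terms, start=1):
    running += x
    cesaro.append(running / n)   # (a_1 + ... + a_n) / n

print(cesaro[99], cesaro[-1])    # the error shrinks as n grows
```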

Problem 2

  1. Determine the long run / limiting distribution for the Markov chain whose transition probability matrix is

    Solution:

  2. (4.1.11) Suppose a production process changes states according to a Markov Process whose transition probability matrix is given by

    1. Determine the limiting probabilities and given that and
    2. Suppose that states and are “in-control” while states and are deemed out of control. In the long run, what fraction of time is the process out of control?
    3. In the long run, what fraction of transitions are from an in-control state to an out-of-control state?

    Solution:

    From transition probability matrix

    $\Rightarrow \pi_{0}=\frac{117}{379}, \pi_{3}=\frac{62}{379}$

    $\pi_{2}+\pi_{3}=\frac{143}{379}$ is the fraction of time being out of control.

    Let $a_{k}=P\{X_{k}=0, X_{k+1}=2\}+P\{X_{k}=0, X_{k+1}=3\}+P\{X_{k}=1, X_{k+1}=2\}+P\{X_{k}=1, X_{k+1}=3\}$

    Then, using conditional probability, $a_{k}=p_{02}P(X_{k}=0)+p_{03}P(X_{k}=0)+p_{12}P(X_{k}=1)+p_{13}P(X_{k}=1)$

    Using the limiting probabilities from part 1, the answer is $\frac{591}{3790}$.

Problem 3

Consider the simple random walk with probabilities $p$ and $q = 1 - p$ of going forwards or backwards on the state space $\{1,2,3,4\}$. The states $1$ and $4$ are absorbing.

  1. What are the communication classes?
  2. What is the periodicity of each communication class?
  3. Let be a transient class in this system. Show that for any in the communication class.

Solution

Classes: $\{1\}, \{2,3\}, \{4\}$.

States $1$ and $4$ are absorbing, so each singleton class has period $d=1$; the class $\{2,3\}$ has period $d=2$.

Order the states as $1,4,2,3$ and write

where $0$ and $I$ represent the $2 \times 2$ $0$-matrix and identity matrix respectively, and

For $n=2k$,

For $n=2k+1$,

$\{2,3\}$ is the transient class since $\sum_{n}P_{22}^{(n)}<\infty$. Taking $n\rightarrow\infty$ in $Q^{(n)}$, we obtain the conclusion.

Problem 4

(Problem 4.3.3 from K&P)

A Markov chain on states $\{0, 1, 2, 3, 4, 5\}$ has transition probability matrix

(a)

(b)

Find all communicating classes; which classes are transient and which are recurrent?

Problem 5

Show that a finite chain that is aperiodic and irreducible is regular and recurrent. I expect a proper proof from the definitions.

Solution We need to show that there is an such that for all . Let be our finite state space. Now for each state , we showed, since the period , that

is finite. Let be the largest number in this finite set. Then, for , . Now do this for each state , and let . Then for , for every . So we have that the diagonal entries of this matrix are all positive.

Fix and . Then by irreducibility, there is an such that . By the Chapman-Kolmogorov equation, for every , we have

This can be done for every pair of states , and let . Therefore, for

This shows that the chain is regular (and then some).

HW 8 due Mar 27

Problem 1

Suppose I start a chain on $N$ states with $P(X_0 = 1) = \pi_1,P(X_0 = 2) = \pi_2,\ldots$ where $\pi = (\pi_1,\ldots,\pi_N)$ is the stationary probability satisfying $\pi = \pi P$, where $P$ is a regular transition probability. What is $P(X_k = 1),P(X_k = 2),\ldots$ for fixed $k > 1$? Hint: first do the case $k=1$.

Solution We know $\pi=\pi P$.

Suppose that for all $k\leq n$ we have $\pi=\pi P^{(k)}$. Then

Thus, $\pi=\pi P^{(n)}$ for all $n\in\mathbb{N}$. As the initial distribution is $\pi$,
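A quick numerical illustration with a small hypothetical regular chain; the matrix $P$ and its stationary vector below are assumed examples, not the problem's data.

```python
# Check pi = pi P^(n) for a small assumed regular chain, using pure-Python
# row-vector times matrix products.
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]

def vec_mat(v, M):
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(M[0]))]

# Stationary distribution of this chain, found by solving pi = pi P by hand.
pi = [0.25, 0.5, 0.25]

v = pi[:]
for _ in range(10):      # pi P^10 should still equal pi
    v = vec_mat(v, P)
print(v)
```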

Problem 2

A Markov chain has transition probability

Determine its limiting distribution. What proportion of time, in the long run, does it spend in each state?

Solution:

Let $\pi=(\pi_{0}, \pi_{1}, \pi_{2})$ be its limiting distribution.

Problem 3

Consider the gambler’s ruin problem on states $0,\ldots,N$, with probability $p$ of going from state $i$ to $i+1$ and probability $q$ of going from state $i$ to $i-1$ for $i = 1,\ldots,N-1$. Fix a state $k$, where $0 < k < N$, and let $W_{ik}$ be the mean total visits to state $k$ starting from $i$ (before being absorbed into state $0$ or $N$). Formally, the definition is

where $1_A$ is the usual indicator rv of the event $A$. Show that

for $i = 1,\ldots,N-1$ and $W_{0k} = W_{Nk} = 0$. $\delta_{ik}$ is $1$ if $k=i$ and $0$ otherwise.

You may try to solve the recurrence for fun!

Solution:

When $i\neq k$ and $i\neq 0$,

Using the Tower property,
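For fun, the recurrence can also be checked numerically. The sketch below uses assumed illustrative values $N = 5$, $p = 0.4$, computes $W = (I - Q)^{-1}$ on the transient states by iterating $W \leftarrow I + QW$ (which converges since $Q$ is transient), and tests the claimed identity at an interior state.

```python
# Numerical check of W_ik = delta_ik + p W_{i+1,k} + q W_{i-1,k} for
# gambler's ruin mean-visit counts; N and p are assumed values.
N, p = 5, 0.4
q = 1 - p
states = list(range(1, N))      # transient states 1..N-1
m = len(states)

def step_prob(i, j):
    # transition probability between transient states i, j
    if j == i + 1:
        return p
    if j == i - 1:
        return q
    return 0.0

Q = [[step_prob(i, j) for j in states] for i in states]
W = [[0.0] * m for _ in range(m)]
for _ in range(500):            # W <- I + Q W converges to (I - Q)^{-1}
    W = [[(1.0 if i == j else 0.0) + sum(Q[i][l] * W[l][j] for l in range(m))
          for j in range(m)] for i in range(m)]

# verify the recurrence at an interior state, e.g. i = 2, k = 3 (state labels)
i, k = 2, 3
lhs = W[i - 1][k - 1]
rhs = (1.0 if i == k else 0.0) + p * W[i][k - 1] + q * W[i - 2][k - 1]
print(lhs, rhs)
```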

Problem 4

Determine the following limits in terms of the transition probability matrix $P$ and the limiting distribution $\pi$ of a finite-state regular (irreducible, aperiodic) Markov chain .

Solution

  1. , so

  2. We have

  3. Again, making a change of variable , we get

Problem 5

Consider a Markov chain with probability

Assume $0 < p_0 < 1$, $\sum_{i=0}^N p_i = 1$ and find the limiting distribution.

HW 7 due Mar 20

Problem 1

A Markov chain has the transition probability matrix

with states labeled . It starts in state . Eventually, the process will end up in state 2. What is the probability that the time $T$ at which this happens is an odd number?

Solution

For $i \in \{0, 1\}$ let $u_i = P[T \text{ is odd } | X_0 = i]$ and $v_i = P[T \text{ is even } | X_0 = i]$, which mean that

So, we can get the system of equations

Substituting in $v_i = 1 - u_i$ we get that

which gives us

which we can solve to get $u_0 \approx 0.67669$ and $u_1 \approx 0.6015$ making the answer $\approx 0.67669$.

Problem 2

In a Markov chain, let and suppose is the set of absorbing states. Consider the absorption time . Assume

We can prove this later, but we will have to make assumptions about the reachability of the absorbing states. Otherwise, we could be in a situation where from some state , it is just not possible to get absorbed and thus .

We are going to prove the equation we derived in class more rigorously. For all

where . In matrix form, this is the equation .

  1. Fix . Show

  2. Show that for , if and otherwise.

  3. Show that for $j > 1$, if $k \not\in A$

    This lies at the heart of the proof, and the second equality should be shown using the Markov property and time homogeneity.

  4. Substitute the result of into and show

    and use this to complete the proof.

Solution

Part 1

We get the first equality from the definition of conditional expectation and the fact that zero times anything is zero (so we can start the summation at one). For the second, we use the law of total probability to see that

where, in the last line, we’ve used the Markov property. If you substitute this in, you get the desired result.

Part 2

We note that $T = j$ means that (1) $X_j \in A$ and (2) $X_i \notin A$ for all $i < j$. Since we are assuming that $X_0 \notin A$ (by definition of $v_i$), the second condition is satisfied automatically when $j = 1$. This means that $T = 1 \Leftrightarrow X_1 \in A$, i.e. $P(T = 1 | X_1 = k) = 1$ for $k \in A$ and $0$ otherwise.

Part 3

Again, by definition, $T = j$ means that (1) $X_j \in A$ and (2) $X_i \notin A$ for all $i < j$. This means that

In the first line we use the definition of $T = j | X_1 = k$. In the second line, we use the fact that since $A$ is the set of absorbing states, $X_1 \notin A$ means that $X_0 \notin A$ automatically. In the third line, we use time homogeneity and, in the fourth line we use the definition of $T = j - 1 | X_0 = k$.

Part 4

Plugging this in we get

We use the fact that and

Problem 3

An urn contains five tags, of which three are red and two are green. A tag is randomly selected from the urn and replaced with a tag of the opposite color. This continues until only tags of a single color remain in the urn. Let $X_n$ denote the number of red tags in the urn after the $n$th draw, with $X_0 = 3$. What is the probability that the game ends with the urn containing only red tags?

Solution

We count the number of red tags at each time so $X_n \in \{0, 1, 2, 3, 4, 5\}$. The associated matrix is

|   | 0   | 1   | 2   | 3   | 4   | 5   |
|---|-----|-----|-----|-----|-----|-----|
| 0 | 1   | 0   | 0   | 0   | 0   | 0   |
| 1 | 1/5 | 0   | 4/5 | 0   | 0   | 0   |
| 2 | 0   | 2/5 | 0   | 3/5 | 0   | 0   |
| 3 | 0   | 0   | 3/5 | 0   | 2/5 | 0   |
| 4 | 0   | 0   | 0   | 4/5 | 0   | 1/5 |
| 5 | 0   | 0   | 0   | 0   | 0   | 1   |

Let $u_i = P(X_T = 5 | X_0 = i)$ for $i = 1, 2, 3, 4$ and so we have the vector

where

which gets us (using a calculator)

which makes the answer $u_3 = \frac{17}{32}$.
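Instead of a calculator, the vector $u$ can be found by fixed-point iteration on $u_i = \sum_j P_{ij} u_j$, holding the boundary values $u_0 = 0$, $u_5 = 1$ fixed; a pure-Python sketch:

```python
# Value iteration for u_i = P(X_T = 5 | X_0 = i) on the tag-urn chain,
# using the transition matrix written above.
P = {
    1: {0: 1/5, 2: 4/5},
    2: {1: 2/5, 3: 3/5},
    3: {2: 3/5, 4: 2/5},
    4: {3: 4/5, 5: 1/5},
}
u = {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0, 5: 1.0}  # boundary 0 and 5 fixed
for _ in range(2000):
    for i in (1, 2, 3, 4):
        u[i] = sum(pr * u[j] for j, pr in P[i].items())
print(u[3])  # converges to 17/32 = 0.53125
```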

Problem 4

  1. Consider the two state Markov chain with and , where . Let be its transition matrix. Show (by induction) that

  2. Determine for whose transition probability matrix is

    Also determine .

Solution We have that

which you can check directly. This gives us the base case, $n = 1$.

Assume this is true for $n$. So we take

Part 2

The second term goes to zero (since ) and so, in general, we get

which, in our case, is
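With the (assumed) labeling $P(0\to1)=a$ and $P(1\to0)=b$, the standard two-state closed form can be checked numerically against brute-force matrix powers; the values of $a$ and $b$ below are illustrative.

```python
# Check P^n = (1/(a+b)) [[b, a], [b, a]] + ((1-a-b)^n/(a+b)) [[a, -a], [-b, b]]
# for P = [[1-a, a], [b, 1-b]], with assumed a = 0.3, b = 0.1.
a, b = 0.3, 0.1
P = [[1 - a, a], [b, 1 - b]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

n = 7
Pn = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(n):
    Pn = matmul(Pn, P)

s = a + b
r = (1 - s) ** n
closed = [[(b + a * r) / s, (a - a * r) / s],
          [(b - b * r) / s, (a + b * r) / s]]
print(Pn, closed)
```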

HW 6 due Mar 6

Problem 1 (Karlin and Pinsky, 4th ed, 3.1.12)

An MC has transition probability matrix

Find

  1. $\Prob(X_2 = 1, X_1 = 1 | X_0 = 0)$
  2. If the initial probability vector is $(\Prob(X_0 = 0), \Prob(X_0 = 1), \Prob(X_0 = 2)) = (\frac12,\frac13,\frac16)$, find $\Prob(X_2 = 1, X_1 = 1, X_0 = 0)$
  3. Find the vector $(\Prob(X_2 = 0), \Prob(X_2 = 1), \Prob(X_2 = 2))$ using matrix multiplication (and the same initial probability vector from the previous part).

Solution

Problem 2

(3.1.1 from K and P) A simplified model for the spread of a disease (Covid-19) goes this way: The total population size is $N=5$, of which some are diseased (have the disease) and some are healthy. The selection is such that an encounter between any one pair of individuals is just as likely between any other pair. At any time instance, let us assume that only one pair of individuals (randomly, uniformly selected) may meet. If one of these individuals is diseased and the other is not, then the disease is transmitted from the diseased to the healthy with probability $\alpha = 0.1$. Otherwise, no disease transmission takes place. Let $X_n$ denote the number of diseased persons in the population at the end of the nth period. Specify the transition probability of this Markov chain.

Solution

At some instant in time, say there are $k$ infected individuals. There are $\binom{5}{2}$ many pairs, and one among these is uniformly selected. For there to be $k+1$ infected individuals, an infected and an uninfected individual must meet, and disease transfer must happen (probability $\alpha$). Thus, for $k = 1,2,3,4$, the probability of going from $k$ to $k+1$ infected individuals is

The transition matrix is
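The matrix can also be built programmatically from the rule above, with $N = 5$ and $\alpha = 0.1$ as given in the problem:

```python
from math import comb

# From k infected, the chain moves to k+1 with probability
# alpha * k * (N - k) / C(N, 2), and otherwise stays at k.
N, alpha = 5, 0.1
P = [[0.0] * (N + 1) for _ in range(N + 1)]
for k in range(N + 1):
    up = alpha * k * (N - k) / comb(N, 2)
    if k + 1 <= N:
        P[k][k + 1] = up
    P[k][k] = 1.0 - up          # states 0 and N are absorbing (up = 0 there)
print(P[1])                     # from one infected: stay w.p. 0.96, infect w.p. 0.04
```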

Problem 3

(3.4.4 from K and P) A coin is tossed repeatedly until two successive heads appear. Find the mean number of tosses required.

  1. Define a Markov chain (let be the MC that represents the current number of successive heads that have appeared) and find its transition matrix.
  2. Condition on the first-step and compute the mean number of tosses required.

Solution The state space is $\{0,1,2\}$ and the transition probability matrix is

Let $T=\inf\{n: X_{n}=2\}$ and $g_{i}=E[T \mid X_{0}=i]$

$\Rightarrow g_{0}=6$.
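A simulation cross-check of $g_0 = 6$:

```python
import random

# Average number of fair-coin tosses until two heads in a row; first-step
# analysis above gives g_0 = 6.
random.seed(0)

def tosses_until_hh():
    run, n = 0, 0
    while run < 2:
        n += 1
        run = run + 1 if random.random() < 0.5 else 0   # extend or reset the head run
    return n

trials = 200_000
mean = sum(tosses_until_hh() for _ in range(trials)) / trials
print(mean)  # close to 6
```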

Problem 4

  1. A red die is rolled a single time. A green die is rolled repeatedly. The game stops at the first roll where the sum of the two dice is either 4 or 7. What is the probability that the game stops with a sum of 4?

  2. Suppose the craps dice are shaved on faces 1-6 such that the pmf of a single die is

    where . Find the win probability using the method described in class (see also KP, section 2.2).

HW 5 Due Wed Feb 27

Problem 1

  1. Let $X$ have a Poisson distribution with parameter $\lambda$. Suppose $\lambda$ itself is random, following an exponential density with parameter $\theta$.

    1. What is the marginal distribution of $X$?
    2. Determine the conditional density for $\lambda$ given $X = k$.
  2. Recall Polya’s Urn. An urn contains $a$ red balls and $b$ green balls to start with. A ball is drawn at random and it is returned to the urn along with another ball of the same color. Let $X_n$ be the fraction of red balls after the $n$th draw.

    1. Show that $X_n$ is a martingale.
    2. Simulate $X_n$ and draw graphs of its various realizations for different values of $a$ and $b$.
    3. Show, numerically, that $X_n$ converges to some random number $X_\infty$. Draw pictures of the distribution of $X_\infty$ for different initial data $a$ and $b$.
  3. Let be iid rvs for . Let and . Show that $X_n$ defines a martingale, and determine the limit of $X_n$ as $n \to \infty$.
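Part 2 of the urn question asks for a simulation; a minimal sketch (with assumed initial counts $a = 2$, $b = 3$) that also checks the martingale property numerically:

```python
import random

# Polya's urn: the fraction of red balls is a martingale, so the average of
# the final fraction over many runs should stay near X_0 = a / (a + b).
random.seed(42)
a, b = 2, 3              # assumed illustrative initial counts
steps, runs = 200, 5000

finals = []
for _ in range(runs):
    red, green = a, b
    for _ in range(steps):
        if random.random() < red / (red + green):
            red += 1     # drew red: add another red
        else:
            green += 1   # drew green: add another green
    finals.append(red / (red + green))

print(sum(finals) / runs, a / (a + b))  # both close to 0.4
```

Plotting a histogram of `finals` for various `a`, `b` would show the limiting distribution requested in part 3.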

Problem 2

  1. Suppose $X$, $Y$ and $Z$ are discrete random variables with a joint pmf . Show that

    Hint: Start with the definition where

  2. If is a martingale, compute in terms of . Hint: Start with

Problem 3

  1. A Markov chain on states has the transition probability matrix

    and initial distribution

    Determine .

  2. A Markov chain $X_0, X_1, X_2, \ldots$ has the transition probability matrix

Determine the conditional probabilities

HW 4 Due Fri Feb 20

Problem 1

  1. Suppose $N$ is a geometric random variable taking values in $\{0,1,\ldots\}$ with parameter $p$. (By inspecting the range of $N$, we see that $N$ is in fact, a shifted geometric random variable that represents the number of failures before the first success.) Let

    where $\{\xi_i\}$ are iid random variables independent of $N$, and $X = 0$ if $N = 0$. Find the cdf of $X$. Hint: be mindful of the $N=0$ case. Assume that $\xi_i$ are iid $\Exponential(1)$. Simplify your answers as much as possible.

    Solution:

    Let $f$ be the density function of $\xi_{i}$. Then

    We know that a sum of independent exponentials produces a Gamma distribution (see page 30, section 1.4.4 in Karlin and Pinsky 4th edition). Then, the density of

    is

    Let us find the cdf of $X$.

    Now we have to divide the cases up into $t \geq 0$ and $t < 0$ since when $N = 0$, we have $X = 0$.

    Thus, the only interesting case is when $t \geq 0$; we have

    Putting this together, we get

  2. Suppose that three contestants on a quiz show are each given the same question and that each answers it correctly, independently of the others with probability $p$. But the difficulty of the question is itself a random variable. So suppose that $p$ is $\Uniform(0,1)$. What is the probability that exactly two of the contestants answer correctly?

    Solution:

    Let $X$ be the number of contestants answering question correctly.

  3. Let $X_0,X_1,\ldots$ be iid nonnegative random variables having a pdf. Let $N$ be the first index $k$ for which $X_k > X_0$. $N = 1$ if $X_1 > X_0$, $N=2$ if $X_1 \leq X_0, X_2 > X_0$ and so on. Find the pmf for $N$ and the mean $\E[N]$. (Interpretation: $X_0$, $X_1$ are bids on the car you are trying to sell. $N$ is the index of the first bid that is better than the initial bid.)

    Solution:

    Let $f$ and $F$ be pdf and cdf of $X_{i}$ resp.

    Before we tackle the general $k \geq 2$ case, we define the tail of $X_0$ as $T(t) = 1 - F(t)$, and hence $dT = T'(t)dt = -f(t)dt$.

    where we have used the last line to define the integral $I_k$. Now we evaluate the integral $I_k$ for $k \geq 2$ using integration by parts (when in doubt…).

    and therefore moving the $(k-1)T_k$ to the LHS of the above equation

    where we have used $I_1 = 1$. Thus

    Then,
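For part 3, a simulation sketch: the comparison values come from $P(N > k) = P(X_0 \text{ is the max of } X_0,\ldots,X_k) = \frac{1}{k+1}$, so $P(N = k) = \frac{1}{k(k+1)}$ and $\E[N] = \infty$; that closed form is my own derivation. Exp(1) bids are assumed, matching part 1's convention.

```python
import random

# Index of the first bid better than the initial bid X_0, using Exp(1) bids.
random.seed(7)

def first_better_index():
    x0 = random.expovariate(1.0)
    k = 1
    while random.expovariate(1.0) <= x0:   # bid k failed to beat X_0
        k += 1
    return k

trials = 100_000
sample = [first_better_index() for _ in range(trials)]
p1 = sum(1 for n in sample if n == 1) / trials
p2 = sum(1 for n in sample if n == 2) / trials
print(p1, p2)  # close to 1/2 and 1/6
```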

Problem 2

  1. Suppose $X_1,X_2,X_3$ are discrete random variables. Show that

    Solution:

    We could also do this the following way for discrete $X_1,X_2$ and $X_3$. The conditional expectation is a FUNCTION of the values of $X_2,X_3$. So

    using the law of total probability in the last equation.

Problem 3

Conditional dependence and independence.

  1. Draw uniformly at random from and then draw two independent samples and with distribution. Compute the (unconditional) correlation coefficient of and .

  2. Let have a Poisson distribution with parameter . Suppose that, conditioned on , we draw a random variable . Let . Show that and have Poisson distributions with parameters and and that and are independent.

  3. Let and be jointly distributed random variables whose joint probability mass function is given in the following table:

Show that the covariance between and is zero even though and are not independent.

HW 3 Due Fri Feb 13

Problem 1: Conditional probability redux.

Let $A_0, A_1, \ldots, A_n$ be events of non-zero probability.

Show that

Solution: The equation is true when $n=0$. Suppose it is true for $n\leq m$. Then for $n=m+1$,

Problem 2

  1. Suppose $U$ has uniform distribution in $[0,1]$. Derive the density function for the random variables

    1. $Y = \log(1-U)$
    2. $W_n = U^n$ for $n \geq 1$

    Refer to section 1.2.6 in your textbook.

    Solution: 1. As $U\sim \Uniform[0,1]$, the range of $Y$ is $(-\infty,0]$. Given $t\leq 0$,

    So $f_{Y}(t)=e^{t}$ for $t\in(-\infty,0]$ and zero elsewhere.

    2. The range of $W_{n}$ is $[0,1]$. Given $t\in[0,1]$,
    \begin{equation} \begin{split} P(W_{n}\leq t) &= P(U^{n}\leq t) \\ &= P(U\leq t^{1/n}) \\ &= t^{\frac{1}{n}} \end{split} \end{equation}
    So $f_{W_{n}}(t)=\frac{1}{n}t^{\frac{1}{n}-1}$ for $t\in[0,1]$ and zero elsewhere.

  2. If $X \sim \Exponential(2)$ and $Y \sim \Exponential(3)$ are independent, find $\Prob(X < Y)$.

    Solution:

Problem 3

  1. A quarter is tossed repeatedly until a head appears. Let $N$ be the trial number when this head appears. Then, a dollar is tossed $N$ times. Let $X$ count the number of times the dollar comes up tails. Determine $P(X = 0)$, $P(X = 1)$ and $\E[X]$.

    Solution:

    Let $S=P(X=1)$, then

    So $S=P(X=1)=\frac{4}{9}$. The way to see this in general is by defining

    and then substituting $p=1/4$.

    since $N$ is geometric taking values in $\{1,2,\ldots\}$ and Wikipedia says $E[N] = 1/p$.

    In the above we used that if $Z \sim \Binomial(n,p)$ then

    with $p = 1/2$.

  2. Let $N$ have a Poisson distribution with parameter $\lambda = 1$. Conditioned on $N = n$, let $X$ have a uniform distribution over the integers $0,1,\ldots,n+1$. What is the marginal distribution of $X$?

    Solution
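Returning to part 1 of Problem 3, a quick simulation can confirm $P(X=1)=\frac{4}{9}$ from the solution above, along with $\E[X] = 1$ (my computation, via $\E[X] = \E[N]/2$):

```python
import random

# N = trial of the first head of a fair quarter (geometric on {1,2,...}),
# then a dollar is tossed N times and X counts its tails.
random.seed(3)

def draw_x():
    n = 1
    while random.random() < 0.5:   # quarter came up tails -> keep tossing
        n += 1
    return sum(1 for _ in range(n) if random.random() < 0.5)  # dollar tails

trials = 200_000
xs = [draw_x() for _ in range(trials)]
p1 = sum(1 for x in xs if x == 1) / trials
print(p1, sum(xs) / trials)  # near 4/9 and 1
```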

Problem 4

Do men have more sisters than women have? In a certain society, all married couples use the following strategy to determine the number of children that they will have: If the first child is a girl, they have no more children. If the first child is a boy, they have a second child. If the second child is a girl, they have no more children. If the second child is a boy, they have exactly one additional child. (We ignore twins, assume the sexes are equally likely, and that the sexes of distinct children are independent random variables, etc.)

  1. What is the probability distribution for the number of children in a family?
  2. What is the probability distribution for the number of girl children in a family?
  3. A male child is chosen at random from all of the male children in the population. What is the probability distribution for the number of sisters of this child?
  4. What is the probability distribution for the number of brothers of the previously chosen male child?

Solution

The probabilities for the four family types (G, BG, BBG, BBB) are (1) $\frac{1}{2}$, (2) $\frac{1}{4}$, (3) $\frac{1}{8}$, and (4) $\frac{1}{8}$. We can use this to answer all the questions.

  1. Let be the number of kids.

  2. Let $G$ be the number of girls

  3. In the total population, type 2 families contribute 1 boy, type 3 contribute 2 and type 4 contribute 3. So if you choose a male child at random, the probabilities that he is from a type 2, type 3 or type 4 family are respectively

    where we have weighted each family by the number of boys they have produced.

    Maybe this doesn’t convince the more rigorously minded (as it shouldn’t). Think of the next generation as being determined by families that are currently reproducing. Each child they produce is labeled with their gender and their type. Let be the number of individuals of type being produced by family . By the law of large numbers

    If is the number of boys being produced by family $i$, then

    The fraction of boys in the population satisfies

    Then, for example, if you choose a boy at random, the probability that the boy comes from a type 2 family is

    as we obtained above.

    Let be the number of sisters, and clearly . Then

    and

  4. Let be the number of brothers, so depending on whether it’s a type (2), (3), or (4) family. Similar to what we did for part 3,

  5. As a bonus question, choose a random individual from the population. What is the probability that they are male?

    Let be the random child selected from the population. We want to find the probability that this child is a boy. There are a few ways to understand it.

    a. In each family, we’ve assumed that both sexes are equally likely. . So each time a child is added to the population, there is an equal chance it is a boy or a girl. So the proportion of the number of boys or the number of girls must be half, and hence .

    b. Think of the next generation as being determined by $M$ families that are currently reproducing. Let $B_i$ be the number of boys family $i$ produces and let $G_i$ be the number of girls. Then $B_i + G_i$ has the same distribution as $N$ listed above. The total number of boys produced by the $M$ families satisfies

    And similarly,

    Therefore, the proportion of boys, as is

    Interesting, huh?
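The part 3 answer can be checked by simulation; by my reading of the weights $\frac{2}{7}, \frac{2}{7}, \frac{3}{7}$ above, a random boy has one sister with probability $\frac{4}{7}$ and none with probability $\frac{3}{7}$.

```python
import random

# Generate families by the stopping rule G / BG / BBG / BBB; choosing a boy
# uniformly from the population weights each family by its number of boys.
random.seed(11)

boys_with_sister, boys_total = 0, 0
for _ in range(100_000):
    kids = []
    while len(kids) < 3:                # at most three children
        child = 'B' if random.random() < 0.5 else 'G'
        kids.append(child)
        if child == 'G':                # stop at the first girl
            break
    boys = kids.count('B')
    boys_total += boys
    if 'G' in kids:                     # every boy here has exactly one sister
        boys_with_sister += boys

print(boys_with_sister / boys_total)    # near 4/7
```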

HW 2 Due Fri Feb 6

  1. Let $N$ cards carry the distinct numbers $x_1,\ldots,x_N$. If two cards are drawn at random without replacement, show that the correlation coefficient $\rho$ between the numbers appearing on the two cards is $-1/(N-1)$.

    Solution

    Let $X,Y$ be the number on the first and second cards respectively.

    So we know $E[X]=E[Y]$ and $E[X^{2}]=E[Y^{2}]$. Let

    Then

    Since $\rho$ is the correlation coefficient of $X,Y$, the formula for $\rho$ is

    Also we know

    There is only $\E[XY]$ left to compute.

    Substituting these formulas back into $\rho$, we have

  2. Let $X$ and $Y$ be independent random variables having distributions $F_X$ and $F_Y$ respectively.

    • Let $Z = \max(X,Y)$. Express $F_Z(t)$ in terms of $F_X$ and $F_Y$.
    • Let $W = \min(X,Y)$. Express $F_W(t)$ in terms of $F_X$ and $F_Y$.

    Solution

    Since $X,Y$ are independent,

    since if $\max(X,Y) \leq t$ then both $X \leq t$ and $Y \leq t$.

  3. Let $U$ be Poisson distributed with parameter $\lambda$ and let $V = 1/(1+U)$. Find the expected value of $V$.

    Solution

  4. Let $U,V,W$ be independent random variables with equal variances $\sigma^2$. Let $X = U + V$ and let $Y = V - W$. Find the covariance of $X$ and $Y$.

    Solution

    As $U,V,W$ are independent,

    Another way to do this is to write
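Circling back to the first exercise of this set, the $-1/(N-1)$ correlation can be confirmed by brute-force enumeration; the card values below are arbitrary assumed numbers.

```python
from itertools import permutations
from statistics import mean

# Ordered draws without replacement are equally likely, so exact moments are
# averages over all ordered pairs of distinct cards.
xs = [3, 7, 8, 14, 20]             # N = 5 arbitrary distinct values
pairs = list(permutations(xs, 2))

ex = mean(a for a, b in pairs)
ey = mean(b for a, b in pairs)
exy = mean(a * b for a, b in pairs)
ex2 = mean(a * a for a, b in pairs)

cov = exy - ex * ey
var = ex2 - ex * ex                # Var(X) = Var(Y) by symmetry
rho = cov / var
print(rho)  # -1/(5-1) = -0.25
```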

HW 1 Due Fri Jan 30

Homework will be a little long this week, since it is mostly review.

  1. Which topics about stochastic processes most interest you? (E.g. Brownian motion, stock market modeling, statistical physics, etc.)

  2. Given that you end up working as hard as you think you can work this semester, what grade do you think you can achieve in this class? How important is the letter grade to you?

  3. Would you prefer a challenging class at a good pace so you can learn a lot, or do you prefer an easy-paced class so you can learn the material well?

  4. Of the three general principles for modeling probabilities stated in K&P chapter 1, in your opinion, which one applies best to the modeling of share prices in the stock market?

  5. Plot the distribution function

    • Determine the corresponding density function $f(x)$ in the three regions.
    • What is the mean of the distribution?
    • If $X$ is a random variable with distribution $F$, then evaluate $\mathbb{P}(1/4 \leq X \leq 3/4)$.

    Solution: The density function is

  6. Determine the distribution function, mean and variance corresponding to the triangular density

    Solution

  7. Let $1_{A}$ be the indicator random variable associated with an event $A$, defined to be one if $A$ occurs, and zero otherwise. Show

    • $1_{A^c} = 1 - 1_{A}$
    • $1_{A \cap B} = 1_{A} 1_{B} = \min( 1_{A}, 1_{B} )$
    • $1_{A \cup B} = \max( 1_{A}, 1_{B} )$.

    Solution:

    If $\omega \in A^{c}$, then

    If $\omega \in A$, then

    So $1_{A^{c}}=1-1_{A}$

    If $\omega\in A\cap B$, then

    If $\omega \notin A\cap B$, then $\omega$ is in $X-(A\cup B)$, $A-B$ or $B-A$; in each case we have

    So

    Note that $\min(1_{A},1_{B})$ has only two possible values, $\{0,1\}$.

    So

    Note that $\max(1_{A},1_{B})$ has only two possible values, $\{0,1\}$.

  8. A pair of dice is tossed. If the two outcomes are equal, the dice are tossed again, and the process is repeated. If the outcomes are unequal, their sum is recorded. Determine the probability mass function for the sum.

    Solution:

    There are only 30 possible outcomes. Among these, 2 outcomes each have sum 3, 4, 10, or 11; 4 outcomes each have sum 5, 6, 8, or 9; and 6 outcomes have sum 7.
    Define $X$ to be the sum of the two dice. Its range is $\{3,4,5,6,7,8,9,10,11\}$, and we have the following probability mass function:
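The count of 30 outcomes and the resulting pmf can be verified by direct enumeration:

```python
from fractions import Fraction

# Enumerate the unequal-outcome rolls and tally the recorded sums.
counts = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        if d1 != d2:
            counts[d1 + d2] = counts.get(d1 + d2, 0) + 1

total = sum(counts.values())   # 30 outcomes
pmf = {s: Fraction(counts[s], total) for s in sorted(counts)}
print(pmf)
```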