\Question
\textbf{Projection Property}
Use the Projection Property to answer the following questions.
\begin{enumerate}
\item Prove or disprove: for any function $\phi$, $E[(Y - E[Y \mid X]) \phi(X)] = 0$. (A simulation sketch after this list provides a numerical check.)
\item Prove or disprove: $E[(Y - E[Y \mid X]) L[Y \mid X]] = 0$.
\item Prove that the constant $c$ which minimizes $E[(X - c)^2]$ is $c = E[X]$. Use the fact that $E[X - E[X]] = 0$. (\textit{Note}: Although it is possible to directly minimize $E[(X - c)^2]$ by differentiating, we would like you to try to emulate the proofs that the LLSE/MMSE minimize the mean-squared error.)
\item Prove the following: $E[X^2 \mid Y] = E[(X - E[X \mid Y])^2 \mid Y] + (E[X \mid Y])^2$. (\textit{Hint}: In the expression $E[X^2 \mid Y]$, try replacing $X$ with $X - E[X \mid Y] + E[X \mid Y]$.)
\item Use the result above to compute $E[X^2]$. (Use the law of iterated expectation.)
\item We have already shown that $E[E[Y \mid X]] = E[Y]$. Prove that $E[L[Y \mid X]] = E[Y]$.
\end{enumerate}
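As a numerical sanity check on part (a), here is a minimal Monte Carlo sketch in Python. The model ($X$ standard normal and $Y = X^2 + Z$ with independent standard normal noise $Z$, so that $E[Y \mid X] = X^2$ in closed form) and the choice of $\phi$ are arbitrary assumptions made purely for illustration.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Toy model (an assumption for illustration only): X ~ N(0, 1) and
# Y = X^2 + Z with Z ~ N(0, 1) independent of X, so E[Y | X] = X^2.
x = rng.standard_normal(n)
y = x**2 + rng.standard_normal(n)
mmse = x**2                      # E[Y | X] in closed form

phi = np.sin(x) + x**3           # an arbitrary function of X

# Part (a): E[(Y - E[Y | X]) phi(X)] should be approximately 0.
print(np.mean((y - mmse) * phi))

# Part (d) after taking expectations of both sides (stated here with
# the roles of X and Y swapped, so the conditional expectation is
# known): E[Y^2] = E[(Y - E[Y | X])^2] + E[(E[Y | X])^2].
print(np.mean(y**2))
print(np.mean((y - mmse)**2) + np.mean(mmse**2))
\end{verbatim}
The first printed value should be near zero for any choice of $\phi$, and the last two should agree up to Monte Carlo error.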
\Question
\textbf{Quadratic Regression}
In this question, we will find the best quadratic estimator of $Y$ given $X$. First, some notation: let $\mu_i$ be the $i$th moment of $X$, i.e.\ $\mu_i = E[X^i]$. Also, define $\beta_1 = E[XY]$ and $\beta_2 = E[X^2 Y]$. For simplicity, we will assume that $E[X] = E[Y] = 0$ and $E[X^2] = E[Y^2] = 1$. (Note that this entails no loss of generality, because we can always transform the random variables by subtracting their means and dividing by their standard deviations.) We claim that the best quadratic estimator of $Y$ given $X$ is
\[
\hat{Y} = \frac{1}{\mu_3^2 - \mu_4 + 1} (a X^2 + b X + c)
\]
where
\begin{align*}
a &= \mu_3 \beta_1 - \beta_2 \\
b &= (1 - \mu_4) \beta_1 + \mu_3 \beta_2 \\
c &= -\mu_3 \beta_1 + \beta_2
\end{align*}
Your task is to prove the Projection Property for $\hat{Y}$. (A numerical sanity check of the claimed coefficients appears after the list below.)
\begin{enumerate}
\item Prove that $E[Y - \hat{Y}] = 0$.
\item Prove that $E[(Y - \hat{Y})X] = 0$.
\item Prove that $E[(Y - \hat{Y})X^2] = 0$.
\end{enumerate}
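As a sanity check on the claimed coefficients, here is a short Python sketch. It standardizes simulated data empirically, so that $E[X] = E[Y] = 0$ and $E[X^2] = E[Y^2] = 1$ hold exactly under the empirical distribution; the toy model generating the data is an arbitrary assumption.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n = 10**5

# Any joint distribution works; this choice is purely illustrative.
x = rng.standard_normal(n)
y = x**3 + rng.standard_normal(n)

# Standardize so that, under the empirical distribution,
# E[X] = E[Y] = 0 and E[X^2] = E[Y^2] = 1, as the problem assumes.
x = (x - x.mean()) / x.std()
y = (y - y.mean()) / y.std()

mu3, mu4 = np.mean(x**3), np.mean(x**4)
beta1, beta2 = np.mean(x * y), np.mean(x**2 * y)

a = mu3 * beta1 - beta2
b = (1 - mu4) * beta1 + mu3 * beta2
c = -mu3 * beta1 + beta2
yhat = (a * x**2 + b * x + c) / (mu3**2 - mu4 + 1)

# Parts (a)-(c): all three should vanish up to floating-point error,
# because the claimed coefficients solve these equations exactly for
# any distribution satisfying the moment normalizations.
print(np.mean(y - yhat))
print(np.mean((y - yhat) * x))
print(np.mean((y - yhat) * x**2))
\end{verbatim}
Because the coefficients are exact consequences of the moment conditions, the check succeeds at machine precision rather than only up to Monte Carlo error.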
\Question
\textbf{Balls in Bins Estimation}
We throw $n > 0$ balls into $m \geq 2$ bins. Let $X$ and $Y$ denote the numbers of balls that land in bins $1$ and $2$, respectively.
\begin{enumerate}
\item Calculate $E[Y \mid X]$. (\textit{Hint}: Your intuition may be more useful than formal calculations.)
\item What are $L[Y \mid X]$ and $Q[Y \mid X]$ (where $Q[Y \mid X]$ is the best quadratic estimator of $Y$ given $X$)? (\textit{Hint}: Your justification should be no more than two or three sentences, no calculations necessary! Think carefully about the meaning of the MMSE.)
\item Unfortunately, your friend is not convinced by your answer to the previous part. Compute $E[X]$ and $E[Y]$.
\item Compute $\operatorname{var}(X)$.
\item Compute $\operatorname{cov}(X, Y)$.
\item Compute $L[Y \mid X]$ using the formula, and verify that it matches your answer to part (b). (The simulation sketch after this list offers an additional check.)
\end{enumerate}
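As referenced in part (f), here is a small Python simulation that estimates $E[Y \mid X = k]$ empirically; the parameters $n$, $m$, and the number of trials are arbitrary choices for illustration.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)
n, m, trials = 10, 4, 10**5    # arbitrary example parameters

# Throw n balls into m bins, uniformly and independently at random.
bins = rng.integers(0, m, size=(trials, n))
X = np.sum(bins == 0, axis=1)  # number of balls in bin 1
Y = np.sum(bins == 1, axis=1)  # number of balls in bin 2

# Empirical estimate of E[Y | X = k] for each observed value of k.
for k in range(n + 1):
    mask = (X == k)
    if mask.any():
        print(k, Y[mask].mean())
\end{verbatim}
Comparing the printed values against your formulas from parts (a) and (f) gives an independent check.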
\Question
\textbf{Iterated Expectation}
In this question, we will build more familiarity with the law of iterated expectation.
\begin{enumerate}
\item Which of the following are valid conditional expectations? (No justification necessary.)
\begin{enumerate}
\item $E[X \mid Y] = 1$
\item $E[X \mid Y] = X$
\item $E[X \mid Y] = Y$
\item $E[X \mid Y] = Y/\cos(Y)$
\item $E[X \mid Y] = XY$
\end{enumerate}
\item You lost your phone charger! It will take $D$ days for the new phone charger you ordered to arrive at your house (here, $D$ is a random variable). Suppose that on day $i$, the amount of battery you lose is $B_i$, where $E[B_i] = \beta$; assume that the $B_i$ are independent of $D$. Let $B = \sum_{i=1}^{D} B_i$ be the total amount of battery drained between now and when your new phone charger arrives. Apply the law of iterated expectation to show that $E[B] = \beta E[D]$. (Here, the law of iterated expectation has a very clear interpretation: the amount of battery you expect to drain is the average number of days it takes for your phone charger to arrive, multiplied by the average amount of battery drained per day.) A simulation sketch after this list illustrates this identity.
\item Consider now the setting of independent Bernoulli trials, each with probability of success $p$. Let $S_i$ be the number of successes in the first $i$ trials. Compute $E[S_m \mid S_n]$. (You will need to consider three cases based on whether $m > n$, $m = n$, or $m < n$. Try using your intuition rather than proceeding by calculations.)
\end{enumerate}
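The identity in part (b) is easy to check numerically. The sketch below assumes a toy model: $D$ geometric and each $B_i$ exponential with mean $\beta$, drawn independently of $D$; these distributional choices are arbitrary.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)
trials = 10**5

# Toy model (an arbitrary assumption): D ~ Geometric(p), and each
# B_i ~ Exponential with mean beta, independent of D.
p, beta = 0.25, 12.0
D = rng.geometric(p, size=trials)
B = np.array([rng.exponential(beta, size=d).sum() for d in D])

print(B.mean())           # empirical E[B]
print(beta * D.mean())    # beta times empirical E[D]; should agree
\end{verbatim}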
\Question
\textbf{Gambling Woes}
Forest proposes a gambling game to you (uh oh!). Every day, you flip two independent fair coins. If both coins come up heads, then your fortune triples on that day. If one coin comes up heads and the other comes up tails, then your fortune is cut in half. If both coins come up tails, then the game is over: you lose all of your money! Forest claims that you can get rich quickly with this scheme, but you decide to calculate some probabilities first.
\begin{enumerate}
\item Let $M_0$ denote your money at the start of the game, and let $M_n$ denote the amount of money you have at the end of the $n$th day. Compute $E[M_{n+1} \mid M_n]$.
\item Use the law of iterated expectation to calculate $E[M_{n+1}]$ in terms of $E[M_n]$. Solve your recurrence to obtain an expression for $E[M_{n+1}]$. Do you think this is a fair game?
\item Calculate $P(M_n > 0)$. What is the behavior as $n \to \infty$? Would you still play this game? (The simulation sketch after this list lets you check your answers to parts (b) and (c).)
\end{enumerate}
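Here is a Python sketch of the game; the horizon and the starting fortune are arbitrary choices. For a small number of days the empirical mean should be close to your answer in part (b); for large horizons the estimate of $E[M_n]$ becomes very noisy, because the surviving fortunes are heavy-tailed.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(4)
trials, days, m0 = 10**6, 5, 1.0   # arbitrary example parameters

money = np.full(trials, m0)
for _ in range(days):
    heads = rng.integers(0, 2, size=(trials, 2)).sum(axis=1)
    # Two heads: triple; one head: halve; no heads: bust.
    money = np.where(heads == 2, 3 * money,
                     np.where(heads == 1, money / 2, 0.0))

print(money.mean())        # compare with part (b)
print((money > 0).mean())  # compare with part (c)
\end{verbatim}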