HW 7 Solutions

STA 211 Spring 2023 (Jiang)

Exercise 1

Consider again the estimator of the population variance for an \(N(\mu, \sigma^2)\) distribution given by \(\tilde\sigma^2 = \frac{1}{n}\sum_{i = 1}^n(X_i - \bar{X})^2\). Demonstrate that it is a consistent estimator.

You may use the fact that if you have an i.i.d. sample from a \(N(\mu, \sigma^2)\) distribution, then the variance of the sample variance, \(Var(s^2) = \frac{2\sigma^4}{n - 1}.\) For this problem you may assume \(\sigma^4 < \infty\).

From the homework last week, we have the bias and variance

\[\begin{align*} E(\tilde{\sigma}^2 - \sigma^2) &= -\frac{\sigma^2}{n}\\ Var(\tilde{\sigma}^2) &= \frac{(n - 1) 2\sigma^4}{n^2} \end{align*}\]

From class, we know that an asymptotically unbiased estimator whose variance goes to zero as \(n \to \infty\) is consistent. Since both the bias and the variance above go to zero as \(n \to \infty\), \(\tilde{\sigma}^2 \to_p \sigma^2\).
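As a sanity check (not part of the proof), a quick NumPy simulation shows \(\tilde\sigma^2\) settling near \(\sigma^2\) as \(n\) grows; the values of \(\mu = 2\) and \(\sigma^2 = 4\) below are arbitrary choices for illustration.

```python
import numpy as np

# Simulation check: draw N(mu, sigma^2) samples and watch
# sigma_tilde^2 = (1/n) * sum((X_i - Xbar)^2) approach sigma^2.
rng = np.random.default_rng(0)
mu, sigma2 = 2.0, 4.0  # arbitrary example values

for n in (10, 1_000, 100_000):
    x = rng.normal(mu, np.sqrt(sigma2), size=n)
    sigma_tilde2 = np.mean((x - x.mean()) ** 2)
    print(n, sigma_tilde2)

# sigma_tilde2 now holds the n = 100_000 estimate, which should sit
# close to sigma2 = 4 (its standard error is about sqrt(2*16/n) ~ 0.018).
```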

Exercise 2

Let \(X\) be 1 with probability 0.5 and 0 with probability 0.5. Let \(X_n = X\) and \(Y = 1 - X\). Show that \(X_n \to_d Y\) but that \(X_n \nrightarrow_p Y\). Hint: consider how “far apart” \(X_n\) and \(Y\) are, then use the definition of convergence in probability for an appropriate \(\epsilon\).

\(X\) and \(Y\) have precisely the same distribution: Bernoulli with probability 0.5. Since \(F_{X_n} = F_X = F_Y\), \(X_n \to_d Y\). However, note that \(|X_n - Y| = 1\) for all \(n\), and so \(P(|X_n - Y| > \epsilon) = 1\) for all \(0 < \epsilon < 1\), and so \(X_n \nrightarrow_p Y\).
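The two claims can also be seen empirically: in the simulation sketch below, both \(X_n\) and \(Y\) have empirical means near 0.5 (same distribution), yet every draw has \(|X_n - Y| = 1\) exactly.

```python
import numpy as np

# Simulation check: X_n = X and Y = 1 - X share the Bernoulli(0.5)
# distribution, but |X_n - Y| = 1 on every single draw.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=10_000)  # X_n = X, each value 0 or 1
y = 1 - x                            # Y = 1 - X

gap = np.abs(x - y)                  # always exactly 1
print(x.mean(), y.mean())            # both near 0.5: same distribution
print(gap.min(), gap.max())          # both 1: no convergence in probability
```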

Exercise 3

Suppose \(X_1, \cdots, X_n\) are an i.i.d. sample from a distribution with density \(f_X(x) = \frac{\lambda x + 1}{2}\) for \(x \in (-1, 1)\) and \(\lambda \in (-1, 1)\). Consider the estimator \(3\bar{X}\). Is it a biased estimator for \(\lambda\)? Is it a consistent estimator for \(\lambda\)? Show your work and explain.

\[\begin{align*} E(X_i) &= \int_{-1}^1 x\frac{\lambda x + 1}{2}\, dx\\ &= \frac{1}{2}\int_{-1}^1 (\lambda x^2 + x)\, dx\\ &= \frac{\lambda}{3}\\ E(3\bar{X}) &= \frac{3}{n}E\left(\sum_{i = 1}^n X_i\right)\\ &= \lambda \end{align*}\]

And hence the estimator \(3\bar{X}\) is unbiased for \(\lambda\).

\[\begin{align*} Var(X_i) &= \left(\int_{-1}^1 \frac{x^2(\lambda x + 1)}{2}\, dx \right) - \frac{\lambda^2}{9}\\ &= \frac{1}{3} - \frac{\lambda^2}{9}\\ &= \frac{3 - \lambda^2}{9} \end{align*}\]

Thus, the variance of the estimator is

\[\begin{align*} Var(3\bar{X}) &= \frac{9}{n^2}\left( n\frac{3 - \lambda^2}{9} \right)\\ &= \frac{3 - \lambda^2}{n} \end{align*}\]

Since \[\begin{align*} \lim_{n \to \infty} Var(3\bar{X}) = \lim_{n \to \infty} \frac{3 - \lambda^2}{n} = 0 \end{align*}\]

and \(3\bar{X}\) is an unbiased estimator for \(\lambda\), \(3\bar{X} \to_p \lambda\).
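This conclusion can also be checked by simulation. One way to sample from \(f_X\) (not required by the problem, and derived here as an assumption) is inverse-CDF sampling: the CDF is \(F(x) = \frac{\lambda x^2}{4} + \frac{x}{2} + \frac{1}{2} - \frac{\lambda}{4}\), and solving \(F(x) = u\) gives \(x = \frac{-1 + \sqrt{(1-\lambda)^2 + 4\lambda u}}{\lambda}\) for \(\lambda \neq 0\). The value \(\lambda = 0.5\) below is an arbitrary illustration.

```python
import numpy as np

# Simulation check: sample X_i from f(x) = (lam*x + 1)/2 on (-1, 1)
# via the inverse CDF x = (-1 + sqrt((1 - lam)^2 + 4*lam*u)) / lam
# (valid for lam != 0), then verify that 3 * Xbar lands near lam.
rng = np.random.default_rng(2)
lam = 0.5          # arbitrary true parameter in (-1, 1)
n = 200_000

u = rng.random(n)  # Uniform(0, 1) draws
x = (-1.0 + np.sqrt((1.0 - lam) ** 2 + 4.0 * lam * u)) / lam
est = 3.0 * x.mean()
print(est)         # should be close to lam; sd of 3*Xbar is sqrt((3 - lam^2)/n)
```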