Expectation: Sums, Correlations (Continued)
===========================================

Big picture:
------------------------------

Mixture:  Y = X1 with probability p, X2 with probability 1-p
            = Z X1 + (1-Z) X2, where Z is Bernoulli(p)
  Mean:     p mu1 + (1-p) mu2
  Variance: ?????  (the conditional-variance formula below gives
            p sigma1^2 + (1-p) sigma2^2 + p(1-p)(mu1 - mu2)^2)

Matching Problems: 4 fun problems:

1. Secretary Problem (approximation, big N):
   View N numbers X1, X2, ..., XN sequentially; try to pick the best one.
   One strategy: pick 0 < p < 1; observe (but never select) the first Np
   numbers and let M be their maximum; then select the first Xn with
   n > Np and Xn > M.  What's the probability of success?

   Need: best after position Np; prob = 1-p
   And:  2nd best before Np                             prob = (1-p)^1 p / 1
   or:   top 2 after Np (best of them first), 3rd best before  prob = (1-p)^2 p / 2
   or:   top 3 after Np (best of them first), 4th best before  prob = (1-p)^3 p / 3
   or:   top 4 after Np (best of them first), 5th best before  prob = (1-p)^4 p / 4
   or:   ...

   In sum:  P[Succ] = p sum_{k=1}^oo (1-p)^k / k    (set q = 1-p)
                    = p f(q), where f'(q) = sum_{k=0}^oo q^k = 1/(1-q) = 1/p
                      and f(0) = 0, so f(q) = -log(1-q) = -log(p)
                    = -p log(p)
   Maximize:  d/dp [-p log(p)] = -log(p) - 1 = 0
              ==> log p = -1 ==> p = 1/e,
              P[Succ] = -p log(p) = 1/e = 0.36788...   FOR ALL (large) N

2. 2-Envelope problem: Ex 68

3. Monty Hall problem: Goat, Goat, $$$$

4. Hats: Mean & Variance

----------------------------------

Conditional Expectation & Probability:
  E[X|Y]
  E[Sum of a Random Number of Random Variables]

Conditional Mean & Variance Formulas:
  E[X] = E[ E[X|Y] ]
  V[X] = E[ V[X|Y] ] + V[ E[X|Y] ],
         where V[X|Y] = E[ (X - E[X|Y])^2 | Y ] = E[X^2|Y] - E[X|Y]^2

An Expectation Shortcut:
  If P[X > 0] = 1, then
    E[X] = int_0^oo x f(x) dx
         = int_0^oo int_0^x f(x) dt dx
         = int_0^oo int_t^oo f(x) dx dt
         = int_0^oo [1 - F(t)] dt

A Tool: MGF
  M(t) = E[e^{tX}]:  M(0) = 1;  M'(0) = E[X];  M''(0) = E[X^2]
  -> E[X] = M'(0)             = (log M)'(0)
     V[X] = M''(0) - M'(0)^2  = (log M)''(0)

*****PREDICTION******

Minimize E[ |X-a|^2 ]:  easy to show the answer is a = E[X]; thus
E[ |X-a|^2 | Y ] is likewise minimized by a = E[X|Y].

Minimize E[ |X-a| ]:  a little harder but interesting:
  m(a) = int_{-oo}^a (a-x) f(x) dx + int_a^oo (x-a) f(x) dx
       = a( P[X<=a] - P[X>a] ) + ...
       = a( 2F(a) - 1 ) + ...
  m'(a) = (2F(a) - 1) + 2a f(a) - 2a f(a)
        = 2F(a) - 1 = 0  ==>  F(a) = 1/2, i.e. a is a median;
  also, of course, E[ |X-a| | Y ] is minimized by the conditional median.

----------------------------------

Chapter 7:
1. Introduction
2. Expectation of Sums of Random Variables
3. Covariance, Variance of Sums, and Correlations
4. Conditional Expectation
   .1 Definitions
   .2 Computing Expectations by Conditioning
   .3 Computing Probabilities by Conditioning
5. Conditional Expectation and Prediction
6. Moment Generating Functions
7. Additional Properties of Normal Random Variables
   .1 The Multivariate Normal Distribution
   .2 The Joint Distribution of the Sample Mean and Sample Variance
8. General Definition of Expectation
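As a numerical check on the secretary-problem answer above, here is a small Monte Carlo sketch of the skip-the-first-Np strategy (the function names and parameters are mine, not from the notes):

```python
import math
import random

def secretary_trial(N, p, rng):
    """One run of the strategy: skip the first N*p values, remember their
    max M, then take the first later value exceeding M.  Returns True iff
    the value taken is the overall best."""
    xs = [rng.random() for _ in range(N)]      # distinct with probability 1
    k = int(p * N)
    M = max(xs[:k]) if k > 0 else float("-inf")
    for x in xs[k:]:
        if x > M:                              # first candidate beating M
            return x == max(xs)
    return False                               # the best was in the sample phase

def success_rate(N=200, p=1 / math.e, trials=20000, seed=0):
    rng = random.Random(seed)
    return sum(secretary_trial(N, p, rng) for _ in range(trials)) / trials
```

With p = 1/e the estimated success rate should sit near 1/e = 0.368 for large N; other cutoffs do noticeably worse.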
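The mixture variance asked about at the top follows from the conditional-variance formula: V[Y] = E[V[Y|Z]] + V[E[Y|Z]] = p sigma1^2 + (1-p) sigma2^2 + p(1-p)(mu1 - mu2)^2. A short sketch (my own helper names; it assumes normal components purely for concreteness) computes this closed form and can be checked against simulation:

```python
import random

def mixture_moments(p, mu1, s1, mu2, s2):
    """Mean and variance of Y = Z X1 + (1-Z) X2, Z ~ Bernoulli(p), via
    E[Y] = E[E[Y|Z]] and V[Y] = E[V[Y|Z]] + V[E[Y|Z]]."""
    mean = p * mu1 + (1 - p) * mu2
    var = p * s1**2 + (1 - p) * s2**2 + p * (1 - p) * (mu1 - mu2)**2
    return mean, var

def sample_mixture(n, p, mu1, s1, mu2, s2, seed=0):
    """Simulate the mixture, taking X1 and X2 normal for concreteness."""
    rng = random.Random(seed)
    return [rng.gauss(mu1, s1) if rng.random() < p else rng.gauss(mu2, s2)
            for _ in range(n)]
```

Note the extra term p(1-p)(mu1 - mu2)^2: the mixture variance exceeds the average of the component variances whenever the component means differ.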