Expectation: Sums, Correlations (Continued)
==========

Big picture:
------------------------------
Mixture:   Y = X1 with probability p, X2 with probability 1-p
             = Z X1 + (1-Z) X2,  where Z is Bernoulli(p)
Mean:      p mu1 + (1-p) mu2
Variance:  ?????

Matching Problems:
------------------------------
3 fun problems:

1. Secretary Problem (approximation, big N):
   View N numbers X1, X2, ..., XN sequentially; try to pick the best one.
   One strategy: pick 0 < p < 1; watch the first k = Np values and let M be
   their maximum; then select the first Xn with n > Np and Xn > M.
   What's the probability of success?

   Need:  best after k;          prob = 1-p
   And:   2nd best before k;     prob = (1-p)^1 p/1
   or:    3rd best before k;     prob = (1-p)^2 p/2
   or:    4th best before k;     prob = (1-p)^3 p/3
   or:    5th best before k;     prob = (1-p)^4 p/4
   or:    ...

   In sum:  p sum_{k=1}^oo (1-p)^k / k      (set q = 1-p)
          = p f(q),  where f'(q) = sum_{k=0}^oo q^k = 1/(1-q) = 1/p  and  f(0) = 0,
   so f(q) = -log(1-q) = -log(p), and Prob[Succ] = -p log(p).
   d/dp:  -1 - log(p) = 0  -->  log p = -1  ==>  p = 1/e,
   Prob[Succ] = -p log(p) = 1/e = 0.36788...   FOR ALL N

2. 2-Envelope problem: Ex 68
   Choose at random from an envelope with $X and one with $2X; open it to
   find $Y.  Should you accept an option to trade for the other envelope?
   Note the paradox: since the other envelope is equally likely to have
   $Y/2 or $2Y, it's tempting to think the expected value if you trade
   would be (1/2)(Y/2) + (1/2)(2Y) = (5/4)Y, a gain of 25% on average.
   BUT... are these really equally likely after you observe Y???
   The homework problem treats X as random; another option is to pick a
   threshold Z from a continuous positive distribution and trade if Y < Z.

A useful identity:  if P[ X > 0 ] = 1, then

   E[X]   = int_0^oo x f(x) dx
          = int_0^oo int_0^x f(x) dt dx
          = int_0^oo int_t^oo f(x) dx dt
          = int_0^oo [1 - F(t)] dt

Also, for any p > 0,

   E[X^p] = int_0^oo x^p f(x) dx
          = int_0^oo int_0^x p t^(p-1) f(x) dt dx
          = ...
          = int_0^oo p t^(p-1) [1 - F(t)] dt

A Tool: MGF
------------------------------
   M(t) = E[e^(tX)]:   M(0) = 1;   M'(0) = E[X];   M''(0) = E[X^2]

   ->  E[X] = M'(0)                = (log M(t))'   at t = 0
       V[X] = M''(0) - (M'(0))^2   = (log M(t))''  at t = 0

***** PREDICTION *****
------------------------------
Minimize E[ |X-a|^2 ]:  easy to show the answer is a = E[X]; thus also
E[ |X-a|^2 | Y ] is minimized by a = E[ X | Y ].
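The 1/e law for the cutoff rule can be checked by simulation; a minimal sketch (the function name, sample sizes, and trial counts are illustrative, not from the notes):

```python
import math
import random

def secretary_success_rate(n, p, trials=20000):
    """Estimate the cutoff rule's success probability: skip the first
    k = round(n*p) values, remember their maximum M, then accept the
    first later value exceeding M (fail if none does)."""
    k = round(n * p)
    wins = 0
    for _ in range(trials):
        xs = [random.random() for _ in range(n)]
        best = max(xs)
        m = max(xs[:k]) if k > 0 else float("-inf")
        pick = next((x for x in xs[k:] if x > m), None)
        wins += (pick == best)
    return wins / trials

# With p = 1/e the estimate should sit near -p log p = 1/e ~ 0.368,
# essentially independent of n, as the derivation above predicts.
random.seed(0)
print(secretary_success_rate(100, 1 / math.e))
```

Trying other cutoffs (p = 0.1, p = 0.5, ...) shows the rate tracking -p log(p), with the maximum at p = 1/e.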
Minimize E[ |X-a| ]... a little harder but interesting:

   m(a) = int_{-oo}^a (a-x) f(x) dx + int_a^oo (x-a) f(x) dx
        = a( P[X <= a] - P[X > a] ) + ...
        = a( 2F(a) - 1 ) + ...

   m'(a) = (2F(a) - 1) + 2a f(a) - 2a f(a) = 2F(a) - 1 = 0
      ==> F(a) = 1/2,  i.e. a is a median of X;

also of course E[ |X-a| | Y ] is minimized by the conditional median.

----------------------------------
Chapter 7:
 1. Introduction
 2. Expectation of Sums of Random Variables
 3. Covariance, Variance of Sums, and Correlations
 4. Conditional Expectation
    .1 Definitions
    .2 Computing Expectations by Conditioning
    .3 Computing Probabilities by Conditioning
 5. Conditional Expectation and Prediction
 6. Moment Generating Functions
 7. Additional Properties of Normal Random Variables
    .1 The Multivariate Normal Distribution
    .2 The Joint Distribution of the Sample Mean and Sample Variance
 8. General Definition of Expectation
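The two prediction facts (the mean minimizes E[|X-a|^2], the median minimizes E[|X-a|]) are easy to verify numerically; a quick sketch with illustrative names, using an Exponential(1) sample where the true mean is 1 and the true median is log 2:

```python
import random
import statistics

def mean_abs_dev(sample, a):
    """Monte Carlo estimate of E[|X - a|]."""
    return sum(abs(x - a) for x in sample) / len(sample)

def mean_sq_dev(sample, a):
    """Monte Carlo estimate of E[|X - a|^2]."""
    return sum((x - a) ** 2 for x in sample) / len(sample)

random.seed(1)
sample = [random.expovariate(1.0) for _ in range(100_000)]
med = statistics.median(sample)   # F(a) = 1/2 at a = med
mu = statistics.fmean(sample)     # a = E[X]

# Median beats mean for absolute error; mean beats median for squared
# error -- matching the two minimization results above.
assert mean_abs_dev(sample, med) <= mean_abs_dev(sample, mu)
assert mean_sq_dev(sample, mu) <= mean_sq_dev(sample, med)
```

In fact the sample median exactly minimizes the sample average of |x - a|, and the sample mean exactly minimizes the sample average of (x - a)^2, so both assertions hold for any sample.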