Expectation: Sums, Correlations
==========

Big picture:

ALWAYS the MEAN of a sum is equal to the sum of the means.
ONLY for INDEPENDENT random variables is the VARIANCE of a sum equal to the
  sum of the variances.
NEVER is the sum of standard deviations, probabilities, etc. equal to a
  useful sum.

MORAL: A good strategy for computing expectations, variances, and
probabilities is to turn them into sums -- indicator random variables are
often helpful for this.

EXAMPLES:

Ex 4: Fair die is tossed 10 times;  E[Sum] = ????

------------------------------

Ex 7: S = "total # of successes in 3 equi-probable trials",  E[S] = 1.8.
      Max, min of P[S=3]????

S = X1 + X2 + X3   (Xi = 1 if the ith trial is a success, else 0)
Know: Xi ~ Bernoulli with P[Xi=1] = 0.6, but not necessarily independent.

Couple all three indicators to one uniform U on [0,1]; Xi = 1 iff U falls
in bar i:

       0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0
        |   |   |   |   |   |   |   |   |   |   |
A1:     ************************
A2:                     ************************
A3:     ************                ************

  ==> P[S=3] = 0!!!!!   (no point lies in all three bars)

B1:     ************************
B2:     ************************
B3:     ************************

  ==> P[S=3] = 0.6   (the three indicators always agree)

------------------------------

Ex 28: Compute the expected # of cards needed to find:
       a. 2 aces;  b. 5 spades;  c. all 13 hearts.

a. Let Xi = 1 if card i is drawn no later than the 2nd ace, else Xi = 0;
   then look at S = X1 + X2 + ... + X52:

   E[Xi] = P[0 or 1 aces precede card #i]
         = P[of the 4 aces and card #i, card #i appears 1st or 2nd]
         = 2/5 for non-aces,   = 2/4 for aces;
   ==> E[S] = (48 * 2/5) + (4 * 2/4) = 19.2 + 2 = 21.2

b. Let Yi = 1 if card i is drawn no later than the 5th spade, else Yi = 0;
   then look at S = Y1 + Y2 + ... + Y52:

   E[Yi] = P[0..4 spades precede card #i]
         = P[of the 13 spades and card #i, card #i appears 1st, ..., or 5th]
         = 5/14 for non-spades,   = 5/13 for spades.
   ==> E[S] = (39 * 5/14) + (13 * 5/13) = 13.929 + 5 = 18 13/14 = 18.9286

------------------------------

Binomial:        Y = X1 + X2 + ... + Xn   (Xi iid Bernoulli(p))
   Mean:      n*p
   Variance:  n*p*(1-p)

Hypergeometric:  Y = X1 + X2 + ... + Xn   (n draws, no replacement, from
                                           A successes and B failures)
   Mean:      n*(A/(A+B))
   Variance:  ?????
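The two extremes in Ex 7 can be realized by coupling the three indicators to
one uniform draw U. A sketch in Python -- the interval endpoints for coupling
A are one illustrative choice that gives all three indicators the right
Bernoulli(0.6) marginals while no point of [0,1) activates all three:

```python
def coupling_A(u):
    # Three dependent Bernoulli(0.6) indicators driven by one uniform u.
    # Each "bar" has total length 0.6, but no u lies in all three bars
    # (illustrative endpoints; other choices also achieve P[S=3] = 0).
    x1 = u < 0.6
    x2 = u >= 0.4
    x3 = u < 0.3 or u >= 0.7
    return int(x1) + int(x2) + int(x3)

def coupling_B(u):
    # Three IDENTICAL Bernoulli(0.6) indicators: all agree on every u.
    return 3 * int(u < 0.6)

def prob_S3(coupling, steps=10_000):
    # P[S = 3] for a coupling to Uniform[0,1], evaluated at grid midpoints
    # (exact here, since the bars' endpoints never hit a midpoint).
    hits = sum(coupling((i + 0.5) / steps) == 3 for i in range(steps))
    return hits / steps

print(prob_S3(coupling_A))  # 0.0 -- the minimum of P[S=3]
print(prob_S3(coupling_B))  # 0.6 -- the maximum of P[S=3]
```

Both couplings keep E[S] = 3 * 0.6 = 1.8 (means always add), yet P[S=3]
swings from 0 to 0.6 -- the point of the example.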
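The indicator argument in Ex 28 generalizes: waiting for the r-th of K
special cards in an N-card deck, each of the N-K non-special cards falls no
later than the r-th special with probability r/(K+1), and each special card
does so with probability r/K. A quick exact check with fractions (the helper
name `expected_draws` is mine; the same formula also covers part c):

```python
from fractions import Fraction

def expected_draws(r, K, N=52):
    """Expected number of cards drawn until the r-th of K special cards
    appears, via one indicator per card in the deck."""
    # A non-special card precedes the r-th special iff, among the K
    # specials plus that card, it falls in one of the first r slots.
    non_special = (N - K) * Fraction(r, K + 1)
    # A special card counts iff it is among the first r specials.
    special = K * Fraction(r, K)          # = r
    return non_special + special

print(expected_draws(2, 4))    # a. 2 aces:       106/5  = 21.2
print(expected_draws(5, 13))   # b. 5 spades:     265/14 = 18 13/14
print(expected_draws(13, 13))  # c. all 13 hearts: 689/14 ~ 49.21
```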
Mixture: Y = X1 with probability p, X2 with probability 1-p
           = Z X1 + (1-Z) X2, where Z ~ Bernoulli(p), independent of X1, X2
   Mean:      p mu1 + (1-p) mu2
   Variance:  ?????

----------------------------------

Matching Problems

3 fun problems:

1. Secretary Problem (approximation, big N):
   View N numbers X1 X2 ... XN sequentially; try to pick the best one.
   One strategy: let M = max of the first k = Np numbers viewed; then pick
   the first Xn with n > Np and Xn > M.  What's the probability of success?

   Success requires the best number to come after position k; decompose by
   the rank j of the best number among the first k:

   j=2: 2nd best before k, best after k              prob = (1-p)^1 * p / 1
   j=3: 3rd best before k, best & 2nd after k,
        best ahead of 2nd                            prob = (1-p)^2 * p / 2
   j=4: 4th best before k, top 3 after k,
        best ahead of the other two                  prob = (1-p)^3 * p / 3
   j=5: 5th best before k, ...                       prob = (1-p)^4 * p / 4
   ...

   (For rank j: the j-th best lands in the first k (prob ~ p), ranks
   1..j-1 all land after k (prob ~ (1-p)^(j-1)), and the best comes first
   among those j-1 (prob 1/(j-1)).)

   In sum: P[Succ] = p * sum_{j=1}^oo (1-p)^j / j        (set q = 1-p)
                   = p f(q), where f'(q) = sum_{j=0}^oo q^j = 1/(1-q) = 1/p
                     and f(0) = 0, so f(q) = -log(1-q) = -log(p);
           P[Succ] = -p log(p).

   d/dp [-p log(p)] = -1 - log(p) = 0  -->  log p = -1
   ==> p = 1/e,  P[Succ] = -p log(p) = 1/e = 0.36788...  FOR ALL (big) N

2. 2-Envelope problem:  Ex 68

3. Monty Hall problem:  Goat  Goat  $$$$

----------------------------------

Conditional Expectation & Probability:
   E[ X | Y ]
   E[ Sum of a Random Number of Random Variables ]

Minimize E[ |X-a| ]:

   m(a) = int_{-oo}^{a} (a-x) f(x) dx + int_{a}^{+oo} (x-a) f(x) dx
        = a(P[X<=a] - P[X>a]) - int_{-oo}^{a} x f(x) dx + int_{a}^{+oo} x f(x) dx
        = a(2F(a) - 1) - ...

   m'(a) = (2F(a) - 1) + 2a f(a) - 2a f(a) = 2F(a) - 1 = 0
   ==> F(a) = 1/2, i.e. the minimizing a is a MEDIAN of X.

----------------------------------

E[Geo]:

   If X = # TAILS before the 1st head:
      E[X] = p*0 + q*(1 + E[X])  =>  (1-q) E[X] = q  =>  E[X] = q/p

   If X = # TOSSES up to the 1st head:
      E[X] = p*1 + q*(1 + E[X])  =>  (1-q) E[X] = 1  =>  E[X] = 1/p

----------------------------------

Chapter 7:
 1. Introduction
 2. Expectation of Sums of Random Variables
 3. Covariance, Variance of Sums, and Correlations
 4. Conditional Expectation
    .1 Definitions
    .2 Computing Expectations by Conditioning
    .3 Computing Probabilities by Conditioning
 5. Conditional Expectation and Prediction
 6. Moment Generating Functions
 7. Additional Properties of Normal Random Variables
    .1 The Multivariate Normal Distribution
    .2 The Joint Distribution of the Sample Mean and Sample Variance
 8. General Definition of Expectation
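The secretary-problem claim above -- the "observe N/e, then leap" rule
succeeds with probability about 1/e for any large N -- can be spot-checked
by simulation. A Monte Carlo sketch; N, the trial count, and the seed are
arbitrary choices:

```python
import math
import random

def secretary_success(N, k, rng):
    # Ranks 1..N arrive in random order; rank 1 is the best.
    order = list(range(1, N + 1))
    rng.shuffle(order)
    # Reject the first k candidates, remembering the best rank seen.
    best_rejected = min(order[:k])
    # Pick the first later candidate better than every rejected one.
    for rank in order[k:]:
        if rank < best_rejected:
            return rank == 1          # success iff we picked the overall best
    return False                      # never picked anyone: failure

def success_rate(N=100, trials=20_000, seed=1):
    rng = random.Random(seed)
    k = round(N / math.e)             # observe ~N/e numbers, then leap
    wins = sum(secretary_success(N, k, rng) for _ in range(trials))
    return wins / trials

print(success_rate())   # close to 1/e = 0.3679 (varies slightly with seed)
```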