The task today is to graphically illustrate how the method of
maximum likelihood works.
At the beginning of the lab, the TA will review the following content. Suppose we have n observations from an Exponential distribution with unknown parameter beta:
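(A quick sketch of where the log-likelihood formula used in the script comes from, with beta parameterizing the mean of the Exponential distribution:)

```latex
L(\beta) = \prod_{i=1}^{n} \frac{1}{\beta}\, e^{-y_i/\beta}
\quad\Longrightarrow\quad
\log L(\beta) = -n \log \beta - \frac{1}{\beta}\sum_{i=1}^{n} y_i .
\]
Setting the derivative to zero gives the maximizer:
\[
\frac{d}{d\beta}\log L(\beta) = -\frac{n}{\beta} + \frac{1}{\beta^{2}}\sum_{i=1}^{n} y_i = 0
\quad\Longrightarrow\quad
\hat{\beta} = \bar{y}.
```

This is why the code below can simply set the MLE equal to `mean(y)`.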
% y records 10 observations from Exp(beta) where beta is unknown.
y = [1.02, .49, .17, .64, 3.65, 1.89, .14, .18, 1.87, .24];
n=length(y);
subplot(1,2,1);
beta = 0.1:0.25:6;
logL = -n.*log(beta) - sum(y)./beta; % calculate the Log Likelihood
plot(beta, logL); % plot the log-likelihood vs. beta for beta
                  % ranging from 0.1 to 6.
subplot(1,2,2);
beta = 0.5:0.01:2; % From the previous plot, we can see that the
                   % maximum occurs for beta between 0.5 and 2,
                   % so we plot the log-likelihood on this range
                   % with a finer grid.
logL = -n.*log(beta) - sum(y)./beta;
plot(beta, logL);
xlabel('beta');
ylabel('Log Likelihood');
mle=mean(y);
logL_mle = -n*log(mle) - sum(y)/mle; hold on;
plot(mle, logL_mle, 'or'); hold off;
% mark the MLE with a red circle. See that it does maximize the log-likelihood.
legend('Plot of Log likelihood Function', 'MLE');
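If you want to check the same calculation outside MATLAB, here is an illustrative Python translation (a sketch for comparison, not part of the lab script): it evaluates the same log-likelihood on the same fine grid and confirms that the grid maximizer agrees with the closed-form MLE, the sample mean.

```python
import numpy as np

# The same 10 observations from Exp(beta) as in the MATLAB script.
y = np.array([1.02, .49, .17, .64, 3.65, 1.89, .14, .18, 1.87, .24])
n = len(y)

def logL(beta):
    # Log-likelihood of an Exponential sample, with beta as the mean:
    # logL(beta) = -n*log(beta) - sum(y)/beta
    return -n * np.log(beta) - y.sum() / beta

# Same fine grid as the MATLAB code: 0.5 to 2 in steps of 0.01.
beta_grid = np.arange(0.5, 2.0, 0.01)
grid_argmax = beta_grid[np.argmax(logL(beta_grid))]

mle = y.mean()  # closed-form MLE: the sample mean
print(mle)         # 1.029
print(grid_argmax) # nearest grid point, approximately 1.03
```

Because the log-likelihood is concave in beta, the grid search and the closed-form answer agree up to the grid spacing.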
In HW#5, you will be asked to derive the MLE for the Bernoulli distribution with unknown parameter p (Exercise 8.8 on page 351). You will find that the MLE of p is equal to (y_1 + y_2 + ... + y_n)/n. Now suppose I toss a coin (heads with unknown probability p) 10 times and observe the following: Tail, Head, Tail, Tail, Head, Head, Head, Tail, Head, Head.
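The same graphical idea carries over to the Bernoulli case. As an illustration (a hedged sketch, not the HW solution), the Python code below encodes the 10 tosses above as 0/1, evaluates the Bernoulli log-likelihood k*log(p) + (n-k)*log(1-p) on a grid, and checks that its maximizer matches the formula (y_1 + ... + y_n)/n:

```python
import numpy as np

# Encode the tosses above: Head = 1, Tail = 0.
# T H T T H H H T H H
tosses = np.array([0, 1, 0, 0, 1, 1, 1, 0, 1, 1])
n = len(tosses)
k = tosses.sum()  # number of heads: 6

def logL(p):
    # Bernoulli log-likelihood for k heads in n tosses:
    # logL(p) = k*log(p) + (n-k)*log(1-p)
    return k * np.log(p) + (n - k) * np.log(1 - p)

# Grid search over the open interval (0, 1).
p_grid = np.arange(0.01, 1.0, 0.01)
p_hat_grid = p_grid[np.argmax(logL(p_grid))]

p_hat = k / n  # the MLE from the formula (y_1 + ... + y_n)/n
print(p_hat)      # 0.6
print(p_hat_grid) # approximately 0.6
```

Plotting `logL(p_grid)` against `p_grid`, exactly as we plotted the Exponential log-likelihood against beta, would show the curve peaking at p = 0.6.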