Date: January 24, 2021

Topic: Maximum likelihood

Recall

How is the likelihood defined?

How do you find the maximum likelihood estimators for a distribution?

How is the variance of the estimator computed?

Definition

The method of maximum likelihood is a technique for generating estimators for the parameters of a statistical distribution. The method supposes that you have a sample $x_1, x_2,\dots,x_n$ of a random variable $X$ with a probability mass function or probability density function with a known form $f(x|\theta_1, \theta_2, \dots, \theta_k)$ where $\theta_1, \theta_2, \dots, \theta_k$ are unknown parameters. Given this information, you can define the likelihood function:

$$ L(\theta_1, \theta_2, \dots, \theta_k) = \prod_{i=1}^n f(x_i|\theta_1, \theta_2, \dots, \theta_k) $$
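As a concrete illustration, the likelihood can be computed directly as this product of density values. The sketch below (an assumption for illustration, using an exponential distribution with rate $\lambda$ and a made-up sample) shows how the likelihood ranks candidate parameter values by how well they explain the observed data:

```python
import math

def exp_pdf(x, lam):
    """Density of the exponential distribution with rate lam: lam * exp(-lam * x)."""
    return lam * math.exp(-lam * x)

def likelihood(sample, lam):
    """L(lam) = product over the sample of f(x_i | lam)."""
    L = 1.0
    for x in sample:
        L *= exp_pdf(x, lam)
    return L

sample = [0.5, 1.2, 0.8, 2.1, 0.3]  # hypothetical data
# A rate near 1/mean explains this sample far better than a rate of 5,
# so its likelihood is much larger.
print(likelihood(sample, 1.0))
print(likelihood(sample, 5.0))
```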

Maximum likelihood estimators for the parameters are found by maximising the expression above. In practice it is usually easier to maximise the log-likelihood $\ln L$, since the logarithm is monotonic (so it has the same maximiser) and turns the product into a sum. Either way, the maximum likelihood estimators are the solutions of the $k$ equations:

$$ \frac{\partial L}{\partial \theta_1} = 0 \quad \frac{\partial L}{\partial \theta_2} = 0\quad \dots \quad \frac{\partial L}{\partial \theta_k} = 0 $$
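To see these equations solved in a concrete case, consider again the exponential distribution (an illustrative assumption, with a made-up sample): $\ln L(\lambda) = n\ln\lambda - \lambda\sum x_i$, so setting $\mathrm{d}(\ln L)/\mathrm{d}\lambda = n/\lambda - \sum x_i = 0$ gives $\hat{\lambda} = n/\sum x_i$, the reciprocal of the sample mean. The sketch below checks this analytic solution against a crude grid search:

```python
import math

def log_likelihood(sample, lam):
    """ln L(lam) for an exponential sample: n*ln(lam) - lam*sum(x_i)."""
    return len(sample) * math.log(lam) - lam * sum(sample)

sample = [0.5, 1.2, 0.8, 2.1, 0.3]  # hypothetical data
# Analytic solution of d(ln L)/d(lam) = 0: lam_hat = n / sum(x_i).
lam_hat = len(sample) / sum(sample)

# Numerical check: the log-likelihood over a grid should peak at
# (very nearly) the analytic estimator.
grid = [0.01 * k for k in range(1, 501)]
lam_grid = max(grid, key=lambda lam: log_likelihood(sample, lam))
print(lam_hat, lam_grid)
```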

The maximum likelihood estimators are asymptotically normally distributed, so for large samples the variance of the estimator of a single parameter $\theta$ can be approximated using:

$$ \text{var}(\hat{\theta}) = -\frac{1}{\mathbb{E}\left[\frac{\textrm{d}^2\ln L}{\textrm{d}\theta^2 }\right]} $$
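Continuing the exponential example (an illustrative assumption): $\mathrm{d}^2\ln L/\mathrm{d}\lambda^2 = -n/\lambda^2$, which is non-random, so the formula gives $\text{var}(\hat{\lambda}) \approx \lambda^2/n$. The Monte Carlo sketch below checks this against the empirical variance of $\hat{\lambda} = 1/\bar{x}$ over repeated simulated samples (the sample size, rate, and seed are arbitrary choices):

```python
import random

lam, n, reps = 2.0, 200, 2000  # true rate, sample size, replications
random.seed(0)

# Simulate many exponential samples and record lam_hat = n / sum(x_i)
# for each one.
estimates = []
for _ in range(reps):
    sample = [random.expovariate(lam) for _ in range(n)]
    estimates.append(n / sum(sample))

mean_hat = sum(estimates) / reps
var_hat = sum((e - mean_hat) ** 2 for e in estimates) / reps
# For large n the empirical variance should be close to lam^2 / n.
print(var_hat, lam ** 2 / n)
```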

A simple application of maximum likelihood is given in the following video:

https://www.youtube.com/watch?v=XjMrXBgmds0

A more complicated application is provided in this video:

https://www.youtube.com/watch?v=dN042KYhOuo

<aside> 📌 SUMMARY: The method of maximum likelihood is a technique for generating estimators for the parameters of statistical distributions that works by maximising the likelihood that you would observe the results that have been observed.

</aside>