
*See method of moments (probability theory) for an account of a technique for proving convergence in distribution.*

In statistics, the **method of moments** is a method of estimating population parameters such as the mean, variance, or median (which need not themselves be moments) by equating sample moments with the corresponding unobservable population moments and then solving those equations for the quantities to be estimated.

## Example

Suppose *X*_{1}, ..., *X*_{n} are independent identically distributed random variables with a gamma distribution with probability density function

- $ \frac{x^{\alpha-1} e^{-x/\beta}}{\beta^\alpha \Gamma(\alpha)} $

for *x* > 0, and 0 elsewhere.

The first moment, i.e., the expected value, of a random variable with this probability distribution is

- $ \operatorname{E}(X_1) = \alpha\beta $

and the second moment, i.e., the expected value of its square, is

- $ \operatorname{E}(X_1^2) = \beta^2\alpha(\alpha+1). $

These are the "population moments".

The first and second "sample moments" *m*_{1} and *m*_{2} are respectively

- $ m_{1} = \frac{X_1 + \cdots + X_n}{n} $

and

- $ m_{2} = \frac{X_1^2 + \cdots + X_n^2}{n}. $

Equating the population moments with the sample moments, we get

- $ \alpha\beta = m_{1} $

and

- $ \beta^2\alpha(\alpha+1) = m_{2}. $

Solving these two equations for α and β, we get

- $ \alpha = \frac{m_{1}^2}{m_{2} - m_{1}^2} $

and

- $ \beta = \frac{m_{2} - m_{1}^2}{m_{1}}. $

We then use these two quantities as estimates, based on the sample, of the two unobservable population parameters α and β.
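As a sketch of the procedure above (plain Python only; the sample is drawn with the standard library's `random.gammavariate`, whose second argument is the scale β, and the true parameter values 3.0 and 2.0 are chosen arbitrarily for illustration):

```python
import random

# Draw a sample from a gamma distribution with known shape alpha and scale beta.
random.seed(42)
alpha_true, beta_true = 3.0, 2.0
sample = [random.gammavariate(alpha_true, beta_true) for _ in range(100_000)]

n = len(sample)
m1 = sum(sample) / n                 # first sample moment
m2 = sum(x * x for x in sample) / n  # second sample moment

# Method-of-moments estimates obtained by solving the two moment equations:
alpha_hat = m1 ** 2 / (m2 - m1 ** 2)
beta_hat = (m2 - m1 ** 2) / m1

print(alpha_hat, beta_hat)  # close to the true values for a large sample
```

Note that the estimates are simple closed-form functions of the first two sample moments, which is exactly what makes them easy to compute by hand.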

## Advantages and disadvantages of this method

In some respects, this method has been superseded by Fisher's method of maximum likelihood, because maximum likelihood estimators have a higher probability of being close to the quantities to be estimated.

However, in some cases, as in the above example of the gamma distribution, the likelihood equations may be intractable without computers, whereas the method-of-moments estimators can be quickly and easily calculated by hand as shown above.

Estimates by the method of moments may be used as the first approximation to the solutions of the likelihood equations, and successive improved approximations may then be found by the Newton-Raphson method. In this way the method of moments and the method of maximum likelihood are symbiotic.
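A minimal sketch of that symbiosis for the gamma example, assuming plain Python: the well-known likelihood equation for the gamma shape parameter reduces to log(α) − ψ(α) = log(x̄) − mean(log x), where ψ is the digamma function. Since the standard library has no digamma, it is approximated here by a central difference of `math.lgamma` (in practice one would use a library routine such as `scipy.special.digamma`); the true parameters 2.5 and 1.5 are arbitrary illustration values.

```python
import math
import random

def digamma(x, h=1e-5):
    # Numerical digamma via a central difference of lgamma (a sketch only).
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

random.seed(7)
alpha_true, beta_true = 2.5, 1.5
sample = [random.gammavariate(alpha_true, beta_true) for _ in range(50_000)]

n = len(sample)
m1 = sum(sample) / n
m2 = sum(x * x for x in sample) / n

# Method-of-moments estimate of the shape serves as the first approximation.
alpha = m1 ** 2 / (m2 - m1 ** 2)

# Refine it by Newton-Raphson on f(alpha) = log(alpha) - digamma(alpha) - s,
# where s = log(x-bar) - mean(log x); f'(alpha) = 1/alpha - trigamma(alpha).
s = math.log(m1) - sum(math.log(x) for x in sample) / n
for _ in range(10):
    f = math.log(alpha) - digamma(alpha) - s
    trigamma = (digamma(alpha + 1e-4) - digamma(alpha - 1e-4)) / 2e-4
    alpha -= f / (1 / alpha - trigamma)

beta = m1 / alpha  # scale follows from the first-moment equation E(X) = alpha*beta
print(alpha, beta)
```

Starting Newton-Raphson from the moment estimate keeps the iteration in a region where it converges quickly, which is the sense in which the two methods complement each other.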

In some cases, infrequent with large samples but less so with small samples, the estimates given by the method of moments fall outside the parameter space, and it then makes no sense to rely on them. That problem never arises in the method of maximum likelihood. Also, estimates by the method of moments are not necessarily sufficient statistics, i.e., they sometimes fail to take into account all relevant information in the sample.

## See also

- Method of moments for finding estimates (Russian Wikipedia: Метод моментов нахождения оценок)

This page uses Creative Commons licensed content from Wikipedia.