The covariance between two real-valued random variables X and Y, with expected values E(X) = μ and E(Y) = ν, is defined as

cov(X, Y) = E((X − μ)(Y − ν)),

where E is the expected value.
Intuitively, covariance measures how much two variables vary together: it becomes more positive for each pair of values that deviate from their means in the same direction, and more negative for each pair that deviate in opposite directions. Thus the more often the variables deviate in the same direction, the more positive the covariance; the more often they deviate in opposite directions, the more negative it is.
The definition above is equivalent to the following formula, which is commonly used in calculations:

cov(X, Y) = E(XY) − E(X) E(Y).
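As a minimal sketch (with the expectation E replaced by a sample mean over illustrative made-up data), the two forms of the covariance can be computed and compared directly:

```python
# Sketch: computing a sample covariance two ways -- directly from the
# definition E[(X - E X)(Y - E Y)], and via the calculation shortcut
# E[XY] - E[X]E[Y]. The data below is arbitrary example data.

def mean(values):
    return sum(values) / len(values)

def cov_definition(xs, ys):
    """cov(X, Y) = E[(X - E[X]) * (Y - E[Y])], with E as the sample mean."""
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])

def cov_shortcut(xs, ys):
    """cov(X, Y) = E[XY] - E[X] * E[Y]."""
    return mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 5.0, 9.0]
print(cov_definition(xs, ys))  # 2.75
print(cov_shortcut(xs, ys))    # 2.75 -- both forms agree
```

The shortcut form avoids a second pass over centered data, though in floating point the centered form is usually numerically safer.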
If X and Y are independent, then their covariance is zero. This follows because, under independence, E(XY) = E(X) E(Y), so that cov(X, Y) = E(XY) − E(X) E(Y) = 0.
The converse, however, is not true: it is possible that X and Y are not independent, yet their covariance is zero. Random variables whose covariance is zero are called uncorrelated.
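A standard counterexample for the failed converse: take X uniform on {−1, 0, 1} and Y = X². Then Y is completely determined by X, yet their covariance vanishes. A small sketch with exact rational arithmetic:

```python
# Sketch: X uniform on {-1, 0, 1} and Y = X**2 are dependent (Y is a
# function of X), yet cov(X, Y) = 0, so they are uncorrelated.

from fractions import Fraction

outcomes = [Fraction(-1), Fraction(0), Fraction(1)]
p = Fraction(1, 3)  # each outcome has probability 1/3

e_x  = sum(p * x        for x in outcomes)  # E[X]  = 0
e_y  = sum(p * x**2     for x in outcomes)  # E[Y]  = 2/3
e_xy = sum(p * x * x**2 for x in outcomes)  # E[XY] = E[X^3] = 0

cov = e_xy - e_x * e_y
print(cov)  # 0 -- uncorrelated, even though Y = X**2 depends on X
```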
If X and Y are real-valued random variables and c is a constant ("constant", in this context, means non-random), then the following facts are a consequence of the definition of covariance:

cov(X, X) = var(X)
cov(X, Y) = cov(Y, X)
cov(cX, Y) = c cov(X, Y)
cov(X + c, Y) = cov(X, Y)
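These constant-related facts can be checked numerically on sample data; the sketch below uses an ordinary sample covariance as the stand-in for cov, on arbitrary example values:

```python
# Sketch: numerically checking the constant-related covariance facts,
# with cov() as the sample analogue of the covariance and c a constant.

def mean(values):
    return sum(values) / len(values)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])

xs = [1.0, 2.0, 4.0, 7.0]
ys = [3.0, 1.0, 6.0, 2.0]
c = 5.0

print(cov(xs, xs))                   # equals the variance of xs
print(cov(xs, ys) == cov(ys, xs))    # symmetry: True
print(cov([c * x for x in xs], ys))  # equals c * cov(xs, ys)
print(cov([x + c for x in xs], ys))  # equals cov(xs, ys): shifts drop out
```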
For vector-valued random variables, cov(X, Y) and cov(Y, X) are each other's transposes.
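For vector-valued X and Y, cov(X, Y) = E((X − E X)(Y − E Y)ᵀ) is a matrix, and the transpose relation above can be seen on sample data. A sketch, assuming NumPy and using simulated standard-normal samples purely for illustration:

```python
# Sketch: for an m-vector X and n-vector Y, the cross-covariance
# cov(X, Y) = E[(X - E X)(Y - E Y)^T] is an m x n matrix, and
# cov(Y, X) is its transpose. Checked here on simulated samples.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 2))  # 1000 samples of a 2-vector
Y = rng.standard_normal((1000, 3))  # 1000 samples of a 3-vector

Xc = X - X.mean(axis=0)             # center each component
Yc = Y - Y.mean(axis=0)

cov_xy = Xc.T @ Yc / len(X)         # 2 x 3 sample cross-covariance
cov_yx = Yc.T @ Xc / len(Y)         # 3 x 2 sample cross-covariance

print(np.allclose(cov_xy, cov_yx.T))  # True
```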
The covariance is sometimes called a measure of "linear dependence" between the two random variables. That phrase does not mean the same thing that it means in a more formal linear algebraic setting (see linear dependence), although that meaning is not unrelated. The correlation is a closely related concept used to measure the degree of linear dependence between two variables.
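The correlation rescales the covariance by the two standard deviations, giving a dimensionless value between −1 and 1. A minimal sketch of the (Pearson) sample correlation, on made-up data with exact linear relations:

```python
# Sketch: the Pearson correlation divides the covariance by the product
# of the standard deviations, so perfectly linear data gives +/-1.

import math

def mean(values):
    return sum(values) / len(values)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])

def corr(xs, ys):
    return cov(xs, ys) / math.sqrt(cov(xs, xs) * cov(ys, ys))

xs = [1.0, 2.0, 3.0, 4.0]
print(corr(xs, [2 * x + 1 for x in xs]))  # exact increasing line -> 1.0
print(corr(xs, [-3 * x for x in xs]))     # exact decreasing line -> -1.0
```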
This page uses Creative Commons Licensed content from Wikipedia (view authors).