This article defines some terms which characterize probability distributions of two or more variables.

Conditional probability is the probability of some event A, given the occurrence of some other event B. Conditional probability is written P(A|B), and is read "the probability of A, given B".

Joint probability is the probability of two events in conjunction, that is, the probability that both events occur together. The joint probability of A and B is written $ P(A \cap B) $ or $ P(A, B) $.

Marginal probability is the probability of one event, irrespective of the other event. It is obtained by summing (or, more generally, integrating) the joint probability over the other event; this operation is called marginalization. The marginal probability of A is written P(A), and the marginal probability of B is written P(B).
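A short Python sketch shows how these three quantities relate; the joint table below is made up purely for illustration:

```python
# A minimal sketch (hypothetical joint table) of how conditional, joint,
# and marginal probabilities relate for two binary events A and B.

# Hypothetical joint distribution: P(A = a, B = b) for booleans a, b.
joint = {
    (True,  True):  0.15,   # P(A and B)
    (True,  False): 0.25,   # P(A and not B)
    (False, True):  0.15,
    (False, False): 0.45,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginalization: sum the joint probability over the other event.
p_A = sum(p for (a, _), p in joint.items() if a)   # P(A) = 0.40
p_B = sum(p for (_, b), p in joint.items() if b)   # P(B) = 0.30

# Conditional probability: P(A | B) = P(A and B) / P(B).
p_A_given_B = joint[(True, True)] / p_B            # 0.15 / 0.30 = 0.50

print(p_A, p_B, p_A_given_B)
```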

In these definitions, note that there need not be a causal or temporal relation between A and B. A may precede B, or vice versa, or they may happen at the same time. A may cause B, or vice versa, or they may have no causal relation at all.

Conditioning of probabilities, i.e. updating them to take account of (possibly new) information, may be achieved through Bayes' theorem.
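In its simplest form, Bayes' theorem expresses one conditional probability in terms of the reverse one:

$ P(A \mid B)= \frac{P(B \mid A)\, P(A)}{P(B)}. $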

Definition

Given events (or subsets) A and B of the sample space $ \Omega $ (also termed the universe in some textbooks), if it is known that an element randomly drawn from $ \Omega $ belongs to B, then the probability that it also belongs to A is defined to be the conditional probability of A, given B. When $ \Omega $ is finite and all of its elements are equally likely to be drawn, this definition gives the counting formula

$ P(A \mid B)= \frac{\vert A \cap B \vert}{\vert B \vert} $

Dividing the numerator and denominator by $ \vert \Omega \vert $ turns the counts into probabilities and gives the general formula

$ P(A \mid B)= \frac{P(A \cap B)}{P(B)}. $

Equivalently, we have

$ P(A \cap B)=P(A\mid B) P(B). $
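As a quick check on a uniform sample space, both forms of the definition agree; the two-dice setup and the events here are chosen just for illustration:

```python
from itertools import product

# The sample space Ω: all 36 equally likely outcomes of rolling two fair dice.
omega = list(product(range(1, 7), repeat=2))

A = {(d1, d2) for d1, d2 in omega if d1 + d2 == 8}  # event A: the sum is 8
B = {(d1, d2) for d1, d2 in omega if d1 == 3}       # event B: the first die shows 3

# Counting form of the definition: |A ∩ B| / |B|.
p_counting = len(A & B) / len(B)                    # 1/6

# General form: P(A ∩ B) / P(B).
p_formula = (len(A & B) / len(omega)) / (len(B) / len(omega))

assert abs(p_counting - p_formula) < 1e-12
print(p_counting)  # 0.1666...
```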

Statistical independence

Two random events A and B are statistically independent if and only if

$ P(A \cap B) \ = \ P(A) P(B). $

Thus, if A and B are independent, then their joint probability can be expressed as a simple product of their individual probabilities.

Equivalently, for two independent events A and B with $ P(A) \ne 0 $ and $ P(B) \ne 0 $,

$ P(A|B) \ = \ P(A) $

and

$ P(B|A) \ = \ P(B). $

In other words, if A and B are independent, then the conditional probability of A given B is simply the individual probability of A alone; likewise, the probability of B given A is simply the probability of B alone.
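The independence criterion can be checked by direct counting; the sketch below reuses the two-dice sample space, with events chosen for illustration:

```python
from itertools import product

omega = list(product(range(1, 7), repeat=2))

A = {(d1, d2) for d1, d2 in omega if d1 % 2 == 0}  # event A: the first die is even
B = {(d1, d2) for d1, d2 in omega if d2 == 5}      # event B: the second die shows 5

p_A = len(A) / len(omega)        # 1/2
p_B = len(B) / len(omega)        # 1/6
p_AB = len(A & B) / len(omega)   # 3/36 = 1/12

# Independent: the joint probability factors into the product of marginals,
# and consequently P(A | B) = P(A).
assert abs(p_AB - p_A * p_B) < 1e-12
assert abs(p_AB / p_B - p_A) < 1e-12
```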

Mutual exclusivity

Two events A and B are mutually exclusive if and only if

$ P(A \cap B) = 0. $

Provided that

$ P(A) \ne 0 $

and

$ P(B) \ne 0, $

it then follows that

$ P(A\mid B) = 0 $

and

$ P(B\mid A) = 0. $

In other words, given that B happens, the probability of A happening is zero, since A and B cannot both happen in the same situation; likewise, given that A happens, the probability of B happening is zero. For example, a single die roll cannot show both a 1 and a 2.

Other considerations

  • If $ B $ is an event and $ P(B) > 0 $, then the function $ Q $ defined by $ Q(A) = P(A|B) $ for all events $ A $ is a probability measure.
  • If $ P(B)=0 $, then $ P(A|B) $ is left undefined.
  • Conditional probability can be calculated with a decision tree, as sketched below.
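A minimal sketch of the decision-tree calculation, with hypothetical branch probabilities: branch on B first, multiply the conditional probabilities along each path, then sum the paths that end in the event of interest (the law of total probability).

```python
# Hypothetical branch probabilities for a two-level decision tree.
p_B = 0.3                 # P(B)
p_A_given_B = 0.8         # P(A | B)
p_A_given_not_B = 0.2     # P(A | not B)

# Path probabilities: multiply along each branch, P(A ∩ B) = P(A | B) P(B).
p_A_and_B = p_A_given_B * p_B                 # 0.24
p_A_and_not_B = p_A_given_not_B * (1 - p_B)   # 0.14

# Sum the paths that end in A to get the marginal probability P(A).
p_A = p_A_and_B + p_A_and_not_B               # 0.38
print(p_A)
```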

The conditional probability fallacy

The conditional probability fallacy is the assumption that P(A|B) is approximately equal to P(B|A). The mathematician John Allen Paulos discusses this in his book Innumeracy, where he points out that it is a mistake often made even by doctors, lawyers, and other highly educated non-statisticians. It can be overcome by describing the data in terms of actual counts rather than probabilities.
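The sketch below illustrates the fallacy in that actual-counts style, using entirely hypothetical numbers: a rare disease and a fairly accurate screening test.

```python
# Hypothetical numbers: a disease with 0.1% prevalence and a test with a
# 99% true-positive rate and a 1% false-positive rate. P(positive | disease)
# is high, yet P(disease | positive) is low; confusing the two is the fallacy.
population = 100_000
sick = 100                       # 0.1% of 100,000 have the disease
healthy = population - sick      # 99,900 do not

true_positives = sick * 0.99         # 99 sick people test positive
false_positives = healthy * 0.01     # 999 healthy people test positive

# Among everyone who tests positive, only about 9% actually have the disease.
p_disease_given_positive = true_positives / (true_positives + false_positives)
print(p_disease_given_positive)      # ~0.09, despite P(positive | disease) = 0.99
```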

