Discrete probability distribution

In mathematics, a probability distribution is called discrete if it is fully characterized by a probability mass function. Thus, the distribution of a random variable X is discrete, and X is called a discrete random variable, if


 * $$\sum_u \Pr(X=u) = 1\qquad\qquad\qquad(1)$$

as u runs through the set of all possible values of X.
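Condition (1) can be checked numerically when the set of possible values is finite. A minimal sketch in Python (the binomial distribution and the parameters n, p are chosen here purely for illustration):

```python
from math import comb

# Hypothetical example: verify condition (1) for a Binomial(n, p)
# random variable, whose set of possible values {0, 1, ..., n} is finite.
def binomial_pmf(k, n, p):
    """Pr(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
total = sum(binomial_pmf(k, n, p) for k in range(n + 1))
print(total)  # equals 1 up to floating-point round-off
```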

If a random variable is discrete, then the set of all values that it can assume with nonzero probability is finite or countably infinite, because the sum of uncountably many positive real numbers (which is the smallest upper bound of the set of all finite partial sums) always diverges to infinity.

In the cases most often considered, this set of possible values is a topologically discrete set in the sense that all its points are isolated points. But there are discrete random variables for which this countable set is dense on the real line.

The Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, and the negative binomial distribution are among the most well-known discrete probability distributions.
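For a distribution with countably infinite support, the sum in (1) is an infinite series whose partial sums converge to 1. A sketch using the geometric distribution (parameter p assumed for illustration), where the missing tail mass after n terms is exactly (1 − p)^n:

```python
# Sketch (parameter p assumed): partial sums of the Geometric(p) pmf,
# Pr(X = k) = (1 - p)**(k - 1) * p for k = 1, 2, ...,
# illustrating condition (1) on a countably infinite support.
p = 0.5
partial_sums = [sum((1 - p)**(k - 1) * p for k in range(1, n + 1))
                for n in (5, 10, 20)]
print(partial_sums)  # approaches 1; the tail mass after n terms is (1 - p)**n
```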

Alternative description
Equivalently to the above, a discrete random variable can be defined as a random variable whose cumulative distribution function (cdf) increases only by jump discontinuities, that is, its cdf increases only where it "jumps" to a higher value, and is constant between those jumps. The points where jumps occur are precisely the values which the random variable may take. The number of such jumps may be finite or countably infinite. The set of locations of such jumps need not be topologically discrete; for example, the cdf might jump at each rational number.
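The last point can be made concrete by construction. The following sketch (the enumeration and the masses 2^−n are assumptions chosen for illustration, not taken from the text) builds a discrete random variable whose cdf jumps at rationals in (0, 1):

```python
from fractions import Fraction
from math import gcd

# Illustrative construction: enumerate reduced fractions p/q in (0, 1)
# and give the n-th one probability mass 2**-(n + 1).
def rationals_in_unit_interval(limit):
    """First `limit` reduced fractions p/q with 0 < p < q, by increasing q."""
    out, q = [], 2
    while len(out) < limit:
        for p in range(1, q):
            if gcd(p, q) == 1:
                out.append(Fraction(p, q))
                if len(out) == limit:
                    break
        q += 1
    return out

support = rationals_in_unit_interval(100)
mass = {u: 2.0 ** -(n + 1) for n, u in enumerate(support)}

def cdf(x):
    """F(x) = Pr(X <= x), constant between support points, jumping at each."""
    return sum(m for u, m in mass.items() if u <= x)

print(cdf(0.5) - cdf(0.4999999))  # jump at 1/2: approx. the mass 0.5 placed there
```

The support here is dense in (0, 1) once the enumeration is continued indefinitely, yet the cdf still increases only by jumps.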

Representation in terms of indicator functions
For a discrete random variable X, let $$u_0, u_1, \dots$$ be the values it can assume with non-zero probability. Denote


 * $$\Omega_i=\{\omega: X(\omega)=u_i\},\, i=0, 1, 2, \dots$$

These are disjoint sets, and by formula (1)


 * $$\Pr\left(\bigcup_i \Omega_i\right)=\sum_i \Pr(\Omega_i)=\sum_i\Pr(X=u_i)=1.$$

It follows that the probability that X assumes any value except for $$u_0, u_1, \dots$$ is zero, and thus one can write X as


 * $$X=\sum_i u_i 1_{\Omega_i}$$

except on a set of probability zero, where $$1_A$$ is the indicator function of A. This may serve as an alternative definition of discrete random variables.
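The representation can be checked on a toy example. In this sketch (the sample space [0, 1), the values, and the probabilities are all assumptions for illustration), X is defined directly on a partition of [0, 1) and then recomputed as a sum of scaled indicator functions:

```python
# Toy discrete random variable on the sample space Omega = [0, 1):
# X takes value values[i] on Omega_i, where the Omega_i partition [0, 1)
# into intervals of lengths equal to the probabilities.
values = [0, 1, 4]        # the u_i
probs = [0.2, 0.5, 0.3]   # Pr(X = u_i); lengths of the Omega_i
bounds = [0.0, 0.2, 0.7, 1.0]

def X(omega):
    """Direct definition of X."""
    for i in range(len(values)):
        if bounds[i] <= omega < bounds[i + 1]:
            return values[i]

def indicator(i, omega):
    """1_{Omega_i}(omega) for the partition above."""
    return 1 if bounds[i] <= omega < bounds[i + 1] else 0

def X_via_indicators(omega):
    """X written as sum_i u_i * 1_{Omega_i}(omega)."""
    return sum(u * indicator(i, omega) for i, u in enumerate(values))

for omega in (0.1, 0.5, 0.9):
    assert X(omega) == X_via_indicators(omega)
```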
