
In probability theory, the characteristic function of any random variable completely defines its probability distribution. On the real line it is given by the following formula, where X is any random variable with the distribution in question:

$\varphi_X(t) = \operatorname{E}\left(e^{itX}\right)\,$

where t is a real number, i is the imaginary unit, and E denotes the expected value.

If $F_X$ is the cumulative distribution function, then the characteristic function is given by the Riemann–Stieltjes integral

$\operatorname{E}\left(e^{itX}\right) = \int_{-\infty}^{\infty} e^{itx}\,dF_X(x).\,$

In cases in which there is a probability density function, $f_X$, this becomes

$\operatorname{E}\left(e^{itX}\right) = \int_{-\infty}^{\infty} e^{itx} f_X(x)\,dx.$
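As a sanity check, this defining integral can be approximated numerically. The sketch below (plain Python, no third-party libraries) discretizes $\int e^{itx} f_X(x)\,dx$ for the standard normal density, whose characteristic function is known to be $e^{-t^2/2}$; the grid bounds and step size are illustrative choices.

```python
import cmath
import math

def empirical_cf(t, xs, pdf, dx):
    """Approximate E[e^{itX}] = integral of e^{itx} f(x) dx by a Riemann sum."""
    return sum(cmath.exp(1j * t * x) * pdf(x) * dx for x in xs)

def normal_pdf(x):
    # Standard normal density; its characteristic function is exp(-t^2/2).
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

N = 1600
dx = 16.0 / N
xs = [-8.0 + k * dx for k in range(N + 1)]   # grid on [-8, 8]

for t in (0.0, 0.5, 1.0, 2.0):
    approx = empirical_cf(t, xs, normal_pdf, dx)
    assert abs(approx - math.exp(-t * t / 2)) < 1e-6
```

Because the integrand decays rapidly, truncating at $\pm 8$ and using an equally spaced sum already matches the closed form to high accuracy.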

If $X$ is a vector-valued random variable, one takes the argument $t$ to be a vector and $tX$ to be the dot product $t \cdot X$.

Every probability distribution on $\mathbb{R}$ or on $\mathbb{R}^n$ has a characteristic function, because one is integrating a bounded function over a space whose measure is finite.

## The inversion theorem

Moreover, there is a bijection between cumulative distribution functions and characteristic functions: two distinct probability distributions never share the same characteristic function.

Given a characteristic function $\varphi$, it is possible to reconstruct the corresponding cumulative distribution function $F_X$:

$F_X(y) - F_X(x) = \lim_{\tau \to +\infty} \frac{1} {2\pi} \int_{-\tau}^{+\tau} \frac{e^{-itx} - e^{-ity}} {it}\, \varphi_X(t)\, dt.$

In general this is an improper integral; the function being integrated may be only conditionally integrable rather than Lebesgue integrable, i.e. the integral of its absolute value may be infinite.
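To illustrate, the inversion integral can be evaluated numerically for the standard normal distribution and checked against its known CDF. The truncation point and step count below are arbitrary but more than adequate here, and the midpoint rule conveniently avoids the removable singularity at $t = 0$.

```python
import cmath
import math

def normal_cf(t):
    # Characteristic function of the standard normal: exp(-t^2/2).
    return cmath.exp(-t * t / 2)

def cdf_difference(x, y, cf, tau=40.0, n=20000):
    """Approximate F(y) - F(x) by the inversion integral over [-tau, tau],
    using a midpoint rule (midpoints never land on t = 0)."""
    dt = 2 * tau / n
    total = 0.0
    for k in range(n):
        t = -tau + (k + 0.5) * dt
        total += ((cmath.exp(-1j * t * x) - cmath.exp(-1j * t * y))
                  / (1j * t) * cf(t)).real * dt
    return total / (2 * math.pi)

def normal_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

approx = cdf_difference(-1.0, 1.0, normal_cf)
exact = normal_cdf(1.0) - normal_cdf(-1.0)   # probability of [-1, 1]
assert abs(approx - exact) < 1e-4
```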

## The continuity theorem

If the sequence of characteristic functions of distributions $F_n$ converges pointwise to the characteristic function of a distribution $F$, then $F_n(x)$ converges to $F(x)$ at every value of $x$ at which $F$ is continuous.
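A quick numerical illustration: for $X_i = \pm 1$ with probability $1/2$ each, the characteristic function of the standardized sum $(X_1 + \cdots + X_n)/\sqrt{n}$ is $\cos(t/\sqrt{n})^n$, which converges to the standard normal characteristic function $e^{-t^2/2}$ as $n$ grows (the value of $t$ below is arbitrary).

```python
import math

def standardized_sum_cf(t, n):
    """CF of (X_1 + ... + X_n) / sqrt(n), where the X_i are i.i.d. with
    P(X_i = 1) = P(X_i = -1) = 1/2; each X_i has CF cos(t)."""
    return math.cos(t / math.sqrt(n)) ** n

t = 1.3
limit = math.exp(-t * t / 2)   # CF of the standard normal
errors = [abs(standardized_sum_cf(t, n) - limit) for n in (10, 100, 1000, 10000)]
assert all(a > b for a, b in zip(errors, errors[1:]))   # error shrinks with n
assert errors[-1] < 1e-4
```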

## Uses of characteristic functions

Characteristic functions are particularly useful for dealing with functions of independent random variables. For example, if $X_1, X_2, \ldots, X_n$ is a sequence of independent (and not necessarily identically distributed) random variables, and

$S_n = \sum_{i=1}^n a_i X_i,\,\!$

where the $a_i$ are constants, then the characteristic function for $S_n$ is given by

$\varphi_{S_n}(t)=\varphi_{X_1}(a_1t)\varphi_{X_2}(a_2t)\cdots \varphi_{X_n}(a_nt). \,\!$

In particular, $\varphi_{X+Y}(t) = \varphi_X(t)\varphi_Y(t)$ for independent $X$ and $Y$. To see this, write out the definition of the characteristic function:

$\varphi_{X+Y}(t)=E\left(e^{it(X+Y)}\right)=E\left(e^{itX}e^{itY}\right)=E\left(e^{itX}\right)E\left(e^{itY}\right)=\varphi_X(t) \varphi_Y(t)$.

Observe that the independence of $X$ and $Y$ is required to establish the equality of the third and fourth expressions.
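The product rule can be checked in closed form for normal random variables, using the standard fact that $X \sim N(\mu, \sigma^2)$ has $\varphi_X(t) = e^{i\mu t - \sigma^2 t^2/2}$; the parameter values below are illustrative.

```python
import cmath

def normal_cf(t, mu, sigma2):
    """CF of a normal distribution N(mu, sigma2): exp(i*mu*t - sigma2*t^2/2)."""
    return cmath.exp(1j * mu * t - sigma2 * t * t / 2)

# S = a1*X1 + a2*X2 + a3*X3 with independent Xi ~ N(mu_i, sigma_i^2).
params = [(0.0, 1.0), (1.5, 0.5), (-2.0, 2.0)]   # (mu_i, sigma_i^2), illustrative
a = [1.0, 2.0, -0.5]

t = 0.7
product = 1.0 + 0j
for (mu, s2), ai in zip(params, a):
    product *= normal_cf(ai * t, mu, s2)   # phi_S(t) = prod phi_Xi(ai * t)

# S is itself normal, with mean sum(ai*mu_i) and variance sum(ai^2 * sigma_i^2).
mu_S = sum(ai * mu for (mu, _), ai in zip(params, a))
s2_S = sum(ai * ai * s2 for (_, s2), ai in zip(params, a))
assert abs(product - normal_cf(t, mu_S, s2_S)) < 1e-12
```

The product of the individual characteristic functions agrees exactly with the characteristic function of the normal distribution of $S_n$, as the formula above predicts.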

Because of the continuity theorem, characteristic functions are used in the most frequently seen proof of the central limit theorem.

Characteristic functions can also be used to find the moments of a random variable. Provided that the $n$th moment exists, the characteristic function can be differentiated $n$ times, and

$\operatorname{E}\left(X^n\right) = i^{-n}\, \varphi_X^{(n)}(0) = i^{-n}\, \left[\frac{d^n}{dt^n} \varphi_X(t)\right]_{t=0}. \,\!$
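This moment formula can be verified numerically, for example for an exponential distribution with rate $\lambda$, whose characteristic function is $\lambda/(\lambda - it)$ and whose first two moments are $1/\lambda$ and $2/\lambda^2$. The sketch below approximates the derivatives at $0$ by central differences; the rate and step size are ad hoc choices.

```python
import cmath

lam = 2.0   # rate of an Exponential(lam) distribution (illustrative value)

def exp_cf(t):
    # CF of Exponential(lam): lam / (lam - it).
    return lam / (lam - 1j * t)

h = 1e-5
d1 = (exp_cf(h) - exp_cf(-h)) / (2 * h)                  # phi'(0)
d2 = (exp_cf(h) - 2 * exp_cf(0) + exp_cf(-h)) / h ** 2   # phi''(0)

m1 = (d1 / 1j).real        # E[X]   = i^{-1} phi'(0)  = 1/lam
m2 = (d2 / 1j ** 2).real   # E[X^2] = i^{-2} phi''(0) = 2/lam^2
assert abs(m1 - 1 / lam) < 1e-6
assert abs(m2 - 2 / lam ** 2) < 1e-3
```

The second difference quotient loses some precision to rounding, hence the looser tolerance on the second moment.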

Characteristic functions arise in the statement and proof of Bochner's theorem.

## Related concepts

Related concepts include the moment-generating function and the probability-generating function. The characteristic function exists for every probability distribution, whereas the moment-generating function need not (the Cauchy distribution, for example, has no moment-generating function).

The characteristic function is closely related to the Fourier transform: the characteristic function of a probability density function $p(x)$ is the complex conjugate of the continuous Fourier transform of $p(x)$ (according to the usual sign convention).

$\varphi_X(t) = \langle e^{itX} \rangle = \int_{-\infty}^{\infty} e^{itx}p(x)\, dx = \overline{\left( \int_{-\infty}^{\infty} e^{-itx}p(x)\, dx \right)} = \overline{P(t)},$

where $P(t)$ denotes the continuous Fourier transform of the probability density function $p(x)$. Likewise, $p(x)$ may be recovered from $\varphi_X(t)$ through the inverse Fourier transform:

$p(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{itx} P(t)\, dt = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{itx} \overline{\varphi_X(t)}\, dt.$

Indeed, even when the random variable does not have a density, the characteristic function may be seen as the Fourier transform of the measure corresponding to the random variable.
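As a numerical check of this inverse-transform relation, the sketch below recovers the standard normal density from its characteristic function by discretizing the integral above (truncation point and step size are illustrative; for the standard normal, $\overline{\varphi_X(t)} = \varphi_X(t)$ since the characteristic function is real).

```python
import cmath
import math

def normal_cf(t):
    return cmath.exp(-t * t / 2)   # CF of the standard normal

def density_from_cf(x, cf, tau=40.0, n=8000):
    """p(x) = (1/2pi) * integral of e^{itx} * conj(cf(t)) dt,
    approximated by a midpoint rule on [-tau, tau]."""
    dt = 2 * tau / n
    total = 0.0
    for k in range(n):
        t = -tau + (k + 0.5) * dt
        total += (cmath.exp(1j * t * x) * cf(t).conjugate()).real * dt
    return total / (2 * math.pi)

for x in (0.0, 1.0, -2.0):
    exact = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    assert abs(density_from_cf(x, normal_cf) - exact) < 1e-6
```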
