Logistic distribution

In probability theory and statistics, the logistic distribution is a continuous probability distribution. Its cumulative distribution function is the logistic function, which appears in logistic regression and feedforward neural networks. It resembles the normal distribution in shape but has heavier tails (higher kurtosis).

Cumulative distribution function
The logistic distribution receives its name from its cumulative distribution function (cdf), which is an instance of the family of logistic functions:


 * $$F(x; \mu,s) = \frac{1}{1+e^{-(x-\mu)/s}} \!$$
 * $$= \frac12 + \frac12 \;\operatorname{tanh}\!\left(\frac{x-\mu}{2\,s}\right).$$

In this equation, x is the random variable, μ is the mean, and s is a scale parameter proportional to the standard deviation: the standard deviation equals $$s\,\pi/\sqrt{3}$$.
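The equivalence of the exponential and tanh forms of the cdf can be checked directly. A minimal Python sketch (function names are illustrative, not from any standard library):

```python
import math

def logistic_cdf(x, mu=0.0, s=1.0):
    """Logistic cdf F(x; mu, s) in its exponential form."""
    return 1.0 / (1.0 + math.exp(-(x - mu) / s))

def logistic_cdf_tanh(x, mu=0.0, s=1.0):
    """The equivalent tanh form: 1/2 + (1/2) tanh((x - mu) / (2s))."""
    return 0.5 + 0.5 * math.tanh((x - mu) / (2.0 * s))

# The two forms agree to floating-point precision,
# and F(mu) = 1/2 since mu is the median.
print(logistic_cdf(1.3, mu=0.5, s=2.0))
print(logistic_cdf_tanh(1.3, mu=0.5, s=2.0))
```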

Probability density function
The probability density function (pdf) of the logistic distribution is given by:


 * $$f(x; \mu,s) = \frac{e^{-(x-\mu)/s}} {s\left(1+e^{-(x-\mu)/s}\right)^2} \!$$
 * $$=\frac{1}{4\,s} \;\operatorname{sech}^2\!\left(\frac{x-\mu}{2\,s}\right).$$

Because the pdf can be expressed in terms of the square of the hyperbolic secant function "sech", it is sometimes referred to as the sech-square(d) distribution.


 * See also: hyperbolic secant distribution
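The two pdf forms can likewise be verified against each other numerically. A small Python sketch (helper names are our own):

```python
import math

def logistic_pdf(x, mu=0.0, s=1.0):
    """Logistic pdf f(x; mu, s) in its exponential form."""
    z = math.exp(-(x - mu) / s)
    return z / (s * (1.0 + z) ** 2)

def logistic_pdf_sech(x, mu=0.0, s=1.0):
    """The equivalent sech-square form: sech^2((x - mu) / (2s)) / (4s)."""
    return (1.0 / math.cosh((x - mu) / (2.0 * s))) ** 2 / (4.0 * s)

# Both forms peak at x = mu with value 1 / (4s).
print(logistic_pdf(0.0))       # 0.25 for mu=0, s=1
print(logistic_pdf_sech(0.0))  # 0.25 for mu=0, s=1
```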

Quantile function
The inverse cumulative distribution function of the logistic distribution is $$F^{-1}$$, a generalization of the logit function, defined as follows:


 * $$F^{-1}(p; \mu,s) = \mu + s\,\ln\left(\frac{p}{1-p}\right).$$
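Because the quantile function has this simple closed form, logistic variates are easy to generate by inverse-transform sampling: push a uniform draw through F⁻¹. A Python sketch (function names are illustrative):

```python
import math
import random

def logistic_quantile(p, mu=0.0, s=1.0):
    """Inverse cdf: mu + s * ln(p / (1 - p)), a shifted and scaled logit."""
    return mu + s * math.log(p / (1.0 - p))

def sample_logistic(mu=0.0, s=1.0, rng=random):
    """Inverse-transform sampling from Logistic(mu, s)."""
    return logistic_quantile(rng.random(), mu, s)

# The median equals the location parameter mu:
print(logistic_quantile(0.5, mu=3.0, s=2.0))  # 3.0
```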

Alternative parameterization
An alternative parameterization of the logistic distribution, in terms of the variance σ², can be derived using the substitution $$\sigma^2 = \pi^2\,s^2/3$$. This yields the following density function:


 * $$g(x;\mu,\sigma) = f(x;\mu,\sigma\sqrt{3}/\pi) = \frac{\pi}{4\sqrt{3}\,\sigma} \,\operatorname{sech}^2\!\left(\frac{\pi}{2 \sqrt{3}} \,\frac{x-\mu}{\sigma}\right).$$
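One can confirm numerically that this density really has variance σ². The following Python sketch (helper names are our own, using a crude rectangle-rule integral) checks E[(X − μ)²] for σ = 1.5:

```python
import math

def g(x, mu=0.0, sigma=1.0):
    """Logistic density parameterized by its standard deviation sigma."""
    c = math.pi / (2.0 * math.sqrt(3.0))
    amp = math.pi / (4.0 * math.sqrt(3.0) * sigma)
    return amp / math.cosh(c * (x - mu) / sigma) ** 2

# Rectangle-rule approximation of the second central moment over [-20, 20];
# the exponential tails make the truncation error negligible here.
sigma, dx = 1.5, 0.001
var = sum((i * dx) ** 2 * g(i * dx, 0.0, sigma) for i in range(-20000, 20001)) * dx
print(var)  # close to sigma**2 = 2.25
```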

Applications
The logistic distribution and the S-shaped pattern that results from it have been extensively used in many different areas, including:
 * Biology – to describe how species populations grow in competition
 * Epidemiology – to describe the spreading of epidemics
 * Psychology – to describe learning
 * Technology – to describe how new technologies diffuse and substitute for each other
 * Marketing – the diffusion of new-product sales
 * Energy – the diffusion and substitution of primary energy sources, as in the Hubbert curve
 * Hydrology - In hydrology, the distribution of long-duration river discharge and rainfall (e.g. monthly and yearly totals, consisting of the sum of 30 and 360 daily values, respectively) is often thought to be almost normal according to the central limit theorem. The normal distribution, however, requires numeric approximation of its cdf, whereas the logistic cdf has a closed form; since the two distributions are similar in shape, the logistic distribution can be used instead. The blue picture illustrates an example of fitting the logistic distribution to ranked October rainfalls - which are almost normally distributed - and shows the 90% confidence belt based on the binomial distribution. The rainfall data are represented by plotting positions as part of the cumulative frequency analysis.
 * Physics - the cdf of this distribution describes a Fermi gas and more specifically the number of electrons within a metal that can be expected to occupy a given quantum state. Its range is between 0 and 1, reflecting the Pauli exclusion principle. The value is given as a function of the kinetic energy corresponding to that state and is parametrized by the Fermi energy and also the temperature (and Boltzmann constant). By changing the sign in front of the "1" in the denominator, one goes from Fermi–Dirac statistics to Bose–Einstein statistics. In this case, the expected number of particles (bosons) in a given state can exceed unity, which is indeed the case for systems such as lasers.

Both the United States Chess Federation and FIDE have switched their formulas for calculating chess ratings from the normal distribution to the logistic distribution; see Elo rating system.

Related distributions

 * If $$X \sim \textrm{Logistic}(\mu,\beta)\, $$ then $$ k X + c \sim \textrm{Logistic}(k \mu + c,\,k \beta)\, $$ for $$k > 0$$
 * The logistic distribution closely resembles the hyperbolic secant distribution in shape, though the two are distinct families (sech² versus sech densities)
 * If $$X \sim U(0,1)\, $$ (Uniform distribution (continuous)) then $$ \mu + \beta \left(\log{(X)} - \log{(1-X)} \right) \sim \textrm{Logistic}(\mu,\beta)\, $$
 * If $$X \sim \mathrm{Exponential}(1)\,$$ (Exponential distribution) then $$\mu-\beta\log{\tfrac{e^{-X}}{1-e^{-X}}} \sim \mathrm{Logistic}(\mu,\beta) $$
 * If $$X \sim \mathrm{Exponential}(1)\,$$ and $$Y \sim \mathrm{Exponential}(1)\,$$ then $$\mu-\beta\log{\tfrac{X}{Y}} \sim \mathrm{Logistic}(\mu,\beta) $$
 * If $$X \sim \mathrm{Gumbel}(\alpha,\beta)\,$$ and $$Y \sim \mathrm{Gumbel}(\alpha,\beta)\,$$ (Gumbel distribution) then $$X-Y \sim \mathrm{Logistic}(0,\beta) \,$$
 * If $$X \sim \mathrm{GEV}(\alpha,\beta,0)\,$$ and $$Y \sim \mathrm{GEV}(\alpha,\beta,0)\,$$ (Generalized extreme value distribution) then $$X-Y \sim \mathrm{Logistic}(0,\beta) \,$$
 * If $$X \sim \mathrm{Gumbel}(\alpha,\beta)\,$$ and $$Y \sim \mathrm{GEV}(\alpha,\beta,0)\,$$ then $$X+Y \sim \mathrm{Logistic}(2 \alpha,\beta) \,$$
 * If $$\log{X} \sim \textrm{Logistic}\, $$ then $$ X \sim \textrm{LogLogistic}\, $$ (log-logistic distribution) and $$ X-a \sim \textrm{ShiftedLogLogistic}\, $$ (shifted log-logistic distribution)
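The Gumbel-difference relation above is easy to check by simulation. A Python sketch (helper names are our own; Gumbel draws generated by inverse-transform sampling):

```python
import math
import random

random.seed(42)

def gumbel(alpha=0.0, beta=1.0):
    """Sample Gumbel(alpha, beta) via its inverse cdf: alpha - beta*ln(-ln U)."""
    u = random.random()
    return alpha - beta * math.log(-math.log(u))

# The difference of two i.i.d. Gumbel(alpha, beta) draws is Logistic(0, beta).
beta = 2.0
diffs = [gumbel(5.0, beta) - gumbel(5.0, beta) for _ in range(100_000)]

mean = sum(diffs) / len(diffs)
var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
# Logistic(0, beta) has mean 0 and variance pi^2 * beta^2 / 3.
print(mean, var)
```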

Higher order moments
The n-th order central moment can be expressed in terms of the quantile function:
 * $$\begin{align} \operatorname{E}[(X-\mu)^n] &= \int_{-\infty}^\infty (x-\mu)^n \, dF(x) = \int_0^1 \big(F^{-1}(p)-\mu\big)^n \, dp \\ &= s^n \int_0^1 \Big[ \ln\!\Big(\frac{p}{1-p}\Big) \Big]^n \, dp. \end{align}$$

This integral is well known and can be expressed in terms of Bernoulli numbers:

 * $$\operatorname{E}[(X-\mu)^n] = s^n\,\pi^n\,(2^n-2)\,|B_n|.$$

In particular, all odd central moments are zero, consistent with the symmetry of the distribution.
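As a sanity check, the closed form can be evaluated against the known second and fourth central moments, $$\pi^2 s^2/3$$ and $$7\pi^4 s^4/15$$. A Python sketch (helper names are our own), computing Bernoulli numbers with the Akiyama–Tanigawa algorithm:

```python
import math
from fractions import Fraction

def bernoulli(n):
    """B_n via the Akiyama-Tanigawa algorithm (B_1 = +1/2 in this convention)."""
    a = [Fraction(0)] * (n + 1)
    for m in range(n + 1):
        a[m] = Fraction(1, m + 1)
        for j in range(m, 0, -1):
            a[j - 1] = j * (a[j - 1] - a[j])
    return a[0]

def central_moment(n, s=1.0):
    """n-th central moment of Logistic(mu, s): s^n pi^n (2^n - 2) |B_n|."""
    return s ** n * math.pi ** n * (2 ** n - 2) * abs(float(bernoulli(n)))

print(central_moment(2))  # pi^2 / 3, the variance for s = 1
print(central_moment(4))  # 7 pi^4 / 15
```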