Cauchy-Schwarz inequality

In mathematics, the Cauchy–Schwarz inequality, also known as the Schwarz inequality, the Cauchy inequality, or the Cauchy–Bunyakovsky–Schwarz inequality (named after Augustin-Louis Cauchy, Viktor Yakovlevich Bunyakovsky, and Hermann Amandus Schwarz), is a useful inequality encountered in many different settings: in linear algebra, applied to vectors; in analysis, applied to infinite series and to integrals of products; and in probability theory, applied to variances and covariances.

Statement of the inequality
The statement of the Cauchy–Schwarz inequality is:

For all vectors $$x$$ and $$y$$ of a real or complex inner product space $$V$$ the following inequality holds:


 * $$|\langle x,y\rangle|^2 \leq \langle x,x\rangle \cdot \langle y,y\rangle.$$

or, equivalently, taking the square root of both sides and expressing it in terms of the norms of the vectors:


 * $$ |\langle x,y\rangle| \leq \|x\| \cdot \|y\|.\, $$

Moreover, the two sides are equal if and only if $$x$$ and $$y$$ are linearly dependent (or, in a geometrical sense, they are parallel).

If $$x_1,\cdots, x_n\in\mathbb C$$ and $$y_1,\cdots, y_n\in\mathbb C$$ are the components of $$x$$ and $$y$$ with respect to an orthonormal basis of $$V$$, the inequality may be restated in a more explicit way:


 * $$|\overline{x_1} y_1 + \cdots + \overline{x_n} y_n|^2 \leq (|x_1|^2 + \cdots + |x_n|^2) (|y_1|^2 + \cdots + |y_n|^2).$$

Equality holds if and only if the vectors are proportional; in terms of components (when no $$y_i$$ is zero), this means:


 * $$\frac {x_1}{y_1} = \frac {x_2}{y_2} = \cdots = \frac {x_n}{y_n}.$$
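As a quick numeric sanity check of the componentwise form (the vectors below are arbitrary examples, not taken from the text), one can verify both the inequality and the equality case for linearly dependent vectors:

```python
# Sanity check of the componentwise Cauchy-Schwarz inequality for
# complex vectors; the example vectors are arbitrary.

def herm_inner(x, y):
    """Hermitian inner product <x, y> = sum conj(x_i) * y_i."""
    return sum(xi.conjugate() * yi for xi, yi in zip(x, y))

x = [1 + 2j, 3 - 1j, 0.5j]
y = [2 - 1j, 1 + 1j, 4.0 + 0j]

lhs = abs(herm_inner(x, y)) ** 2
rhs = herm_inner(x, x).real * herm_inner(y, y).real
assert lhs <= rhs

# Equality when z is a scalar multiple of x (linear dependence):
z = [(2 + 1j) * xi for xi in x]
lhs_eq = abs(herm_inner(x, z)) ** 2
rhs_eq = herm_inner(x, x).real * herm_inner(z, z).real
assert abs(lhs_eq - rhs_eq) < 1e-9 * rhs_eq
```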

The finite-dimensional case of this inequality for real vectors was proved by Cauchy in 1821, and in 1859 Cauchy's student V. Ya. Bunyakovsky noted that by taking limits one can obtain an integral form of Cauchy's inequality. The general result for an inner product space was obtained by K. H. A. Schwarz in 1885.

Proof
As the inequality is trivially true in the case $$y = 0$$, we may assume $$y$$ is nonzero. Let $$ \lambda$$ be a complex number. Then,


 * $$ 0 \leq \left\| x-\lambda y \right\|^2 = \langle x-\lambda y,x-\lambda y \rangle = \langle x,x \rangle - \bar{\lambda} \langle x,y \rangle - \lambda \langle y,x \rangle + |\lambda|^2 \langle y,y\rangle. $$

Choosing
 * $$ \lambda = \langle x,y \rangle \cdot \langle y,y \rangle^{-1}$$

we obtain


 * $$ 0 \leq \langle x,x \rangle - |\langle x,y \rangle|^2 \cdot \langle y,y \rangle^{-1}$$

which is true if and only if


 * $$ |\langle x,y \rangle|^2 \leq \langle x,x \rangle \cdot \langle y,y \rangle $$

or equivalently:


 * $$ \big| \langle x,y \rangle \big| \leq \left\|x\right\| \left\|y\right\|, $$

which is the Cauchy-Schwarz inequality.
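The key identity behind the proof can be checked numerically. A minimal sketch, assuming the convention used in the expansion above (the inner product is linear in its first argument), with arbitrary example vectors:

```python
# With lambda = <x,y>/<y,y>, the proof says ||x - lambda*y||^2 collapses
# to <x,x> - |<x,y>|^2 / <y,y>, which is nonnegative.

def inner(a, b):
    """<a, b> = sum a_i * conj(b_i) (linear in the first argument)."""
    return sum(ai * bi.conjugate() for ai, bi in zip(a, b))

x = [1 + 1j, 2 - 3j, -1 + 0.5j]
y = [0.5 - 1j, 1 + 2j, 3 + 0j]

lam = inner(x, y) / inner(y, y)            # <y,y> is real and positive
diff = [xi - lam * yi for xi, yi in zip(x, y)]

lhs = inner(diff, diff).real               # ||x - lam*y||^2 >= 0
rhs = inner(x, x).real - abs(inner(x, y)) ** 2 / inner(y, y).real

assert lhs >= 0
assert abs(lhs - rhs) < 1e-9
```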

Rn
In Euclidean space Rn with the standard inner product, the Cauchy-Schwarz inequality is


 * $$\left(\sum_{i=1}^n x_i y_i\right)^2\leq \left(\sum_{i=1}^n x_i^2\right) \left(\sum_{i=1}^n y_i^2\right).$$

In this special case, an alternative proof is as follows. Assume not all $$x_i$$ vanish (otherwise the inequality is trivial), and consider the quadratic polynomial in the real variable $$z$$


 * $$p(z) = (x_1 z + y_1)^2 + \cdots + (x_n z + y_n)^2.$$

Since $$p(z) \geq 0$$ for every real $$z$$ (it is a sum of squares), $$p$$ has at most one real root, so its discriminant is not greater than zero; it is zero exactly when $$p$$ has a double root, which happens precisely when all the ratios $$x_i/y_i$$ are equal. Thus
 * $$\left(\sum ( x_i \cdot y_i ) \right)^2 - \sum {x_i^2} \cdot \sum {y_i^2} \le 0$$

which yields the Cauchy-Schwarz inequality.
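The discriminant argument can be verified directly; the example vectors here are arbitrary:

```python
# p(z) = sum (x_i z + y_i)^2 is a nonnegative quadratic A z^2 + B z + C,
# so its discriminant B^2 - 4AC is <= 0, which is Cauchy-Schwarz.

x = [1.0, -2.0, 3.0, 0.5]
y = [2.0, 1.0, -1.0, 4.0]

A = sum(xi * xi for xi in x)                    # coefficient of z^2
B = 2 * sum(xi * yi for xi, yi in zip(x, y))    # coefficient of z
C = sum(yi * yi for yi in y)                    # constant term

discriminant = B * B - 4 * A * C
assert discriminant <= 0

# Dividing by 4 recovers the stated form of the inequality:
assert sum(xi * yi for xi, yi in zip(x, y)) ** 2 <= A * C
```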

Also, when n = 2 or 3, the dot product is related to the angle between two vectors and one can immediately see the inequality:


 * $$|x \cdot y| = \|x\| \|y\| | \cos \theta | \le \|x\| \|y\|.$$

Furthermore, in this case the Cauchy–Schwarz inequality can also be deduced from Lagrange's identity. For n = 3, Lagrange's identity takes the form


 * $$\langle x,x\rangle \cdot \langle y,y\rangle = |\langle x,y\rangle|^2 + |x \times y|^2$$

from which the Cauchy–Schwarz inequality readily follows, since $$|x \times y|^2 \geq 0$$.

L2
For the inner product space of square-integrable complex-valued functions, one has


 * $$\left|\int f(x) \overline{g}(x)\,dx\right|^2\leq\int \left|f(x)\right|^2\,dx \cdot \int\left|g(x)\right|^2\,dx.$$

A generalization of this is the Hölder inequality.
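The integral form above can be illustrated numerically by approximating the integrals on [0, 1] with midpoint Riemann sums; the functions f and g below are arbitrary examples:

```python
import math

# |∫ f(x) conj(g(x)) dx|^2  <=  ∫|f|^2 dx · ∫|g|^2 dx  on [0, 1],
# with the integrals approximated by midpoint Riemann sums.

def f(t):
    return complex(math.sin(3 * t), t)          # f(t) = sin(3t) + i*t

def g(t):
    return complex(math.exp(-t), math.cos(t))   # g(t) = e^{-t} + i*cos(t)

N = 10_000
h = 1.0 / N
ts = [(k + 0.5) * h for k in range(N)]          # midpoint rule nodes

fg = sum(f(t) * g(t).conjugate() for t in ts) * h
ff = sum(abs(f(t)) ** 2 for t in ts) * h
gg = sum(abs(g(t)) ** 2 for t in ts) * h

assert abs(fg) ** 2 <= ff * gg
```

The discrete sums themselves satisfy the finite-dimensional inequality exactly, so the check is robust to the quadrature error.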

Usage
The triangle inequality for the inner product is often shown as a consequence of the Cauchy–Schwarz inequality, as follows: given vectors x and y,


 * $$\|x + y\|^2 = \langle x + y, x + y \rangle$$
 * $$= \|x\|^2 + \langle x, y \rangle + \langle y, x \rangle + \|y\|^2$$
 * $$\le \|x\|^2 + 2|\langle x, y \rangle| + \|y\|^2$$
 * $$\le \|x\|^2 + 2\|x\|\|y\| + \|y\|^2$$
 * $$= \left(\|x\| + \|y\|\right)^2.$$

Taking the square roots gives the triangle inequality.
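A quick numeric check of the resulting triangle inequality for real vectors with the standard inner product (arbitrary example data):

```python
import math

# ||x + y|| <= ||x|| + ||y|| for the Euclidean norm.

def norm(v):
    return math.sqrt(sum(vi * vi for vi in v))

x = [1.0, -2.0, 2.0]
y = [4.0, 0.0, -3.0]

xy = [xi + yi for xi, yi in zip(x, y)]
assert norm(xy) <= norm(x) + norm(y)
```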

The Cauchy–Schwarz inequality makes it possible to extend the notion of the "angle between two vectors" to any real inner product space, by defining:


 * $$\cos\theta_{xy}=\frac{\langle x,y\rangle}{\|x\| \|y\|}.$$

The Cauchy–Schwarz inequality proves that this definition is sensible, by showing that the right hand side lies in the interval $$[-1,1]$$, and justifies the notion that real inner product spaces are simply generalizations of the Euclidean space.
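The following sketch checks that the quotient indeed lies in [-1, 1], so the angle is well defined; the vectors are arbitrary examples:

```python
import math

# Cauchy-Schwarz guarantees <x,y> / (||x|| ||y||) lies in [-1, 1]
# for real vectors, so acos can be applied to define an angle.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

x = [1.0, 2.0, 2.0]
y = [3.0, 0.0, 4.0]

cos_theta = dot(x, y) / (math.sqrt(dot(x, x)) * math.sqrt(dot(y, y)))
assert -1.0 <= cos_theta <= 1.0

theta = math.acos(cos_theta)  # well defined precisely because of the bound
```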

The Cauchy–Schwarz inequality is used to prove that the inner product is a continuous function with respect to the topology induced by the inner product itself.

The Cauchy–Schwarz inequality is usually used to show Bessel's inequality.

The general formulation of the Heisenberg uncertainty principle is derived using the Cauchy-Schwarz inequality in the inner product space of physical wave functions.

Generalizations
Various generalizations of the Cauchy–Schwarz inequality exist in the context of operator theory, e.g. for operator-convex functions and operator algebras, where the domain and/or range of φ is replaced by a C*-algebra or W*-algebra.

This section lists a few such inequalities from the operator-algebra setting, to give a flavor of results of this type.

Positive functionals on C*- and W*-algebras
One can discuss inner products as positive functionals. Given a Hilbert space L2(m), m being a finite measure, the inner product ⟨·, ·⟩ gives rise to a positive functional φ by


 * $$\phi (g) = \langle g, 1 \rangle.$$

Since φ(f*f) = ⟨f, f⟩ ≥ 0 for all f in L2(m), where f* is the pointwise conjugate of f, the functional φ is positive. Conversely, every positive functional φ gives a corresponding inner product ⟨f, g⟩_φ = φ(g*f). In this language, the Cauchy–Schwarz inequality becomes


 * $$| \phi(g^*f) |^2 \leq \phi(f^*f) \phi(g^*g) ,$$

which extends verbatim to positive functionals on C*-algebras.
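This form can be illustrated in the concrete C*-algebra of 2 × 2 complex matrices, taking the (positive) trace functional φ(a) = tr(a); the matrices f and g below are arbitrary examples:

```python
# |phi(g*f)|^2 <= phi(f*f) phi(g*g) for phi = trace on M_2(C),
# where a* denotes the conjugate transpose.

def adj(m):
    """Conjugate transpose of a 2x2 matrix."""
    return [[m[0][0].conjugate(), m[1][0].conjugate()],
            [m[0][1].conjugate(), m[1][1].conjugate()]]

def mul(a, b):
    """2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def tr(m):
    return m[0][0] + m[1][1]

f = [[1 + 1j, 2], [0, 3 - 1j]]
g = [[2, -1j], [1 + 2j, 0.5]]

lhs = abs(tr(mul(adj(g), f))) ** 2
rhs = tr(mul(adj(f), f)).real * tr(mul(adj(g), g)).real
assert lhs <= rhs
```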

We now give an operator theoretic proof of the Cauchy–Schwarz inequality which passes to the C*-algebra setting. One can see from the proof that the Cauchy–Schwarz inequality is a consequence of the positivity and conjugate-symmetry axioms of the inner product.

Consider the positive matrix


 * $$M = \begin{bmatrix} f^*\\ g^* \end{bmatrix} \begin{bmatrix} f & g \end{bmatrix} = \begin{bmatrix} f^*f & f^* g \\ g^*f & g^*g \end{bmatrix}. $$

Since φ is a positive linear map whose range, the complex numbers C, is a commutative C*-algebra, φ is completely positive. Therefore


 * $$M' = (I_2 \otimes \phi)(M) = \begin{bmatrix} \phi(f^*f) & \phi(f^* g) \\ \phi(g^*f) & \phi(g^*g) \end{bmatrix} $$

is a positive 2 × 2 scalar matrix, which implies it has nonnegative determinant:


 * $$\phi(f^*f) \phi(g^*g) - | \phi(g^*f) |^2 \geq 0 \quad \mbox{i.e.} \quad \phi(f^*f) \phi(g^*g) \geq | \phi(g^*f) |^2. $$

This is precisely the Cauchy-Schwarz inequality. If f and g are elements of a C*-algebra, f* and g* denote their respective adjoints.

We can also deduce from above that every positive linear functional is bounded, corresponding to the fact that the inner product is jointly continuous.

Positive maps
Positive functionals are special cases of positive maps. A linear map Φ between C*-algebras is said to be a positive map if a ≥ 0 implies Φ(a) ≥ 0. It is natural to ask whether inequalities of Schwarz type exist for positive maps. In this more general setting, additional assumptions are usually needed to obtain such results.

Kadison's inequality
One such inequality is the following:

Theorem If Φ is a unital positive map, then for every normal element a in its domain, we have Φ(a*a) ≥ Φ(a*)Φ(a) and Φ(a*a) ≥ Φ(a)Φ(a*).

This extends the fact φ(a*a) · 1 ≥ φ(a)*φ(a) = |φ(a)|², when φ is a linear functional.

The case when a is self-adjoint, i.e. a = a*, is known as Kadison's inequality.

2-positive maps
When Φ is 2-positive, a stronger assumption than mere positivity, one has something that looks very similar to the original Cauchy–Schwarz inequality:

Theorem (Modified Schwarz inequality for 2-positive maps) For a 2-positive map Φ between C*-algebras, for all a, b in its domain,


 * i) Φ(a)*Φ(a) ≤ ||Φ(1)|| Φ(a*a).


 * ii) ||Φ(a*b)||² ≤ ||Φ(a*a)|| · ||Φ(b*b)||.

A simple argument for ii) is as follows. Consider the positive matrix



 * $$M= \begin{bmatrix} a^* & 0 \\ b^* & 0 \end{bmatrix} \begin{bmatrix} a & b \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} a^*a & a^* b \\ b^*a & b^*b \end{bmatrix}. $$

By 2-positivity of Φ,



 * $$(I_2 \otimes \Phi) M = \begin{bmatrix} \Phi(a^*a) & \Phi(a^* b) \\ \Phi(b^*a) & \Phi(b^*b) \end{bmatrix} $$

is positive. The desired inequality then follows from the properties of positive 2 × 2 (operator) matrices.

Part i) is analogous. One can replace the matrix $$\begin{bmatrix} a & b \\ 0 & 0 \end{bmatrix}$$ by  $$\begin{bmatrix} 1 & a \\ 0 & 0 \end{bmatrix}.$$