
In mathematical physics, especially as introduced into statistical mechanics and thermodynamics by J. Willard Gibbs in 1878, an **ensemble** (also **statistical ensemble** or **thermodynamic ensemble**) is an idealization consisting of a large number of mental copies (possibly infinitely many) of a system, considered all at once, each of which represents a possible state that the real system might be in.

The ensemble formalises the notion that a physicist can imagine repeating an experiment again and again under the same macroscopic conditions, but, unable to control the microscopic details, may expect to observe a range of different outcomes.

When an ensemble has an infinite number of members, it can be seen as defining a probability measure on the state space (phase space) of the system. Even though the dynamics of the real single system (for example, a complete gas of molecules, or a complete stock market) may be incalculably complex, or stochastic, or even discontinuous, the *average* (statistical) properties of the ensemble of possibilities as a whole may remain well defined, smoothly evolving, or, for systems at macroscopic equilibrium, even stationary.

The word **ensemble** is also sometimes used for smaller sets of possibilities, sampled from the full set of possible states. Thus for example, an ensemble of walkers in a Markov chain Monte Carlo iteration; or an ensemble forecast in meteorology, where a whole ensemble of possible initial states is projected forwards, to try to give an idea of the spread of possible forecast outcomes; or climate ensembles, where the space of macroscopically possible perturbations of the model physics is also sampled.
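The "ensemble of walkers" usage can be illustrated with a toy Metropolis sampler, a minimal sketch in which the target distribution, step size, and walker count are all illustrative assumptions, not from the source:

```python
import math
import random

def metropolis_step(x, log_p, step=0.5):
    """One Metropolis update for a single walker sampling exp(log_p)."""
    proposal = x + random.uniform(-step, step)
    accept_prob = math.exp(min(0.0, log_p(proposal) - log_p(x)))
    return proposal if random.random() < accept_prob else x

# Target distribution: standard normal, log p(x) = -x^2/2 up to a constant.
log_p = lambda x: -0.5 * x * x

# The "ensemble": many walkers, all advanced through the same dynamics.
random.seed(0)
walkers = [random.uniform(-3.0, 3.0) for _ in range(500)]
for _ in range(2000):
    walkers = [metropolis_step(x, log_p) for x in walkers]

mean = sum(walkers) / len(walkers)
var = sum((x - mean) ** 2 for x in walkers) / len(walkers)
print(round(mean, 2), round(var, 2))  # mean near 0, variance near 1
```

After equilibration, the spread of walker positions approximates the target distribution, which is exactly the "sampled set of possibilities" sense of the word used above.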

The notional size of the mental ensembles in thermodynamics, statistical mechanics and quantum statistical mechanics can be very large indeed, to include every possible microscopic state the system could be in, consistent with its observed macroscopic properties. But for important physical cases it can be possible, by clever mathematical manipulations, to calculate averages directly over the whole of the thermodynamic ensemble, to obtain explicit formulas for many of the thermodynamic quantities of interest, often in terms of the partition function *Z*, which encodes the underlying physical structure of the system. Some of these results are presented in the article Statistical mechanics.

If the system states are *perfectly mixing* (ergodic), and the ensemble is large and corresponds to a probability measure which is *invariant* under this dynamics, then the time-average of a quantity taken over a sufficiently long time for a single real evolving system *should* be well predicted by the ensemble average, averaged over the members of the ensemble as a whole -- i.e. the average value for an instant observation notionally repeated in many different systems. This is known as the ergodic hypothesis. If not, one may infer that there is more macroscopically discoverable information available about the microscopic state of the system than one first thought, which may be usable to construct a better-conditioned ensemble.
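The time-average versus ensemble-average comparison can be sketched numerically for a simple ergodic two-state Markov chain; the transition probabilities and observable below are arbitrary illustrative choices, not from the source:

```python
import random

# Ergodic two-state Markov chain; P[i][j] = probability of moving i -> j.
# The transition probabilities are arbitrary illustrative values.
P = [[0.7, 0.3],
     [0.4, 0.6]]
f = [0.0, 1.0]  # observable: indicator of being in state 1

# Invariant (stationary) distribution: for a 2x2 chain,
# pi_1 = P[0][1] / (P[0][1] + P[1][0]).
pi1 = P[0][1] / (P[0][1] + P[1][0])
ensemble_avg = (1.0 - pi1) * f[0] + pi1 * f[1]

# Time average over one long trajectory of a single system.
random.seed(1)
state, total, steps = 0, 0.0, 200_000
for _ in range(steps):
    state = 0 if random.random() < P[state][0] else 1
    total += f[state]
time_avg = total / steps

print(ensemble_avg, round(time_avg, 3))  # both should be close to 3/7
```

Because this chain is ergodic, the long-run time average of the observable converges to its average under the invariant distribution, which is the content of the ergodic hypothesis in this toy setting.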

The word *ensemble* is particularly used in thermodynamics; by some physicists working in Bayesian probability theory; and by mathematicians whose work in probability theory is heavily influenced by physicists, especially those working on random matrices. Most "pure" mathematicians working in probability theory do not use the term, preferring to use the terminology of probability spaces.

## Principal ensembles of statistical thermodynamics

Different macroscopic environmental constraints lead to different types of ensembles, with particular statistical characteristics. The following are the most important:

- Microcanonical ensemble -- an ensemble of systems, each of which is required to have the same fixed total energy (i.e. each system is thermally isolated).

- Canonical ensemble -- an ensemble of systems, each of which can share its energy with a large heat reservoir or heat bath (in effect, fixing the temperature).

- Grand canonical ensemble -- an ensemble of systems, each of which can share both its energy and its particles with a reservoir (i.e. an open system, at a given temperature and chemical potential).

The calculations which can be made over each of these ensembles are explored further in the article Statistical mechanics. The main result for each ensemble, however, is its characteristic state function:

Microcanonical: $ \Omega(U,V,N) = e^{\beta T S} $

Canonical: $ Z(T,V,N) = e^{-\beta A} $

Grand canonical: $ \Xi(T,V,\mu) = e^{\beta P V} $

Other thermodynamic ensembles can be also defined, corresponding to different physical requirements, for which analogous formulae can often similarly be derived.
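As a numerical check of the canonical relation above, here is a minimal sketch for a two-level system; the energy gap and temperature are illustrative assumptions, and units with k_B = 1 are used throughout:

```python
import math

# Two-level system with energies 0 and eps, coupled to a heat bath.
# Units with k_B = 1, so beta = 1/T; the values are illustrative.
eps, T = 1.0, 2.0
beta = 1.0 / T

# Canonical partition function: sum of Boltzmann factors over the states.
Z = math.exp(-beta * 0.0) + math.exp(-beta * eps)

# Helmholtz free energy A = -kT ln Z, so Z = exp(-beta * A) as stated above.
A = -T * math.log(Z)
assert abs(Z - math.exp(-beta * A)) < 1e-12

# Mean energy U as the ensemble average of the energy, and the
# entropy S recovered from A = U - T S.
U = eps * math.exp(-beta * eps) / Z
S = (U - A) / T
print(round(U, 4), round(S, 4))
```

The same pattern -- compute the characteristic state function, then differentiate or combine to get thermodynamic quantities -- carries over to the other ensembles.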

## Properties of "good" ensembles

- Representativeness

The probability density over the states in the ensemble should reflect the equilibrium probability density of the real system (the Gibbs state).

- Ergodicity

Time-averages of macroscopic quantities of interest will only have a chance of being the same as ensemble-averages, if the system evolving over time can actively explore close to all of the state space possibilities included in the ensemble (**ergodicity**). Otherwise the probability density over the ensemble will not be representative of the probability density of states in the time-evolution. (See ergodic hypothesis).

## Ensembles in quantum statistical mechanics

*See main article: Quantum statistical mechanics*

Putting aside for the moment the question of how statistical ensembles are generated operationally, we should be able to perform the following two operations on ensembles *A*, *B* of the same system:

- Test whether *A*, *B* are statistically equivalent.

- If *p* is a real number such that 0 < *p* < 1, then produce a new ensemble by probabilistic sampling from *A* with probability *p* and from *B* with probability 1 − *p*.

Under certain conditions, therefore, equivalence classes of statistical ensembles have the structure of a convex set. In quantum physics, a general model for this convex set is the set of density operators on a Hilbert space. Accordingly, there are two types of ensembles:

- *Pure ensembles* cannot be decomposed as a convex combination of different ensembles. In quantum mechanics, a pure density matrix is one of the form $ |\phi \rangle \langle \phi| $. Accordingly, a ray in a Hilbert space can be used to represent such an ensemble in quantum mechanics. A pure ensemble corresponds to having many copies of the same (up to a global phase) quantum state.

- *Mixed ensembles* are decomposable into a convex combination of different ensembles. In general, an infinite number of distinct decompositions will be possible.
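The pure/mixed distinction can be made concrete with density matrices for a two-level system: the purity Tr(ρ²) equals 1 for a pure ensemble and is strictly smaller for a mixed one. A minimal numpy sketch, with illustrative choices of state:

```python
import numpy as np

# Pure ensemble: every copy in the same state |phi> (two-level system).
phi = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho_pure = np.outer(phi, phi.conj())  # |phi><phi|

# Mixed ensemble: equal convex mixture of |0><0| and |1><1|.
rho_mixed = 0.5 * np.diag([1.0, 0.0]) + 0.5 * np.diag([0.0, 1.0])

def purity(rho):
    """Tr(rho^2): 1 for a pure density matrix, < 1 for a mixed one."""
    return float(np.trace(rho @ rho).real)

print(purity(rho_pure))   # ~1.0
print(purity(rho_mixed))  # 0.5
```

Note that `rho_mixed` here is also an equal mixture of |+⟩⟨+| and |−⟩⟨−|, illustrating that a mixed ensemble admits many distinct decompositions.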

## Operational interpretation Edit

Two objections to the above discussion of ensembles are:

- It is not clear where this *very large set of systems* exists (for example, is it a *gas* of particles inside a container?)

- It is not clear how to physically generate an ensemble.

In this section we attempt to partially answer these questions.

Suppose we have a *preparation procedure* for a system in a physics
lab: For example, the procedure might involve a physical apparatus and
some protocols for manipulating the apparatus. As a result of this preparation procedure some system
is produced and maintained in isolation for some small period of time.
By repeating this laboratory preparation procedure we obtain a
sequence of systems *X*_{1}, *X*_{2}, ..., *X*_{k}, which in our mathematical idealization we assume is an infinite sequence of systems. The systems are similar in that they were all produced in the same way. This infinite sequence is an ensemble.

In a laboratory setting, each one of these prepared systems might be used as input
for *one* subsequent *testing procedure*. Again, the testing procedure
involves a physical apparatus and some protocols; as a result of the
testing procedure we obtain a *yes* or *no* answer.
Given a testing procedure *E* applied to each prepared system, we obtain a sequence of values
Meas(*E*, *X*_{1}), Meas(*E*, *X*_{2}),
..., Meas(*E*, *X*_{k}). Each of these values is either 0 (no) or 1 (yes).

Assume the following time average exists:

- $ \sigma(E) = \lim_{N \rightarrow \infty} \frac{1}{N} \sum_{k=1}^N \operatorname{Meas}(E, X_k) $

For quantum mechanical systems, an important assumption made in the
quantum logic approach to quantum mechanics is the identification of *yes-no* questions with the
lattice of closed subspaces of a Hilbert space. With some additional
technical assumptions one can then infer that states are given by
density operators *S*, so that:

- $ \sigma(E) = \operatorname{Tr}(E S). $
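A minimal numerical sketch of this formula; the density operator *S* and the projector *E* below are illustrative choices, not from the source. The predicted value Tr(*E S*) matches the long-run frequency of yes answers when each trial says yes with that probability:

```python
import numpy as np

# A mixed qubit state as a density operator S (Hermitian, trace 1,
# positive semidefinite); the entries are illustrative.
S = np.array([[0.7, 0.2],
              [0.2, 0.3]])

# Yes-no question E: projector onto the closed subspace spanned by |0>.
E = np.array([[1.0, 0.0],
              [0.0, 0.0]])

sigma = np.trace(E @ S).real  # predicted long-run "yes" frequency

# Simulate the sequence Meas(E, X_k): each trial answers yes with
# probability Tr(E S), so the running mean converges to sigma.
rng = np.random.default_rng(0)
outcomes = rng.random(100_000) < sigma
print(sigma, outcomes.mean())  # both ≈ 0.7
```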

## Ergodicity

**Ergodicity** is the condition which guarantees that the average of a macroscopic quantity (such as the entropy or internal energy) over the members of the ensemble will be the same as the average over time for a single system (see ergodic hypothesis).

## See also

- Density matrix
- Partition function (statistical mechanics)
- Microcanonical ensemble
- Canonical ensemble
- Grand canonical ensemble
- Isothermal-isobaric ensemble
- Phase space
- Liouville's theorem (Hamiltonian)

This page uses Creative Commons Licensed content from Wikipedia.