Randomness



The word random is used to express apparent lack of purpose, cause, or order. The term randomness is often used synonymously with a number of measurable statistical properties, such as lack of bias or correlation.

Randomness has an important place in science and philosophy.

History
Humankind has been concerned with randomness since prehistoric times, mostly through divination (reading messages in random patterns) and gambling. The opposition between free will and determinism has been a divisive issue in philosophy and theology.

Despite the prevalence of gambling in all times and cultures, for a long time there was little Western inquiry into the subject, possibly due to the Church's disapproval of gambling and divination. Though Gerolamo Cardano and Galileo wrote about games of chance, it was the work of Blaise Pascal, Pierre de Fermat and Christiaan Huygens that led to what is today known as probability theory.

Mathematicians focused at first on statistical randomness and considered block frequencies (that is, not only the frequencies of occurrences of individual elements, but also those of blocks of arbitrary length) as the measure of randomness, an approach that extended into the use of information entropy in information theory.
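
The block-frequency idea can be sketched in a few lines: a minimal (and far from rigorous) check that counts how often each block of a given length occurs in a bit string. The function name here is illustrative, and the sample string is chosen only to show that single-bit balance does not imply block balance:

```python
from collections import Counter

def block_frequencies(bits, block_len):
    """Count overlapping blocks of length block_len in a bit string."""
    return Counter(bits[i:i + block_len]
                   for i in range(len(bits) - block_len + 1))

# In a statistically random string, all 2**block_len blocks of a given
# length should appear with roughly equal frequency.
sample = "0110100110010110"  # balanced in single bits, biased in pairs
print(block_frequencies(sample, 1))  # single-bit frequencies: equal
print(block_frequencies(sample, 2))  # pair frequencies: unequal
```

This illustrates why block frequencies are a stronger criterion than element frequencies alone: the sample has exactly eight 0s and eight 1s, yet its two-bit blocks are visibly skewed.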

In the 1960s Gregory Chaitin, Andrey Kolmogorov and Ray Solomonoff introduced the notion of algorithmic randomness, in which the randomness of a sequence is measured by its incompressibility: a random sequence admits no description shorter than itself.

Randomness versus unpredictability
Randomness should not be confused with practical unpredictability, which is a related idea in ordinary usage. Some mathematical systems, for example, may appear random even though they are in fact deterministic; their unpredictability in practice stems from sensitive dependence on initial conditions (see chaos theory). Many random phenomena may exhibit organized features at some levels. For example, while the average rate of increase in the human population is quite predictable, in the short term the actual timing of individual births and deaths cannot be predicted. This small-scale randomness is found in almost all real-world systems. Ohm's law and the kinetic theory of gases are statistically reliable descriptions of the 'sum' (i.e. the net result or integration) of vast numbers of individual micro-events, each of which is random and none of which is individually predictable. (Theoretically, the micro-events of gases, for example, could be predicted if the exact position, velocity, atomic composition, angular momentum, and so on of each particle were known.) All we directly perceive is circuit noise and some bulk gas behaviors.

It is important to note that chaotic systems are unpredictable in practice only because of their extreme dependence on initial conditions. Whether they are also unpredictable in the sense of computability theory is a subject of current research; in some branches of computability theory, the notion of randomness is identified with computational unpredictability.
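
Sensitive dependence on initial conditions can be illustrated with the logistic map, a standard example from chaos theory. The map is fully deterministic, yet two nearly identical starting points diverge until their trajectories are effectively unrelated (a sketch; the starting values and perturbation size are arbitrary):

```python
def logistic(x, r=4.0):
    """One step of the logistic map; deterministic, chaotic at r = 4."""
    return r * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-10   # two almost identical initial conditions
max_gap = 0.0
for _ in range(50):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

# The tiny initial gap (1e-10) is amplified exponentially at each step,
# so within a few dozen iterations the trajectories separate widely.
print(max_gap)
```

Both trajectories remain confined to the interval [0, 1], so the system is perfectly lawful in the aggregate even though individual trajectories are practically unpredictable.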

Unpredictability is required in some applications, such as the many uses of random numbers in cryptography. In other applications (e.g. modeling or simulation) statistical randomness is essential, but predictability is also required (for instance, when repeatedly running simulations or acceptance tests, it can be useful to be able to rerun the model with the exact same random input several times).
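
The reproducibility requirement is typically met with a seeded pseudo-random generator, sketched here with Python's random module; the function name and seed values are illustrative:

```python
import random

def simulate(seed, n=5):
    """Run a toy 'simulation' that consumes n random draws."""
    rng = random.Random(seed)   # independent generator with a fixed seed
    return [rng.random() for _ in range(n)]

run1 = simulate(seed=42)
run2 = simulate(seed=42)   # same seed: identical "random" input
run3 = simulate(seed=43)   # different seed: different input

print(run1 == run2)  # True: the run is exactly repeatable
print(run1 == run3)  # False: a new seed gives a fresh stream
```

Note that this very repeatability is what makes such generators unsuitable where cryptographic unpredictability is needed.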

Sensibly dealing with randomness is a hard problem in modern science, mathematics, psychology and philosophy. Merely defining it adequately for the purposes of one discipline has proven quite difficult. Distinguishing between apparent randomness and actual randomness has been no easier. In addition, assuring unpredictability, especially against a well-motivated party (in cryptographic parlance, the "adversary"), has been harder still.

Some philosophers have argued that there is no randomness in the universe, only unpredictability. Others find the distinction meaningless (see determinism for more information).

Misconceptions/logical fallacies
Popular perceptions of randomness are frequently wrong, based on logical fallacies. Following is an attempt to identify the source of such fallacies and correct the logical errors. For a more detailed discussion, see Gambler's Fallacy.

A number is "due"
This argument says that "since all numbers will eventually come up in a random selection, those that have not come up yet are 'due' and thus more likely to come up soon". This logic is correct only when applied to a system where numbers that come up are removed from the system, such as when playing cards are drawn and not returned to the deck. It is true, for example, that once a jack is removed from the deck, the next draw is less likely to be a jack and more likely to be some other card. However, if the jack is returned to the deck, and the deck is thoroughly reshuffled, there is an equal chance of drawing a jack or any other card the next time. The same truth applies to any other case where objects are selected independently and nothing is removed from the system after each event, such as a die roll, coin toss or most lottery number selection schemes.
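
The difference between the two situations can be computed exactly rather than simulated. The following sketch uses a standard 52-card deck with four jacks, and exact rational arithmetic so the probabilities are not obscured by rounding:

```python
from fractions import Fraction

# A standard 52-card deck contains 4 jacks.
deck_size, jacks = 52, 4

# Probability the next card drawn is a jack, before any draw:
p_before = Fraction(jacks, deck_size)          # 4/52 = 1/13

# Without replacement: one jack already removed, deck one card smaller.
p_without = Fraction(jacks - 1, deck_size - 1)  # 3/51 = 1/17

# With replacement (jack returned, deck reshuffled): unchanged.
p_with = Fraction(jacks, deck_size)             # still 1/13

print(p_before, p_without, p_with)
```

The fallacy, then, is applying the "removal" reasoning to independent events such as die rolls, where nothing leaves the system between trials.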

A number is "cursed"
This argument is almost the reverse of the above, and says that numbers which have come up less often in the past will continue to come up less often in the future. A similar "number is 'blessed'" argument might be made saying that numbers which have come up more often in the past are likely to do so in the future. This logic is only valid if the roll is somehow biased and results don't have equal probabilities - for example, with weighted dice. If we know for certain that the roll is fair, then previous events have no influence over future events.

Note that in nature, unexpected or uncertain events rarely occur with perfectly equal frequencies, so learning which events are likely to have higher probability by observing outcomes makes sense. What is fallacious is to apply this logic to systems which are specially designed so that all outcomes are equally likely - such as dice, roulette wheels, and so on.
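
For events that may genuinely be biased, observed frequencies do carry information. A minimal sketch of frequency estimation follows; the roll data are made up purely for illustration:

```python
from collections import Counter

# Hypothetical observed rolls of a suspect die.
rolls = [6, 6, 3, 6, 2, 6, 6, 1, 6, 5, 6, 6, 4, 6, 6]
counts = Counter(rolls)
n = len(rolls)

# Empirical frequency of each face; a fair die would give about 1/6 each.
for face in range(1, 7):
    print(face, counts[face] / n)
```

With so few rolls this is only suggestive, not conclusive; a proper analysis would use a statistical test such as chi-squared. The point is that updating beliefs from outcomes is sound exactly when the underlying probabilities are unknown.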

Study of randomness
Many scientific fields are concerned with randomness:


 * Algorithmic probability
 * Chaos theory
 * Cryptography
 * Game theory
 * Information theory
 * Pattern recognition
 * Probability theory
 * Quantum mechanics
 * Statistics
 * Statistical mechanics

In philosophy
Note that the bias that "everything has a purpose or cause" is actually implicit in the expression "apparent lack of purpose or cause". Humans are always looking for patterns in their experience, and the most basic pattern seems to be cause and effect. This appears to be deeply embedded in the human brain, and perhaps in other animals as well. For example, dogs and cats have often been reported to make cause-and-effect connections that strike us as amusing or peculiar. (See classical conditioning.) For instance, there is a report of a dog who, after a visit to a vet whose clinic had tile floors of a particular kind, refused thereafter to go near any such tiled floor, whether or not it was at a vet's.

It is because of this bias that the absence of a cause seems problematic. See causality.

To solve this 'problem', random events are sometimes said to be caused by chance. Rather than solving the problem of randomness, this opens the gaping hole of defining chance. It is hard to avoid circularity by defining chance in terms of randomness.

Randomness is also quite relevant in discussions of free will, and of a first cause.

In biology
The characteristics of an organism are traditionally said to be due to genetics and environment, but there are also random elements. For example, consider the characteristic of freckles on a person's skin. A person's genetic inheritance controls their potential for developing freckles (in this case via a gene linked to the gene for red hair). Their environment, such as solar exposure, determines how many of these potential freckles actually appear. The location of each individual freckle, however, can be predicted neither from genetics nor from solar exposure, and so appears to be due to a random element. Whether this is truly random, or merely follows a pattern too complex for us to understand, is not known.

Note that this effect isn't limited to physical characteristics. Sexual orientation, for example, also appears to have a random element. In identical twin studies, such twins are more likely to have the same sexual orientation than two randomly chosen individuals from the same population. If the twins are adopted and raised in separate environments, this correlation can be attributed solely to genetics; if they are raised in the same environment, it could reflect either genetics or environment. However, even identical twins raised in the same environment do not show a 100% correlation in sexual orientation. Where the two differ, the difference must be attributed to a random element. Again, we don't know whether this is truly random or merely follows a pattern too complex for us to understand.

In the natural sciences
Traditionally, randomness takes on an operational meaning in natural science: something is apparently random if its cause cannot be determined or controlled. When an experiment is performed and all the control variables are fixed, the remaining variation is ascribed to uncontrolled (i.e., 'random') influences. The assumption, again, is that if it were somehow possible to control all such influences perfectly, the result of the experiment would always be the same. Therefore, for most of the history of science, randomness has been interpreted in one way or another as ignorance on the part of the observer.

With the advent of quantum mechanics, however, it appears that the world might be irreducibly random. According to the standard interpretations of the theory, it is possible to set up an experiment in which all relevant parameters are totally controlled, yet which still has a perfectly random outcome. Minority resistance to this idea takes the form of hidden-variable theories, in which the outcome of the experiment is determined by certain unobservable characteristics (hence the name "hidden variables"). The debate is over whether truly random events exist, or whether events perceived as random are simply following patterns too complex for our cognitive abilities.

Many physical processes resulting from quantum-mechanical effects are, therefore, believed to be irreducibly random. The best-known example is the timing of radioactive decay events in radioactive substances.

Deviations from randomness are often regarded by parapsychologists as evidence for the theories of parapsychology.

Source of randomness
In his book A New Kind of Science, Stephen Wolfram describes three mechanisms responsible for (apparently) random behaviour in systems:


 1. Randomness coming from the environment (for example, Brownian motion, but also hardware random number generators).
 2. Randomness coming from the initial conditions. This aspect is studied by chaos theory, and is observed in systems whose behaviour is very sensitive to small variations in initial conditions (such as pachinko machines or dice).
 3. Randomness intrinsically generated by the system. This is also called pseudorandomness, and is the kind used in pseudo-random number generators. There are many algorithms (based on arithmetic or cellular automata) for generating pseudorandom numbers; the behaviour of the system is fully determined by the seed state and the algorithm used. This method is quicker than getting "true" randomness from the environment.

In practice, these sources of randomness often act together.
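
The first and third mechanisms can be contrasted in code: os.urandom draws on environmental entropy collected by the operating system, while a seeded generator is entirely determined by its seed and algorithm (a sketch; the seed value is arbitrary):

```python
import os
import random

# Environmental randomness: the OS entropy pool (hardware events,
# timing jitter, and similar sources).
env_bytes = os.urandom(8)

# Intrinsic pseudorandomness: fully determined by the seed.
rng_a = random.Random(12345)
rng_b = random.Random(12345)
pseudo_a = [rng_a.randrange(256) for _ in range(8)]
pseudo_b = [rng_b.randrange(256) for _ in range(8)]

print(pseudo_a == pseudo_b)  # True: same seed, same stream
print(len(env_bytes))        # 8 bytes; a fresh call would almost
                             # certainly yield different bytes
```

Hybrid designs are common in practice: an environmental source seeds a fast pseudo-random generator, combining unpredictability with speed.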

In mathematics
The mathematical theory of probability arose from attempts to formulate mathematical descriptions of chance events, originally in the context of gambling but soon in connection with situations of interest in physics. Statistics is used to infer the underlying probability distribution of a collection of empirical observations. For the purposes of simulation it is necessary to have a large supply of random numbers, or means to generate them on demand.

Algorithmic information theory studies, among other topics, what constitutes a random sequence. The central idea is that a string of bits is random if and only if it is shorter than any computer program that can produce that string (Chaitin-Kolmogorov randomness) - this basically means that random strings are those that cannot be compressed. Pioneers of this field include Andrey Kolmogorov, Ray Solomonoff, Gregory Chaitin, Per Martin-Löf, and others.
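
The incompressibility idea can be illustrated, very roughly, with an ordinary compressor such as zlib. A general-purpose compressor is only a crude stand-in for Kolmogorov complexity, but the contrast between patterned and high-entropy data is plain (a sketch):

```python
import os
import zlib

regular = b"01" * 500          # highly patterned 1000-byte string
irregular = os.urandom(1000)   # 1000 high-entropy bytes from the OS

# A patterned string compresses far below its original length;
# high-entropy data does not compress (it may even grow slightly,
# because of the compressor's framing overhead).
print(len(zlib.compress(regular)))
print(len(zlib.compress(irregular)))
```

Kolmogorov complexity itself is uncomputable, so no real program can certify a string as random; a compressor can only ever demonstrate non-randomness by finding a shorter description.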

In communication theory
Successful communication in the real world depends, at the limit, on understanding and successfully minimizing the deleterious effects of assorted interference sources, many of which are apparently random. Such noise imposes performance limits on any communications channel and it was the study of those limits which led Shannon to develop information theory, make fundamental contributions to communication theory, and establish a theoretical grounding for cryptography.

In finance
The random walk hypothesis holds that asset prices in an organized market evolve at random.

Applications and use of randomness
"Unpredictable" random numbers were first investigated in the context of gambling, and many randomizing devices, such as dice, shuffled playing cards, and roulette wheels, were first developed for such use. Fairly produced random numbers are vital to electronic gambling, and ways of creating them are sometimes regulated by governmental gaming commissions.

"Random" numbers are also used for non-gambling purposes, both where their use is mathematically important, such as sampling for opinion polls, and in situations where "fairness" is approximated by randomization, such as selecting jurors and military draft lotteries. Computational solutions for some types of problems use random numbers, such as in the Monte Carlo method and in genetic algorithms.
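
The Monte Carlo method mentioned above can be sketched with the classic estimate of pi: random points in the unit square fall inside the quarter circle with probability pi/4. The generator is seeded here so the run is repeatable; the function name is illustrative:

```python
import random

def estimate_pi(n, seed=0):
    """Estimate pi by sampling n random points in the unit square."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:   # point lies within the quarter circle
            inside += 1
    return 4.0 * inside / n

print(estimate_pi(100_000))  # approximately 3.14
```

The error of such an estimate shrinks only like 1/sqrt(n), which is why Monte Carlo methods shine for high-dimensional problems where grid-based methods become infeasible, rather than for computing pi.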

Generating randomness
The many applications of randomness have led to many different methods for generating random data. These methods may vary as to how unpredictable or statistically random they are, and how quickly they can generate random numbers.

Before the advent of computational random number generators, generating large amounts of sufficiently random numbers (important in statistics) required a lot of work. Results would sometimes be collected and distributed as random number tables.


 * See also: Randomization

Quotations

 * "God doesn't play dice with the universe." —Albert Einstein
 * "Random numbers should not be generated with a method chosen at random." —Donald E. Knuth
 * "The generation of random numbers is too important to be left to chance." —Robert R. Coveyou, Oak Ridge National Laboratory, 1969
 * "That which is static and repetitive is boring. That which is dynamic and random is confusing. In between lies art." —John A. Locke
 * "I laugh at the predictable and crack the pseudorandom." —Steven Roddis
 * "How dare we speak of the laws of chance? Is not chance the antithesis of all law?" —Joseph Bertrand, Calcul des probabilités, 1889

Books

 * Randomness by Deborah J. Bennett. Harvard University Press, 1998. ISBN 0674107454
 * The Art of Computer Programming. Vol. 2: Seminumerical Algorithms, 3rd ed. by Donald E. Knuth, Reading, MA: Addison-Wesley, 1997. ISBN 0-201-89684-2
 * Fooled by Randomness, 2nd Ed. by Nassim Nicholas Taleb. Thomson Texere, 2004. ISBN 158799190X