
Randomness

In ordinary language, the word random is used to express apparent lack of purpose or cause. This suggests that, whatever is causing something, its nature is not only unknown but the consequences of its operation are also unknown. In most technical senses, randomness has an additional positive meaning related to the statistical properties of what is observed. Thus, the landing location of water droplets from a waterfall is random in the ordinary sense: the precise forces acting on this or that droplet, and hence where it lands, are unknown. But in a statistical sense, and depending on the scale of observation, the landing spots are not randomly distributed at all: the droplets are confined to a relatively small area about the base of the fall, and within that area they have a distinctly non-uniform distribution. On the other hand, the instantaneous intensity of electrical circuit noise (eg, 'hiss') at any chosen frequency is random in the technical sense as well as in the ordinary one.
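
To make the distinction concrete, here is a minimal sketch in Python (modelling the landing spots as normally distributed is simply an assumption for illustration): each simulated droplet is unpredictable on its own, yet binning the positions shows a markedly non-uniform pattern.

    import random
    from collections import Counter

    # Assumed model: each droplet's landing spot is unpredictable individually,
    # but statistically the spots cluster about the base of the fall (mean 0).
    landings = [random.gauss(0.0, 1.0) for _ in range(10000)]

    # Bin the positions to the nearest metre; the counts are far from uniform.
    bins = Counter(round(x) for x in landings)
    for position in sorted(bins):
        print(f"{position:+d} m: {bins[position]} droplets")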

Table of contents
1 Randomness versus unpredictability
2 Randomness in philosophy
3 Randomness in natural science
4 Randomness in mathematics
5 Randomness in practical communications and cryptography
6 Randomness in gaming
7 Randomness in music

Randomness versus unpredictability

Randomness should not be confused with unpredictability, which is a related idea in ordinary usage but is distinct mathematically and, for many purposes, in physics. For instance, deterministic chaos concerns apparently random phenomena that nevertheless arise from fully deterministic rules and exhibit organized features at some levels. As another example, the increase of the world human population is quite predictable on average, yet individual births and deaths cannot be predicted with any precision; this small-scale randomness is found in almost all real-world systems, if not always as strikingly. Ohm's law and the kinetic theory of gases are statistically reliable descriptions of the 'sum' (ie, the net result or integration) of vast numbers of individual micro events, each of which is random and none of which is individually predictable. All we directly perceive is circuit noise and some bulk gas behaviors.
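
A minimal sketch of the deterministic-chaos point, in Python (the logistic map at r = 4 is a standard textbook example, not something taken from this article): the rule is completely deterministic, yet the resulting sequence looks random, and two nearly identical starting points soon diverge beyond any practical prediction.

    # Logistic map: x_{n+1} = r * x_n * (1 - x_n), fully deterministic.
    def logistic_orbit(x0, r=4.0, steps=30):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    # Starting points differing by one part in a billion diverge within ~30 steps,
    # so the behavior is unpredictable in practice despite containing no randomness.
    a = logistic_orbit(0.200000000)
    b = logistic_orbit(0.200000001)
    for n, (xa, xb) in enumerate(zip(a, b)):
        print(f"step {n:2d}: {xa:.6f} {xb:.6f} difference {abs(xa - xb):.2e}")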

In some applications, both randomness (as tested statistically) and unpredictability are required, as for instance in most uses of cryptography. In other applications, such as many modeling or simulation applications, unpredictability is not only unnecessary but may actually cause problems, for instance when repeating modeling runs during model 'acceptance tests'.
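
A minimal sketch of the repeatability point, using Python's standard random module (the simulate function is a made-up stand-in for a model run, not anything from this article): fixing the seed makes the pseudo-random inputs identical from run to run, which is exactly what an acceptance test needs.

    import random

    def simulate(rng, trials=5):
        # Stand-in for a model run driven by pseudo-random inputs.
        return [rng.random() for _ in range(trials)]

    # Seeding the generator makes every run reproducible.
    run1 = simulate(random.Random(42))
    run2 = simulate(random.Random(42))
    assert run1 == run2  # identical inputs, so outputs can be compared run to run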

Dealing sensibly with randomness is a genuinely hard problem in modern science, mathematics, psychology and philosophy. Merely defining it adequately for the purposes of this or that discipline has been quite difficult. Distinguishing between apparent randomness and actual randomness has been no easier, and additionally assuring unpredictability, especially against a well-motivated party (in cryptographic parlance, the "Adversary"), has been harder still.

Randomness in philosophy

Note that the bias that "everything has a purpose or cause" is actually implicit in the expression "apparent lack of purpose or cause". Humans are always looking for patterns in their experience, and the most basic pattern seems to be cause and effect. This disposition appears to be deeply embedded in the human brain, and perhaps in other animals as well. For example, dogs and cats have often been reported to make cause-and-effect connections that strike us as amusing or peculiar (see classical conditioning). For instance, there is a report of a dog who, after a visit to a vet whose clinic had tile floors of a particular kind, thereafter refused to go near any such tiled floor, whether or not it was at a vet's.

It is because of this bias that the absence of a cause seems problematic. See causation.

To solve this 'problem', random events are sometimes said to be caused by chance. Rather than solving the problem of randomness, this opens the gaping hole of the ontological status of chance. It is hard to avoid circularity by defining chance in terms of randomness.

Randomness in natural science

Traditionally, randomness takes on an operational meaning in natural science: something is apparently random if its cause cannot be determined or controlled. When an experiment is performed and all the control variables are fixed, the remaining variation is ascribed to uncontrolled (ie, 'random') influences. The assumption, again, is that if it were somehow possible to control all influences perfectly, the result of the experiment would always be the same. Therefore, for most of the history of science, randomness has been interpreted in one way or another as ignorance on the part of the observer.

With the advent of quantum mechanics, however, it appears that the world might be irreducibly random. According to the standard interpretations of the theory, it is possible (and in fact very, very easy) to set up an experiment with total control of all relevant parameters, which will still have a perfectly random outcome. The resistance to this idea takes the form of hidden variable theories in which the outcome of the experiment is determined by certain unobservable characteristics (hence the name "hidden variables").

Many physical processes resulting from quantum-mechanical effects are, therefore, believed to be irreducibly random. The best-known example is the timing of radioactive decay events in radioactive substances.
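
As a rough numeric illustration in Python (the half-life figure is assumed purely for the sketch): the waiting time until a given nucleus decays is conventionally modelled as exponentially distributed, so individual decay times can be simulated even though each one is unpredictable.

    import math
    import random

    half_life = 10.0                 # assumed half-life in seconds, for illustration only
    rate = math.log(2) / half_life   # decay constant lambda

    # Simulate the decay times of ten independent nuclei.
    for t in sorted(random.expovariate(rate) for _ in range(10)):
        print(f"decay after {t:.2f} s")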

Deviations from randomness are often regarded by parapsychologists as evidence for the theories of parapsychology.

Randomness in mathematics

The mathematical theory of probability arose from attempts to formulate mathematical descriptions of chance events, originally in the context of gambling but soon in connection with situations of interest in physics. Statistics is used to deduce the underlying probability distribution of a collection of empirical observations.
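
A minimal sketch of that statistical step in Python (the biased coin is a made-up example, not from the article): given a collection of empirical observations, the observed frequency estimates the parameter of the underlying probability distribution.

    import random

    # Hypothetical data: 1000 flips of a coin whose true bias is unknown to the analyst.
    true_bias = 0.6
    flips = [random.random() < true_bias for _ in range(1000)]

    # The observed frequency of heads is the natural estimate of P(heads).
    estimate = sum(flips) / len(flips)
    print(f"estimated P(heads) is about {estimate:.3f}")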

For the purposes of simulation, it is necessary to have a large supply of random numbers, or a means to generate them on demand. Algorithmic information theory studies, among other topics, what constitutes a random sequence. Pioneers of this field include Andrey Kolmogorov, Ray Solomonoff, Gregory Chaitin, Per Martin-Löf, and others.
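
A minimal sketch of generating such numbers on demand, in Python (a textbook linear congruential generator with the commonly quoted MINSTD constants; serious simulations would normally use a better-studied generator, and cryptography would not use this at all):

    def lcg(seed, a=48271, c=0, modulus=2**31 - 1):
        # Textbook linear congruential generator yielding pseudo-random integers.
        state = seed
        while True:
            state = (a * state + c) % modulus
            yield state

    gen = lcg(seed=12345)
    # Scale to [0, 1) for use in a simulation.
    print([next(gen) / (2**31 - 1) for _ in range(5)])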

Randomness in practical communications and cryptography

Successful communication in the real world depends, at the limit, on understanding and successfully minimizing the deleterious effects of assorted interference sources, many of which are apparently random. Such noise imposes performance limits on any communications channel, and it was the study of those limits that led Shannon to develop information theory, make fundamental contributions to communication theory, and establish a theoretical grounding for cryptography.
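
The limit such noise imposes is captured by Shannon's channel-capacity formula; a small sketch in Python with illustrative bandwidth and signal-to-noise figures (not drawn from the article):

    import math

    def channel_capacity(bandwidth_hz, snr):
        # Shannon-Hartley capacity of a noisy channel, in bits per second.
        return bandwidth_hz * math.log2(1 + snr)

    # Illustrative numbers only: a 3 kHz channel with a linear SNR of 1000 (30 dB).
    print(f"{channel_capacity(3000, 1000):,.0f} bit/s")   # roughly 29,900 bit/s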

Access to a source of high-quality randomness is absolutely critical in many applications of cryptography. For example, even a subtly non-random key choice may result in a complete break of a communications channel that was believed to be secure and was relied upon to be so. See the Enigma machine and one-time pad articles for examples of the consequences of such mis-estimates. Keys used for the Enigma were non-random in many cases, which made it possible for Allied cryptanalysts to break into the traffic, with substantial consequences for the Nazi war effort. A similar thing happened in the Pacific Theater of WWII with the Japanese 'Purple' machine, whose key selection was also insufficiently random. The key material used in the theoretically unbreakable one-time pad must be random and unpredictable, lest the encryption technique become trivially breakable; even a slight predictability of the key material removes the one-time pad from the unbreakable category. The world's first programmable digital electronic computer was developed to attack a mechanical (and subtly non-random) implementation of the one-time pad.
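
A minimal sketch of the practical point, using Python's standard secrets module (which draws on the operating system's cryptographic random source): key material should come from a generator designed for unpredictability, not from an ordinary statistical PRNG.

    import secrets

    # 256 bits of key material from the OS cryptographic random source.
    key = secrets.token_bytes(32)
    print(key.hex())

    # An ordinary PRNG such as random.random() may pass statistical tests, yet it is
    # predictable once its internal state is known, so it is unsuitable for keys.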

Randomness in gaming

Randomness is central to games of chance and vital to the gambling industry.

Random draws are often used to make a decision where no rational or fair basis exists for making a deterministic decision.
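
A minimal sketch of such a draw in Python (the names are hypothetical): using an unpredictable source for the draw prevents any party from gaming the selection.

    import secrets

    candidates = ["Ana", "Bo", "Chen"]   # hypothetical parties to the decision
    print("selected:", secrets.choice(candidates))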

Randomness in music

Randomness in music is often deemed postmodern; examples include John Cage's chance-derived Music of Changes, Iannis Xenakis' stochastic music, aleatoric music, indeterminate music, and generative music.
