- Conference date: 4-9 August 2001
- Location: Baltimore, Maryland (USA)
One of the major objections to the use of Bayesian methods and Maximum Entropy methods concerns the choice of the prior distribution, especially when no prior information is available. Laplace, who made excellent use of Bayesian methods to obtain significant results, originally postulated that the uninformative prior should be constant, although he later came to question the use of this prior. One objection to a constant prior is that it is not independent of variable transformations: the argument being that if a prior is truly uninformative, then any transformation of the variable should also be uninformative. Later, Jeffreys, using heuristic arguments, postulated that a better uninformative prior is the 1/σ function. Jaynes provided a more formal derivation of the 1/σ prior for certain classes of probability functions. The 1/σ prior has the desirable feature of being independent of integer-power variable transformations, although it is not a probability distribution and is therefore called an improper prior. Unfortunately, the 1/σ prior gives extreme weighting to low values of σ. Shannon, while working at Bell Telephone Laboratories, derived statistical entropy as a measure of the uncertainty in a distribution. Shannon’s work leads to a constant prior in the uninformative case. Unfortunately, Shannon’s entropy holds only for discrete-variable distributions. In this paper, an uninformative prior in the continuous-variable case is developed, which is independent of virtually any variable transformation and yet does not give preferential weight to any value of the continuous variable. A simple example is used to show how the results differ for the new prior and the 1/σ prior.
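The transformation-invariance property mentioned above can be checked numerically. The sketch below (an illustration of the change-of-variables formula, not code from the paper) applies an integer-power transformation φ = σ^k and shows that the induced density keeps the 1/φ form when the prior is 1/σ, but is no longer constant when the prior is constant.

```python
import numpy as np

def induced_density(prior, k, phi):
    """Density induced on phi = sigma**k by a prior p(sigma),
    via the change of variables q(phi) = p(sigma) * |d sigma / d phi|."""
    sigma = phi ** (1.0 / k)
    dsigma_dphi = (1.0 / k) * phi ** (1.0 / k - 1.0)
    return prior(sigma) * dsigma_dphi

phi = np.linspace(0.5, 4.0, 8)
k = 3  # an integer-power transformation, phi = sigma**3

# 1/sigma prior: the induced density is again proportional to 1/phi,
# so q(phi) * phi should be constant across the grid.
q_jeffreys = induced_density(lambda s: 1.0 / s, k, phi)
print(np.allclose(q_jeffreys * phi, q_jeffreys[0] * phi[0]))  # True

# Constant prior: the induced density varies with phi, so the
# "uninformative" form is not preserved by the transformation.
q_flat = induced_density(lambda s: np.ones_like(s), k, phi)
print(np.allclose(q_flat, q_flat[0]))  # False
```

Analytically, q(φ) = (1/σ)·(1/k)·φ^(1/k−1) = 1/(kφ) for the 1/σ prior, which is again proportional to 1/φ; the constant prior instead induces q(φ) ∝ φ^(1/k−1).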
- Information and communication theory
- Information theory entropy
- Probability theory