Assigning adequate probabilities to the basic outcomes of the random
experiment under consideration is a prerequisite for any probabilistic
description. The maximum entropy principle of Statistical Mechanics
solves this problem by stating that, among all probability distributions
satisfying the given constraints, the one of maximum
Boltzmann-Gibbs-Shannon entropy must be chosen.
While the principle is indisputable within Statistical Mechanics,
its status outside physics as a general method of statistical
inference is less clear.
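The principle can be sketched numerically. The following minimal example (an illustration, not taken from the paper) computes the maximum entropy distribution on a finite outcome set subject to a fixed mean; in this case the maximizer is known to have the Gibbs form p_i proportional to exp(-lambda * x_i), and the multiplier lambda is found by bisection. The function names `maxent_given_mean` and `entropy` are ad hoc choices for this sketch.

```python
import math

def maxent_given_mean(xs, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution on outcomes xs with a fixed mean.

    The maximizer of the Boltzmann-Gibbs-Shannon entropy
    H(p) = -sum_i p_i log p_i, subject to sum_i p_i = 1 and
    sum_i p_i x_i = target_mean, has the Gibbs form
    p_i = exp(-lam * x_i) / Z.  Since the mean is strictly
    decreasing in lam, we can solve for lam by bisection.
    """
    def gibbs(lam):
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return [wi / z for wi in w]

    def mean(lam):
        return sum(pi * x for pi, x in zip(gibbs(lam), xs))

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:
            lo = mid  # mean too large -> increase lam
        else:
            hi = mid
    return gibbs(0.5 * (lo + hi))

def entropy(p):
    """Boltzmann-Gibbs-Shannon entropy of a probability vector."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Outcomes 0..3 with prescribed mean 1.0: the maxent solution is a
# Gibbs distribution, and its entropy dominates that of any other
# feasible distribution with the same mean.
p = maxent_given_mean([0, 1, 2, 3], 1.0)
```

Any other distribution meeting the same mean constraint, such as (0.5, 0.1, 0.3, 0.1), has strictly smaller entropy than the Gibbs solution, which is exactly the selection the principle prescribes.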
We present a foundation of the maximum entropy principle that is based
on probabilistic arguments. It deliberately avoids reference to
notions of information and does not appeal to physically motivated
properties of entropy. This is achieved by formalizing probability
assignment as a certain symmetric, continuous mapping that additionally
satisfies a condition of self-consistency. We show that any such mapping
can be expressed as a variational principle, i.e., as a generalized
maximum entropy principle with an a priori unspecified entropy function.
If a condition of statistical independence is added, an earlier result
of Shore and Johnson shows that the entropy function must essentially be
the standard Boltzmann-Gibbs-Shannon entropy.