*Post by yezi*
I am really new to this. My question is: what is the difference between the entropy and the variance of a random variable? Are entropy and variance equivalent in the information they provide in physics? I am trying to understand them.

No, there are several reasons why entropy, H, is not equivalent to variance, V, including:

1. For (1-D) continuous distributions, one has H <= (1/2) log(2 pi e V). This implies that

(a) variance can be maximal (i.e., V = infinity) when entropy is not, e.g., the Lorentz distribution p(x) = 1/[pi (1 + x^2)], and

(b) entropy can be minimal (i.e., H = -infinity) when variance is finite, e.g., a 50/50 mixture of two delta functions

p(x) = (1/2) [ delta(x-a) + delta(x-b) ].

Hence they are not equivalent as measures of the degree of uncertainty in a continuous distribution, as they do not agree with respect to "minimum" and "maximum" uncertainty. (Both counterexamples are checked numerically in sketch 1 at the end of this post.)

2. For discrete distributions, such as the colour of a billiard ball, variance is totally artificial - e.g., one must assign arbitrary numbers to each colour, and the variance will depend on the assignment. In contrast, entropy is independent of labelling - it is only a function of the probabilities (see sketch 2 below).

3. If one distribution is "flatter" than another (look up "majorisation"), then the entropy of the first is at least as great as the entropy of the second. This ordering property does not hold for variance (see sketch 3 below). Relatedly, entropy is a Schur-concave, and indeed concave, function of the probabilities, and this concavity is particularly important as to why entropy rather than variance appears in thermodynamics.

4. The (Shannon) information gained from a member of a mixture of distributions is the difference between the entropy of the average distribution and the average of the entropies of the individual distributions (see sketch 4 below). One cannot define an analogous information measure using variance that makes sense (essentially for reasons related to those given above).

5. There are some connections between entropy and variance (other than the inequality in item 1 above) for *particular classes* of distributions (e.g., Gaussian distributions, for which the inequality becomes an equality; see sketch 5 below). *If* one restricts attention to such a class, then there may be a monotonic relation between H and V. In such a case, one is simply some function of the other, and so they are in some sense equivalent.
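
Below are some quick numerical checks of the points above. These are minimal Python sketches (assuming numpy and scipy are available), meant as illustrations rather than definitive implementations.

Sketch 1 (point 1): the Lorentz/Cauchy distribution has finite differential entropy, log(4 pi) =~ 2.531 nats, but no finite variance; a 50/50 mixture of two deltas has finite variance but differential entropy diverging to -infinity. Here each delta is approximated by a narrow Gaussian of width s; the widths and sample sizes are arbitrary choices.

    import numpy as np
    from scipy import stats

    # (a) Lorentz/Cauchy: finite entropy, divergent variance.
    print(stats.cauchy.entropy())            # log(4 pi) =~ 2.531 nats
    rng = np.random.default_rng(0)
    for n in (10**3, 10**5, 10**7):
        # the sample variance never settles: it is dominated by
        # outliers and typically grows with n (no population variance)
        print(n, rng.standard_cauchy(n).var())

    # (b) Deltas at a=0, b=1, each smoothed to a Gaussian of width s:
    # variance -> ((b-a)/2)^2 = 0.25, while the entropy behaves as
    # log(2) + (1/2) log(2 pi e s^2) -> -infinity as s -> 0.
    for s in (1e-1, 1e-3, 1e-6):
        print(s, np.log(2) + stats.norm(scale=s).entropy())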
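
Sketch 2 (point 2): entropy depends only on the probabilities, while the variance of a discrete quantity depends on the arbitrary numbers assigned to the outcomes. The probabilities and labellings below are made up for illustration.

    import numpy as np

    probs = np.array([0.2, 0.3, 0.5])        # e.g., red, green, blue

    def entropy(p):
        return -np.sum(p * np.log(p))

    def variance(labels, p):
        x = np.asarray(labels, float)
        m = np.sum(p * x)
        return np.sum(p * (x - m) ** 2)

    print(entropy(probs))                 # ~1.0297 nats, labelling-free
    print(variance([0, 1, 2], probs))     # 0.61 under one labelling
    print(variance([7, -3, 100], probs))  # wildly different under another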
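
Sketch 3 (point 3): a uniform distribution is majorised by every distribution on the same outcomes (it is the "flattest"), so it has the largest entropy; but its variance may be larger or smaller than that of a less flat distribution, depending only on how the probabilities are arranged over the labels.

    import numpy as np

    def entropy(p):
        p = np.asarray(p, float)
        return -np.sum(p * np.log(p))

    def variance(p, x=(0.0, 1.0, 2.0)):
        p, x = np.asarray(p, float), np.asarray(x, float)
        m = np.sum(p * x)
        return np.sum(p * (x - m) ** 2)

    u = [1/3, 1/3, 1/3]   # flattest
    p = [1/2, 1/4, 1/4]   # same probability multiset as q
    q = [1/4, 1/2, 1/4]

    print(entropy(u), entropy(p), entropy(q))     # H(u) > H(p) = H(q)
    print(variance(u), variance(p), variance(q))  # 0.667, 0.688, 0.5
    # V(p) > V(u) > V(q): variance neither follows the majorisation
    # (flatness) order nor is it a function of the probabilities alone.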
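
Sketch 4 (point 4): the information gain, H(average distribution) minus the average entropy, is guaranteed non-negative by the concavity of entropy; for two equally weighted components it is the Jensen-Shannon divergence, i.e. the mutual information between the component label and the outcome. The distributions below are arbitrary.

    import numpy as np

    def entropy(p):
        p = np.asarray(p, float)
        return -np.sum(p * np.log(p))

    p1 = np.array([0.9, 0.1])
    p2 = np.array([0.2, 0.8])
    w = np.array([0.5, 0.5])              # mixing weights

    mix = w[0] * p1 + w[1] * p2
    gain = entropy(mix) - (w[0] * entropy(p1) + w[1] * entropy(p2))
    print(gain)                           # > 0 by concavity of entropy

For what it is worth, the variance analogue V(mixture) minus the average V is also non-negative (it equals the variance of the component means, by the law of total variance), but since any variance of a discrete outcome changes under relabelling, it fails as an information measure, in line with point 2.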
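
Sketch 5 (point 5): within the Gaussian class the bound of point 1 holds with equality, H = (1/2) log(2 pi e V), so H is a monotonically increasing function of V there.

    import numpy as np
    from scipy import stats

    for sigma in (0.5, 1.0, 3.0):
        H = stats.norm(scale=sigma).entropy()   # Gaussian entropy, exact
        print(float(H), 0.5 * np.log(2 * np.pi * np.e * sigma**2))  # equal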