Discussion: Entropy vs. Variance
yezi
2006-01-18 19:34:15 UTC
Hi:

I am really new to this. My question is: what is the difference between
the entropy and the variance of a random variable (R.V.)? Are entropy and
variance equivalent in the information they provide in physics? I am
trying to understand them.

Thanks for any comments.
a student
2006-01-21 10:50:44 UTC
Post by yezi
I am really new to this. My question is: what is the difference between
the entropy and the variance of a random variable (R.V.)? Are entropy and
variance equivalent in the information they provide in physics? I am
trying to understand them.
No, there are several reasons why entropy, H, is not equivalent to
variance, V, including:

1. For (1-D) continuous distributions one has H <= (1/2) log(2 pi e V).
This implies that
(a) the variance can be maximal (i.e., V = infinity) while the entropy is
not, e.g., for the Lorentz distribution
p(x) = 1/[ pi (1 + x^2) ], and
(b) the entropy can be minimal (i.e., H = -infinity) while the variance is
finite, e.g., for a mixture of two delta functions
p(x) = (1/2) [ delta(x-a) + delta(x-b) ].
Hence they are not equivalent as measures of the degree of uncertainty in
a continuous distribution, since they do not agree on what "minimum" and
"maximum" uncertainty mean (see the numerical sketch at the end of this
post).

2. For discrete distributions, such as the colour of a billiard ball,
variance is entirely artificial - e.g., one must assign arbitrary numbers
to each colour, and the variance depends on that assignment. In contrast,
entropy is independent of the labelling - it is a function of the
probabilities alone.

3. If one distribution is "flatter" than another (look up
"majorisation"), then the entropy of the first is greater than the
entropy of the second. Variance does not respect this ordering.
Relatedly, the concavity of entropy as a function of the probability
distribution is particularly important to why entropy, rather than
variance, appears in thermodynamics.

4. The (Shannon) information gained from a member of a mixture of
distributions is the difference between the entropy of the average
distribution and the average of the entropies of the individual
distributions. One cannot define a sensible analogous information measure
using variance (essentially for reasons related to those given above).

5. There are some connections between entropy and variance (other than
the inequality in item 1 above) for *particular classes* of
distributions (e.g., Gaussian distributions, for which the inequality
becomes an equality). *If* one restricts attention to such a class,
then there may be a monotonic relation between H and V; in that case one
is simply some function of the other, and so they are in some sense
equivalent.
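
For concreteness, here is a rough numerical sketch in Python/NumPy of
points 1, 2, 4 and 5; the script and all names in it are my own
illustration, not anything from the thread.

  # Rough numerical illustration of points 1, 2, 4 and 5 above.
  import numpy as np

  def discrete_entropy(p):
      # Shannon entropy of a discrete distribution, in nats.
      p = np.asarray(p, dtype=float)
      return -np.sum(p * np.log(p))

  # Point 5: a Gaussian with variance V saturates the bound of point 1,
  # i.e. its differential entropy is exactly H = (1/2) ln(2 pi e V).
  V = 4.0
  print("Gaussian H =", 0.5 * np.log(2 * np.pi * np.e * V))

  # Point 1(a): the Lorentz/Cauchy density p(x) = 1/[pi (1 + x^2)] has a
  # finite entropy, H = ln(4 pi), but its variance integral diverges.
  x = np.linspace(-1e4, 1e4, 2_000_001)
  dx = x[1] - x[0]
  p = 1.0 / (np.pi * (1.0 + x**2))
  print("Cauchy H (numerical) =", -np.sum(p * np.log(p)) * dx)
  print("Cauchy H (exact)     =", np.log(4 * np.pi))
  print("Cauchy 'variance' up to the cutoff =", np.sum(x**2 * p) * dx)  # grows with the cutoff

  # Point 2: for a discrete variable (e.g. ball colour) the variance depends
  # on the arbitrary numbers assigned to the outcomes; the entropy does not.
  probs = np.array([0.5, 0.3, 0.2])
  for labels in ([0.0, 1.0, 2.0], [0.0, 10.0, 20.0]):
      vals = np.array(labels)
      mean = np.sum(probs * vals)
      var = np.sum(probs * (vals - mean) ** 2)
      print("labels", labels, "-> variance =", var, ", entropy =", discrete_entropy(probs))

  # Point 4: the information gained from a mixture is the entropy of the
  # average distribution minus the average of the individual entropies (>= 0).
  p1, p2 = np.array([0.9, 0.1]), np.array([0.1, 0.9])
  p_avg = 0.5 * (p1 + p2)
  print("information gain =",
        discrete_entropy(p_avg) - 0.5 * (discrete_entropy(p1) + discrete_entropy(p2)))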
Igor Khavkine
2006-01-21 10:50:49 UTC
Post by yezi
I am really new to this. My question is: what is the difference between
the entropy and the variance of a random variable (R.V.)? Are entropy and
variance equivalent in the information they provide in physics? I am
trying to understand them.
I presume that by R.V. you mean Random Variable. I can't answer your
question in full generality, simply because to make any statement about
physics, one must associate a random variable to an observable quantity
(like energy, momentum, position, etc.). This makes sense under various
circumstances, e.g. when thermal or quantum fluctuations are present.

In the physics of macroscopic objects, entropy is actually one of these
measurable quantities. Roughly, it represents the number of degrees of
freedom of a macroscopic body that are hidden from observation. Or, in
other words, those degrees of freedom that are not restricted to
constant values under macroscopic equilibrium. As an observable, its
value can be treated as a random variable with a certain distribution
induced by thermal fluctuations. As a random variable, its mean,
variance, and other moments can be calculated, just as for any other
observable.

So, to finally answer your question, entropy and variance are orthogonal
concepts in physics. However, I'm getting an inkling that you might have
meant something different by the term "entropy". If you're more specific,
someone may provide a better answer.

Hope this helps.

Igor
yezi
2006-01-23 22:08:35 UTC
Thanks for the comment. Actually, I am trying to understand the following
observation:

The order in which packets are transmitted through the network can be
changed. To understand how much the order is altered, I need to find some
suitable parameters to describe it. In this case, the packet order is the
R.V. What I am considering is whether there is a similarity between this
observation and the physics.

Since variance can characterize how spread out an R.V. is, I suspect that
in this specific case the entropy is the same as the variance in its
physical meaning. I am trying to find a good reason to choose one of them.
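
If it helps, here is a rough sketch of how both candidates could be
computed from a packet trace, taking each packet's displacement from its
transmitted position as the R.V.; the function and names are purely
hypothetical, not a standard tool.

  # Hypothetical sketch: compare variance and Shannon entropy of the
  # per-packet displacement (received position minus sent position).
  from collections import Counter
  import math

  def reorder_stats(sent_order, received_order):
      pos_received = {pkt: i for i, pkt in enumerate(received_order)}
      disp = [pos_received[pkt] - i for i, pkt in enumerate(sent_order)]

      n = len(disp)
      mean = sum(disp) / n
      variance = sum((d - mean) ** 2 for d in disp) / n

      # empirical Shannon entropy of the displacement distribution, in bits
      counts = Counter(disp)
      entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
      return variance, entropy

  # Example: packets sent as 0..7 arrive mildly reordered.
  sent = list(range(8))
  received = [0, 2, 1, 3, 5, 4, 7, 6]
  print(reorder_stats(sent, received))   # -> (0.75, ~1.56 bits)

Roughly speaking, the variance is sensitive to how far packets move, while
the entropy is sensitive to how unpredictable the displacement is,
independent of its numerical size; which of the two is "right" depends on
which aspect of the reordering matters to you.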

i***@gmail.com
2006-01-21 10:53:07 UTC
Provided that the system is LINEAR, the uncertainties, or information,
contained within a system can be expressed in the form of variances
(having diagonalised your matrices). If the system is non-linear,
variance will NOT be the same thing as entropy.
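
A minimal sketch of that linear/Gaussian picture (my own illustration,
assuming a multivariate Gaussian): once the covariance matrix is
diagonalised, the differential entropy is just a sum of
(1/2) ln(2 pi e lambda_i) over the eigenvalue variances.

  # Illustrative only: for a multivariate Gaussian the differential entropy
  # is fixed by the covariance matrix, H = (1/2) ln((2 pi e)^n det(Sigma)),
  # which after diagonalisation is a sum over the eigenvalue variances.
  import numpy as np

  Sigma = np.array([[2.0, 0.8],
                    [0.8, 1.0]])        # an arbitrary covariance matrix

  eigvals = np.linalg.eigvalsh(Sigma)   # variances of the decoupled modes
  H_modes = sum(0.5 * np.log(2 * np.pi * np.e * lam) for lam in eigvals)
  H_det = 0.5 * np.log((2 * np.pi * np.e) ** len(Sigma) * np.linalg.det(Sigma))

  print(H_modes, H_det)                 # the two expressions agree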

We can express the 2nd law of thermodynamics in terms of temperatures,
deltaS = deltaQ/T, and derive maximum efficiencies for
engines/refrigerators operating between different temperatures. We may
also express this statistically, and we find that, yes indeed, variance
and entropy are related. Shannon proposed a measure of information that
is the same as entropy. Entropy refers to disorder, information to order.

http://groups.google.co.uk/group/comp.ai.nat-lang/browse_frm/thread/a068a4ed46db1975?hl=en

refers to information. Decoding, i.e. getting down to the minimum
Kolmogorov bit length, can be a non-trivial problem. Entropy refers to
the minimum Kolmogorov content. Orthogonal variance simply means that we
have eigenvalues, with no underlying complications in structure.