[MD] Theocracy, Secularism, and Democracy

Krimel Krimel at Krimel.com
Sun Aug 15 10:14:05 PDT 2010


[Marsha]
I made this statement before and received no reply.

It is my understanding that entropy is the amount of unknown information.  

Is this correct, or not?

[Krimel]
Entropy as a concept arose in the middle of the 1800s with the laws of
thermodynamics. You know the idea: matter and energy can neither be created
nor destroyed, only changed in form... 

Entropy in that context can be understood as disorder. Everything is moving
from a state of order to a state of disorder. In every conversion of energy
from one form to another, some is lost as heat. That's why you can't build a
perpetual motion machine. 

On earth a constant stream of energy bathes the planet in the form of solar
radiation. That constant influx is constantly dissipating, but because of the
balance of forces present here it dissipates through often very circuitous
routes. All living things exist as systems for dissipating solar energy
through these convoluted pathways.

Information theory was first proposed by Shannon in 1948. I am still
struggling with this, but basically information is data plus meaning. Data is
difference, or as some say, a difference that makes a difference. Meaning is
reduction in uncertainty. So information is meaningful data. 
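
To make "reduction in uncertainty" concrete, here is a toy sketch in Python
(my own illustration, not anything from Shannon directly). Shannon measured
the uncertainty of a source as H = -sum(p * log2(p)) over the probabilities
of its symbols:

    import math
    from collections import Counter

    def shannon_entropy(message):
        # Average surprise per symbol, in bits: H = -sum(p * log2(p))
        counts = Counter(message)
        total = len(message)
        return sum(-(n / total) * math.log2(n / total)
                   for n in counts.values())

    print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: nothing uncertain to resolve
    print(shannon_entropy("abcdefgh"))  # 3.0 bits: eight equally likely symbols

A message from a perfectly predictable source carries no information because
it resolves no uncertainty; a message from an unpredictable source carries
more.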

It was quickly realized that information theory could be linked to
thermodynamics. In fact, it was determined that thermodynamics is a subset of
information theory and not the other way around. This, to my mind, makes
information theory pretty frickin' metaphysical, since it in effect stands
before physics.
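
The formal link, as I understand it, is that the two entropies share the
same equation. Gibbs' thermodynamic entropy and Shannon's information
entropy differ only by a constant factor:

    S = -k_B * sum over i of p_i * ln(p_i)    (Gibbs, thermodynamics)
    H = -sum over i of p_i * log2(p_i)        (Shannon, information)

Same form, different units; Boltzmann's constant k_B just converts the
dimensionless information measure into joules per kelvin.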

Information entropy relates to thermodynamic entropy this way: in physics,
entropy is a measure of disorder; in information, it refers to
compressibility or the lack thereof. We use compression in natural
languages by making the most commonly used words very short,
like "a", "an", "the", "she", "he", etc. This allows "messages" (strings of
meaningful data) to be relatively short, or compressed. Messages are
compressed information. The shortest length a particular message can have
and still express its meaningful data is its Shannon entropy. For example,
to fully write out the number Pi one would have to use an infinite string of
digits, but a formula for Pi compresses that string to just a few characters. 
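
Here is another rough Python illustration (again my own toy example, not
from Shannon): highly ordered data compresses down to almost nothing, while
random data barely compresses at all, because there is no pattern left to
exploit:

    import os
    import zlib

    ordered = b"ab" * 500     # 1000 bytes of pure repetition (low entropy)
    noisy = os.urandom(1000)  # 1000 bytes of randomness (high entropy)

    print(len(zlib.compress(ordered)))  # a few dozen bytes at most
    print(len(zlib.compress(noisy)))    # about 1000, sometimes slightly more

The ordered string can be re-expressed as a short rule ("ab", 500 times),
just as Pi can be re-expressed as a short formula; the random bytes admit no
description shorter than themselves.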

Or something like that. As I said, I am still struggling with this.
