Tuesday, December 07, 2004
falling eggs, cooling coffee and blabber talk
Ignoring the definition according to the free encyclopedia people (as referenced by vaya), I think that Information Entropy should really be a measure of the amount of nonsense-talk in a normal conversation between people. This brings the definition close to entropy in physics.
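For contrast, here is the textbook definition the encyclopedia people are on about, as a minimal Python sketch (mine, not theirs): the Shannon entropy of the word distribution in a bit of talk, in bits per word. The sample sentences are made up, obviously.

    import math
    from collections import Counter

    def shannon_entropy(text):
        # Shannon entropy H = -sum(p * log2(p)) over the probability p
        # of each distinct word in the text, measured in bits per word.
        words = text.lower().split()
        total = len(words)
        counts = Counter(words)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Pure blabber is perfectly predictable, hence zero entropy:
    print(shannon_entropy("blah blah blah blah"))               # 0.0
    # Varied talk carries more bits per word:
    print(shannon_entropy("hot coffee cools while air warms"))  # ~2.58

Note that by this definition, high entropy means more information per word, not more nonsense, which is exactly why I am ignoring it.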
The concept of entropy has always fascinated me. Entropy can loosely be defined as the amount of chaos, or disorder, in a system. The (in)famous second law of thermodynamics says that entropy increases with time in a closed system.
A well-known example is that a cup of black coffee with no sugar cools down, and the environment around it gets a bit warmer. This is increased entropy in the system as a whole. The lukewarm state of everything (air and coffee) is more chaotic (more disorderly, less distinct) than the initial state (hot coffee and cool air).
Phrased in an easy way, the second law of thermodynamics says that a cup of black coffee never suddenly starts boiling, whilst the environment around it cools down a bit (and hey, we take this stuff for granted, you know).
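(For the quantitatively inclined, a rough sketch of why: treat the coffee and the air loosely as two heat reservoirs. When a small amount of heat Q leaks from the coffee at temperature T_coffee into the air at T_air, the total entropy change is

    \Delta S_{\text{total}} = \frac{Q}{T_{\text{air}}} - \frac{Q}{T_{\text{coffee}}} > 0
    \qquad \text{because } T_{\text{coffee}} > T_{\text{air}}.

The spontaneously boiling coffee would run this in reverse and make the total entropy change negative, which is precisely what the second law forbids.)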
Footnote: If you want more examples, then enjoy the great lyrics by rapper Prof. Stephen Hawking (yeah, that is the guy in the wheelchair). I actually recommend the entire site.
Now... I postulate that the second law of thermodynamics applies in reverse to Information Entropy: confusion and miscommunication decrease over time in a closed conversation. The more you talk, the less chaotic it gets.
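I admit I have no numbers. But if anyone wanted to test the postulate, here is a toy way to do it (a sketch of my own, with a made-up transcript): chop the conversation into windows and watch the word-entropy per window.

    import math
    from collections import Counter

    def window_entropies(transcript, window=10):
        # Shannon entropy (bits per word) of each consecutive
        # `window`-word chunk of the transcript.
        words = transcript.lower().split()
        result = []
        for i in range(0, len(words), window):
            chunk = words[i:i + window]
            counts = Counter(chunk)
            n = len(chunk)
            result.append(-sum((c / n) * math.log2(c / n)
                               for c in counts.values()))
        return result

    # Made-up transcript: if the postulate holds, entropy drops per window.
    chat = ("so what is entropy anyway chaos disorder confusion measure talk "
            "ok so entropy is disorder ok entropy is disorder fine "
            "sex sex sex sex lets talk about sex sex sex")
    print(window_entropies(chat))  # roughly [3.3, 2.5, 1.4] -- downhill

If the numbers go downhill, the talkers are converging on a single topic; see above for which one.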
I have strong empirical evidence for this observation: if you talk long enough with me, you'll end up discussing only the most complex topic of them all: sex.
Hold on, isn't sex the simplest topic of them all?
Hmmm... can be complex and can be simple... ehh... well... duh... ok-lah... Wait. I gottit, let's just have sex. Noh, no wait... Let's at least talk about it...