Tuesday, February 8, 2011

What is entropy?

Entropy is a measure of randomness.

Statistically, entropy (represented by S) is proportional to the logarithm of the number of microstates that correspond to a given macrostate. This counting rule works only when every microstate is equally probable.

An example: the entropy of a two-state system, such as a coin or an electron's spin, is proportional to ln(2), because there are two equally likely microstates: heads and tails (or spin up and spin down).
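
To make the counting explicit, here is a small Python sketch (my own illustration, not part of the original post) that enumerates coin microstates directly:

from itertools import product

# Every microstate of n fair coins is a tuple like ('H', 'T', 'H').
def count_microstates(n):
    return len(list(product('HT', repeat=n)))

print(count_microstates(1))  # 2: one coin has W = 2 microstates
print(count_microstates(3))  # 8: three coins give W = 2^3, and ln(2^3) = 3 ln 2

Counting this way also shows why the logarithm is the right choice: microstate counts multiply for independent systems, so their logarithms, and hence their entropies, add.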

Technically,
S = k ln W

where ln is the natural logarithm, W is the number of microstates, and k is the Boltzmann constant, which has units of J/K (about 1.38 × 10^-23 J/K).
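
As a quick numerical check (a minimal sketch of my own, assuming the standard SI value of the Boltzmann constant), the formula can be evaluated directly:

import math

k = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(W):
    # Entropy of a macrostate with W equally likely microstates.
    return k * math.log(W)  # math.log is the natural log, ln

print(boltzmann_entropy(2))     # coin flip: k ln 2 ≈ 9.57e-24 J/K
print(boltzmann_entropy(2**3))  # three coins: 3 k ln 2, i.e. entropy is additive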
