An important extensive property of systems in thermodynamics, statistical mechanics, and information theory that quantifies their disorder (randomness), i.e., our lack of information about them. It also characterizes the degree to which a system's energy is *not* available to do useful work.
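In classical thermodynamics this is made quantitative by the Clausius definition, which relates a change in entropy to heat and temperature. A minimal statement, with the symbols Q_rev (heat exchanged along a reversible path) and T (absolute temperature) introduced here for illustration, is:

```latex
% Clausius definition: the infinitesimal entropy change equals the heat
% exchanged along a reversible path divided by the absolute temperature.
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
```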

Entropy is defined to be extensive (that is, additive when two systems are considered together).
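As a compact restatement of this additivity condition, labelling the two systems A and B (labels introduced here for illustration):

```latex
% Extensivity: the entropy of systems A and B considered together
% is the sum of their individual entropies.
S_{A+B} = S_A + S_B
```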

In statistical mechanics, the entropy is identified, up to a multiplicative constant (the Boltzmann constant), with the logarithm of the number of microscopic states of a system that are compatible with a given set of macroscopic properties.
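Written out, this is Boltzmann's formula; the symbols Ω (the number of such microstates) and k_B (the Boltzmann constant) are introduced here for notation:

```latex
% Boltzmann's formula: S is proportional to the logarithm of the number
% of microstates \Omega compatible with the macrostate; k_B is the
% Boltzmann constant.
S = k_B \ln \Omega
```

Additivity then follows directly: the microstate counts of two independent systems multiply when the systems are considered together, so their logarithms, and hence their entropies, add.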