In fact, if all the microscopic physical processes are reversible (see discussion below), then the Second Law of Thermodynamics can be proven for any isolated system of particles with initial conditions in which the particles' states are uncorrelated. To do this, one must acknowledge the difference between the measured entropy of a system—which depends only on its [[macrostate]] (its volume, temperature, etc.)—and its [[information entropy]],<ref>''Physical Origins of Time Asymmetry'', p. 35.</ref> which is the amount of information (number of computer bits) needed to describe the exact [[microstate (statistical mechanics)|microstate]] of the system. The measured entropy is independent of correlations between particles in the system, because they do not affect its macrostate, but the information entropy '''does''' depend on them, because correlations lower the randomness of the system and thus lower the amount of information needed to describe it.<ref>''Physical Origins of Time Asymmetry'', pp. 35–38.</ref> Therefore, in the absence of such correlations the two entropies are identical, but otherwise the information entropy is smaller than the measured entropy, and the difference can be used as a measure of the amount of correlation.
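
As a simple schematic illustration (not drawn from the cited source), consider a system of two particles, each of which can occupy one of two states with equal probability. If the particles are uncorrelated, all four joint microstates are equally likely, so describing the exact microstate requires

:<math>S_\text{info} = \log_2 4 = 2 \text{ bits}.</math>

If instead the particles are perfectly correlated—always found in the same state—only two joint microstates are possible, and

:<math>S_\text{info} = \log_2 2 = 1 \text{ bit},</math>

while the measured entropy, which is insensitive to the correlation, is unchanged. The one-bit difference between the two entropies quantifies the correlation between the particles.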