Friday, 20 February 2009


I've been meaning to post something interesting about stat-mech about once a fortnight and so far I'm not doing so well. For today I thought I'd share my perspective on entropy.

If you ask the (educated) person in the street what entropy is they might say something like "it's a measure of disorder". This is not a bad description, although it's not exactly how I think about it. As a statistical mechanician I tend to think of entropy in a slightly different way from, say, my Dad. He's an engineer and as such he thinks of entropy more in terms of the second law of thermodynamics. This is also a good way of thinking about it, but here's mine.

Consider two pictures. I can't be bothered making them (EDIT: see this post, the T=2,3 pictures), so you can just imagine them. First imagine a frozen image of the static on your television, and second imagine a white screen. On the basis of the disorder description you might say that the static, looking more disordered, has a higher entropy. However, this is not the case. These are just pictures, and there is one of each, so who is to say which is more disordered?

Entropy does not apply to single pictures, it applies to 'states'. A state, in the thermodynamic sense, is a group of pictures that share some property. So for the static we'll say that the property is that there are roughly as many white pixels as black pixels with no significant correlations and for the white screen we'll say it's all pixels the same colour. The entropy of a state is the number of pictures (strictly it's proportional to the logarithm of this) that fit its description.
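You can make this counting definition completely concrete on a screen small enough to enumerate. Here's a toy sketch (the 3x3 screen and my cut-off of "4 or 5 white pixels counts as static" are my own illustrative choices, not anything canonical):

```python
from itertools import product
from math import log

# Toy version of the TV-screen argument on a tiny 3x3 screen (9 pixels),
# small enough that we can list every possible picture.
# Pixel values: 0 = black, 1 = white.
SIZE = 9

pictures = list(product([0, 1], repeat=SIZE))  # all 2^9 = 512 pictures

# 'Static' state: roughly as many white pixels as black (here, 4 or 5 white).
static = [p for p in pictures if sum(p) in (4, 5)]

# 'Blank screen' state: every pixel the same colour.
blank = [p for p in pictures if sum(p) in (0, SIZE)]

# The entropy of a state is (proportional to the log of) how many
# pictures fit its description.
print(len(static), len(blank))  # 252 pictures vs just 2
print(log(len(static)), log(len(blank)))
```

Even on nine pixels the "static" state contains 252 pictures to the blank state's 2, and the gap explodes as the screen grows.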

For our blank screen it's easy: there are only two pictures, all black or all white. For the static there is a bewildering number of pictures that fit the description - so many that you'll never see the same screen of static twice. For a standard 720x480 screen it'll be something like 10 to the power 100,000*.
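That figure is easy to check on the back of an envelope, assuming (as I am throughout) that each pixel is independently black or white:

```python
from math import log10

pixels = 720 * 480            # 345,600 pixels on a standard screen
# Each pixel is black or white, so there are 2^pixels pictures.
# In powers of ten that's:
exponent = pixels * log10(2)  # log10 of 2^pixels
print(f"about 10^{exponent:,.0f} pictures")
```

That comes out at roughly 10 to the power 104,000 - the same ballpark as the number in the text.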

So it's the disordered state, all those pictures of static that look roughly the same, that has the high entropy. If we assume that each pixel at any time is randomly (and independently) black or white, then it's clear why you never see a white screen in the static - it's simply outgunned by the stupidly large number of jumbled up screens.
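A quick simulation makes the "outgunned" point vivid: draw a few screens completely at random and the fraction of white pixels barely budges from a half, let alone reaching the all-white picture. (A minimal sketch; the seed and the number of trial screens are arbitrary.)

```python
import random

random.seed(0)
PIXELS = 720 * 480  # a standard screen

# Draw a few completely random screens and record the fraction of white pixels.
fractions = []
for _ in range(5):
    white = sum(random.getrandbits(1) for _ in range(PIXELS))
    fractions.append(white / PIXELS)

print([f"{f:.4f}" for f in fractions])  # every screen sits very close to 0.5
```

The typical wobble around one half is only of order 1/sqrt(345600), under a tenth of a percent, so an all-white frame is never going to turn up by chance.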

In a similar way a liquid has a higher entropy than a crystal (most of the time; there is one exception): there are more ways for a load of jumbled up particles to look like a liquid than like the structured, ordered crystal. So why then does water freeze? This, as you might guess, comes down to energy.

Water molecules like to line up in a particular way that lowers their energy. When temperature is low, energy is the most important thing and the particles will align on a macroscopic scale to make ice. When temperature is high, entropy becomes more important: those nice crystalline configurations are washed out by the sheer number of liquid configurations.
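The standard way to score this competition is the free energy F = E - TS: low temperature weights the energy term, high temperature weights the entropy term. Here's a toy two-state sketch - the energy and entropy values are made up for illustration, in arbitrary units, not real water numbers:

```python
# Toy two-state model of freezing. Which state wins at temperature T is set
# by the free energy F = E - T*S, not by energy or entropy alone.
# Numbers below are invented, in arbitrary units.
E_crystal, S_crystal = 0.0, 1.0   # low energy, few configurations
E_liquid,  S_liquid  = 2.0, 5.0   # higher energy, many more configurations

def favoured(T):
    """Return whichever state has the lower free energy at temperature T."""
    F_crystal = E_crystal - T * S_crystal
    F_liquid = E_liquid - T * S_liquid
    return "crystal" if F_crystal < F_liquid else "liquid"

# The two free energies cross at T* = (E_liquid - E_crystal) / (S_liquid - S_crystal),
# which is this toy model's 'melting point'.
T_star = (E_liquid - E_crystal) / (S_liquid - S_crystal)
print(favoured(0.1), favoured(1.0), T_star)  # crystal liquid 0.5
```

Below the crossover the energy saving wins and the crystal is favoured; above it the entropy term dominates and the liquid takes over - which is the whole freezing story in two lines of arithmetic.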

And this is essentially why matter exists in different phases: it's a constant battle between entropy and energy, and depending on which wins we will see very different results.

I'll try and update with some links to better descriptions soon.

*this number is only as accurate as my bad definition of the disordered state.