
Physics 101: Heads or Tails?

Photo by David Rives.

I bought a sweatshirt a few months ago, and despite the fact that it’s the teeniest bit too small, I couldn’t love it more. It’s navy blue and branded with the phrase “Entropy: It ain’t what it used to be.” For nerds like me, that’s hilarious…and clever. It’s a spin on the second law of thermodynamics, which says: for a closed system, entropy always increases. In other words, entropy is always changing, or it ain’t what it used to be. L.O.L.

That was enough to make me want to buy it, but the phrase also carried a second layer of meaning for me. I was taking a thermal physics class, and one of the most important things we’d been discussing was entropy. Most people know entropy as disorder and chaos, the gradual decay of everything sane and orderly. I thought so too, until I learned otherwise. Even in a previous post, I said: “…every time you see the word entropy, you can replace it with the word disorder.” But entropy is more than that. It’s related to probability, temperature, and information in the craziest ways.

So when I looked at the phrase on that sweatshirt, it was more than just a clever saying. For me, entropy really wasn’t what it used to be.


So, what is it now? Glad you asked.

As I said, entropy is associated with disorder and chaos, and it’s also related to probability. So let’s continue in that vein and start with some basic probability. If you flip a coin, there are two possible outcomes: heads or tails. The probability of getting either heads (H) or tails (T) is 1/2. Now let’s say we flip two coins. There are four possible outcomes: HH HT TH TT. The probability of getting HH is 1/4, and the same goes for TT, while the probability of getting one head and one tail is 2/4, or 1/2. The outcomes are called microstates, and the type of microstate is called a macrostate. So in the second example, there are 4 microstates: HH HT TH TT, and 3 macrostates: all heads, all tails, and one head and one tail. If it helps, think of microstates as combinations and macrostates as types of combinations.
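If it helps to see this in code, here’s a quick Python sketch (my own illustration, nothing standard) that lists every microstate for a given number of coins and groups them into macrostates by counting heads:

from itertools import product
from collections import Counter

def coin_states(n_coins):
    # Every microstate is one specific sequence of heads and tails.
    microstates = ["".join(flip) for flip in product("HT", repeat=n_coins)]
    # The macrostate only cares how many heads there are, not which coins.
    macrostates = Counter(state.count("H") for state in microstates)
    return microstates, macrostates

micro, macro = coin_states(2)
print(micro)  # ['HH', 'HT', 'TH', 'TT'] -> 4 microstates
print(macro)  # 2 heads: 1 way, 1 head: 2 ways, 0 heads: 1 way -> 3 macrostates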

Microstates and macrostates work together to define multiplicity, which is the number of microstates, or combinations, in a particular macrostate. Again, in the second example, there is one microstate (HH) in the all heads macrostate, but there are two microstates (HT TH) in the head and tail macrostate. If we had flipped three coins, there would still be only one microstate (HHH) in the all heads macrostate, but the mixed macrostates would have more microstates: three of them (HHT HTH THH) with two heads and one tail, for instance.
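Counting those combinations by hand gets tedious quickly, so here’s a small sketch of the same idea using the binomial coefficient, which is the standard way to count how many arrangements give a certain number of heads:

from math import comb

def multiplicity(n_coins, n_heads):
    # Number of microstates (combinations) in the macrostate with
    # exactly n_heads heads: "n_coins choose n_heads".
    return comb(n_coins, n_heads)

print(multiplicity(2, 2))  # 1 -> only HH is all heads
print(multiplicity(2, 1))  # 2 -> HT and TH
print(multiplicity(3, 3))  # 1 -> only HHH is all heads
print(multiplicity(3, 2))  # 3 -> HHT, HTH, THH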

So what does this have to do with entropy? Well, entropy is defined by multiplicity. The formal definition is:

S = k ln Ω

where S is entropy, k is a number called the Boltzmann constant, and Ω is the multiplicity. So if you know the multiplicity of a certain system, you can figure out its entropy. How cool is that?
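To make that concrete, here’s a tiny sketch that plugs a multiplicity into the formula (the constant’s value is the standard one; the function itself is just mine for illustration):

from math import log

BOLTZMANN_K = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(multiplicity):
    # S = k ln(omega), straight from the definition above.
    return BOLTZMANN_K * log(multiplicity)

# The one-head-one-tail macrostate of two coins has multiplicity 2:
print(boltzmann_entropy(2))  # about 9.57e-24 J/K (k times ln 2)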

Entropy has units of energy per unit of temperature, joules per kelvin to be exact, but in terms of flipping coins, this doesn’t make a whole lot of sense. Coins have nothing to do with energy or temperature. However, there’s a very clear and intriguing connection between entropy and systems outside of thermodynamics. You can think of a coin flip as analogous to a bit of information (i.e., a 0 or a 1) and think about the entropy contained in a message or a string of computer data.
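One way to see that connection: if you measure entropy in bits instead of joules per kelvin, you drop the Boltzmann constant and take the logarithm base 2, and a single fair coin flip comes out to exactly one bit. This is only a sketch of the analogy, not a full treatment of information theory:

from math import log2

def entropy_in_bits(multiplicity):
    # Same idea as S = k ln(omega), but with log base 2 and no constant,
    # so the answer comes out in bits.
    return log2(multiplicity)

print(entropy_in_bits(2))     # 1.0 -> one fair coin flip carries one bit
print(entropy_in_bits(2**8))  # 8.0 -> eight coin flips carry a byte's worth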


This is why I think that defining entropy strictly as disorder or decay is unfair. We think of information, technology, and computers as order, progress, and the product of intelligence. Yet information is very closely related to entropy. I’m not well versed in this subject, but if you’re interested, I highly recommend researching the relationships between entropy and information theory. It’s mind-blowing.

And I think there are interesting connections between information theory and intelligent design. The amount of information contained in DNA is enormous, and atheists have no explanation for how it could have evolved. I know that Drs. Stephen C. Meyer and William Dembski have done good work on information theory and complex systems. As I learn more, I think it’d be worth looking into their research and seeing where entropy comes in.

However, the best is yet to come. Entropy and thermodynamics have a more tangible connection—one that I find even more fascinating. But it will have to wait. Next time, I’ll make the connection between entropy and temperature, and we’ll see how entropy explains the second law of thermodynamics.


Written by Rachel Hamburg

I'm just a plain old physics student who loves learning about the world around me. I hope to obtain a graduate degree in nonlinear dynamics or a related field and eventually do a bit of research. Although physics is my first love, I also enjoy folk dancing, ping-pong, and reading British literature.


Comments

  1. You have a real ability to take a complex subject and break it down into easy-to-understand language. As a non-scientific person, I enjoy your writing because I can grasp a new concept and gain insight into the meaning of that topic. You also pique my desire to dig deeper and study a topic further. Please continue sharing!

