What Is Entropy? (arxiv.org), submitted by sebg 3 days ago
  • Citizen_Lame 2 days ago

    Entropy can also be understood as uniformity.

    For example, when you add a bit of milk to a cup of black coffee, at first you'll see the white milk and black coffee as separate. But with a single stir (or just over time), they mix together and become a uniform blend—the entropy has increased, the uniformity has grown. You can't just stir it the opposite way and magically separate the milk from the coffee again.

    Over time, an abandoned house gradually turns into dust and rubble—a more uniform state. Entropy, or this increasing uniformity, always grows over time. To me, uniformity feels more "ordered," so in that sense, entropy is a measure of order, or how mixed things are.

    • raattgift 2 days ago

      Crystals are very uniform and yet have very low entropy: at 0 K the entropy of a perfect crystal is zero. See <https://en.wikipedia.org/wiki/Third_law_of_thermodynamics#Ex...>.

      Your second paragraph is not a million miles off from Boltzmannian entropy, the log relationship between a macrostate and the set of microstates that match the macrostate. A very carefully layered milk-on-coffee matches no microstates in which there is milk at the bottom of the arrangement or coffee at the top. A fully stirred mix can have a milk molecule and a coffee molecule anywhere in the mix: this matches a much larger set of microstates than the layered case.
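      Concretely, a toy sketch of the counting (my own illustration, in Python, with made-up numbers; a real cup has ~10^24 molecules, and the "distinguishable lattice sites" picture is a simplification):

        from math import comb, log

        n_sites = 100   # lattice "slots" in the cup (toy number)
        n_milk = 20     # milk molecules; the rest are coffee

        # Layered macrostate: all milk confined to the top 20 sites.
        w_layered = comb(20, n_milk)      # exactly 1 arrangement

        # Mixed macrostate: milk may occupy any of the 100 sites.
        w_mixed = comb(n_sites, n_milk)   # ~5.4e20 arrangements

        # Boltzmann: S = k_B * ln(W). Note the perfect-crystal case
        # above is W = 1, hence S = k_B * ln(1) = 0.
        print("S_layered / k_B =", log(w_layered))  # 0.0
        print("S_mixed   / k_B =", log(w_mixed))    # ~47.7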

      It's mostly molecular translation and rotation (hurried up by the stirring) that causes the layered milk-on-coffee to homogenize and come into thermal equilibrium.

      But if we take the perfect crystal case, any microscopic rotation or translation within the crystal makes the crystal LESS uniform but also dramatically increases the crystal's entropy.

      If we smash the crystal into dust the entropy will become much higher, but the dust will be arranged much less uniformly than when the crystal was intact.

      > entropy is a measure of order

      Yes.

      > or how mixed things are

      Maybe.

      > Entropy can also be understood as uniformity

      Sometimes. There are lots of things in nature which are highly uniform but have very low entropy, with entropy increasing as they become less uniform.

      • graycat 2 days ago

        As I recall from work in measure theory and probability, there is the Poincaré recurrence theorem, which says that if you keep stirring the coffee and cream, the mixture will eventually return as closely as you please to its state just after the cream was poured in.
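
        The catch, as I understand it, is the timescale: recurrence times grow roughly exponentially in the number of molecules, which is why nobody ever sees a cup unmix. A toy Python sketch of the finite-state idea, using a fixed permutation as the "stir" (any such map must eventually revisit its starting arrangement):

          import random

          random.seed(0)
          n = 10
          perm = list(range(n))
          random.shuffle(perm)    # one fixed, deterministic "stir" rule

          state = list(range(n))  # ordered initial state
          steps = 0
          while True:
              state = [state[i] for i in perm]  # apply the stir once
              steps += 1
              if state == list(range(n)):
                  break

          # Returns after lcm(cycle lengths of perm) stirs -- tiny here,
          # hyper-astronomical for ~10^23 molecules.
          print("returned after", steps, "stirs")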

      • AndrewKemendo 2 days ago

        > The state of unknown information is 23 bits per molecule

        This raises the following question: how many bits per molecule are known, or presumably measurable?

        In what format is this “known” information encoded, or how is it otherwise measured?
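
        For context, my guess (an assumption, not something stated in the thread) is that the ~23 bits figure is the standard molar entropy of hydrogen gas divided out per molecule, i.e. S / (k_B ln 2). A rough Python sketch, taking the tabulated value of ~130.7 J/(mol*K) for H2 as given:

          from math import log

          R = 8.314462618     # gas constant, J/(mol*K)
          S_molar_H2 = 130.7  # standard molar entropy of H2 gas (assumed)

          # Entropy per molecule in bits: divide by k_B*ln(2) per molecule,
          # which is R*ln(2) per mole.
          bits_per_molecule = S_molar_H2 / (R * log(2))
          print(f"{bits_per_molecule:.1f} bits per molecule")  # ~22.7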

        Fascinating

        • fregus 2 days ago

          I just love John Baez's stuff; it hits the perfect balance between physics and math.