Thermodynamics/Entropy and Life

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
We know that animals break down complex carbohydrates and proteins into simpler, less complex molecules via digestion, and then re-organize those same components into more complex molecules and structures, for example by translating mRNA into proteins within the cells of the animal. From what I understand of the laws of Entropy, the animal has to take in significantly more energy/mass from plants to form a more complex system (the animal body), being that the animal body must have less entropy overall than the plants that provided it. Thus more plants, each having less entropy than the animal body, are required to create a system that is more complex than any given plant in the system. This is required because the overall entropy must always increase.
The quandary for me is how this ends up applying to humans. The human, having a significantly more advanced brain than any other mammal on earth, seems to somehow contradict this law, as we have built highly complex machines and even AI that appear to have significantly less entropy than the system that gave rise to them, namely the human body/brain system. It appears that the overall entropy of the earth is somehow going down, defying the laws of thermodynamics.
In fact, the Internet has allowed multiple humans to work together on some of the most complex problems imaginable and create new AI systems with significantly less entropy than even the brain organization of a human being. It appears, at least to me, that the second law of thermodynamics is somehow being defied by human intellect, especially when that intellect is multiplied by orders of magnitude via human brains working together on AI and other complex problems.
This appears to me to somehow violate the second law of thermodynamics. I am just curious what others on the forum think about how this is possible. Or maybe it is not really happening and only appears that way on the surface.
What is the forum's opinion on this matter?
 

Ya’akov

Joined Jan 27, 2019
9,069
On average, entropy must increase; local order is not a violation of thermodynamics. The cost of creating this order is also an increase in entropy: we take very low-entropy energy from the sun (and from stored low entropy in the form of hydrocarbons, etc.) and we use that energy, at a loss, to reorder things to our liking.

The result is a lot of high-entropy heat radiated away. The cost of order is generating entropy along with it, and eventually we will run out of the low-entropy sources and have heat everywhere, preventing more order from being possible.
 

ZCochran98

Joined Jul 24, 2018
303
Keep in mind: all of this DOES take work to do. Any synapse firing, any computation performed, draws some energy, and the amount of energy consumed generally corresponds to the "complexity" of the problem. Not all of that energy is expended usefully, either; much of it is lost as heat, and the systems themselves are breaking down over time. So the overall entropy is still increasing; it's just that we're able to glean results we deem useful from this process. No information is being created, none destroyed; it's all just being "rearranged."
How we define entropy is a fuzzy thing, too, for the record. Commonly understood to be "chaos" or a measure of "disorder" in a system, entropy is better understood as the measure of the unavailability of a system's energy to "do work." The higher the entropy, the less of the energy present is actually useful.
 

xox

Joined Sep 8, 2017
838

To be correct, there is no such "requirement" that entropy must always increase. We only say that it "tends" to do so, simply because more microstates map to equilibrium than to other, lower-entropy states. That might seem like a somewhat pedantic point but it is a very important thing to understand. Flip a coin 100 times and ON AVERAGE roughly 50% will come up "heads". But on ANY given run you might get, say, 75% heads. That doesn't mean you've violated the laws of probability, just that you landed on an unlikely (albeit perfectly valid) configuration.

With respect to living systems, staying alive has less to do with complexity than it does with, say, maintaining the ability to create differentials in pressure and whatnot, which of course is the opposite of a state of equilibrium. So our systems are basically fighting against entropy in order to survive!

Entropy as it relates to information content (i.e. complexity) is another matter altogether. Things which are compressed (a strand of mRNA, a .zip file, etc.) represent high entropy. But unlike the matter of "heat-entropy", this kind is more abstract. For example, consider a text file containing the following data: "e=mc^2". In one sense this represents very low-entropy content, just six bytes of data. And yet in another, very real way it "compresses" a large corpus of information within it (the popularized version of Einstein's famous equation).

The point is, the implications are generally quite different when we discuss physical versus mathematical entropy.
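To make the compression point concrete, here is a minimal Python sketch (the particular strings are just illustrative): data that still contains order compresses well, while data that is already at maximum information entropy barely compresses at all.

```python
import os
import zlib

ordered = b"ABAB" * 1000        # 4000 bytes of highly ordered, repetitive data
random_ish = os.urandom(4000)   # 4000 bytes that are already maximally "surprising"

print(len(zlib.compress(ordered)))     # a few dozen bytes: the order gets squeezed out
print(len(zlib.compress(random_ish)))  # roughly 4000 bytes or slightly more: nothing left to compress
```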
 

Delta Prime

Joined Nov 15, 2019
1,311
In the entire field of physics there is no concept that is more difficult, for me at least, to understand than entropy, and at the same time it is one of the most fundamental.
Humans are engaged in a constant battle to maintain themselves against the forces of entropy. Cellular decay, for now at least, is irreversible; cells do not decay symmetrically with respect to time. They escape the strictures of (but do not violate) the second law of thermodynamics, which is based on reversibility in the underlying microscopic dynamics.
For that reason alone, entropy and cellular decay can decrease and order can spontaneously appear out of disorder. I find this stuff fascinating as well! But I'm just a youngster, chronologically speaking.
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
There is approximately a 1 in 3.5 million chance of getting 75 or more heads in 100 coin flips, and about a 54% probability of getting 50 or more heads. Granted, you "might" get 75% heads or more, but the chances of it are astonishingly low. In the long run you will tend toward 50/50. Entropy does not refer to single occurrences but to what the result will be over the long run. Now, you did sort of state that; I just wanted to bring in the probabilities. Looking this up on the internet, I was astonished how complex it is to determine coin-flip probabilities like this. The combinatorics get very complex very fast.
Now flip that coin, say, 10,000 times. What is the probability of getting 75% or more heads? A number so small that it simply will never happen, even if you could flip coins at a million flips per second for the age of the universe!
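For what it's worth, these numbers can be checked with a short Python sketch using exact binomial counts (the helper name is just for illustration):

```python
from math import comb, log10

def prob_at_least(k, n):
    """Exact probability of at least k heads in n fair coin flips."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

print(prob_at_least(75, 100))   # ~2.8e-7, i.e. roughly 1 in 3.5 million
print(prob_at_least(50, 100))   # ~0.54

# For 10,000 flips the tail is so small we work with its log10 instead:
tail = sum(comb(10000, i) for i in range(7500, 10001))
print(log10(tail) - 10000 * log10(2))   # about -570, i.e. a probability around 10^-570
```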
 
Last edited:

MrSalts

Joined Apr 2, 2020
2,767
Entropy multiplied by temperature is a form of energy, and types of energy can be interconverted according to the equations below. The enthalpy (heat, or energy to do work) can be used to add order to a chaotic system. Even ice forming as heat is removed from water results in a decrease in entropy - a crystal lattice is a form of organization and a lower state of entropy than a liquid, which in turn is lower than a vapor. And, no, Gibbs free energy has nothing to do with pseudo-science. It is simply the energy equal to the maximum reversible work that can be done at constant temperature and pressure. It is usually associated with chemical or biochemical reactions, but it also applies to physical changes of state, or simply heating or cooling matter and the associated changes in dew points, vapor pressures, etc.
An organism can use the energy from food to create ordered systems. It is a completely false assumption that plants have less entropy (more order) than an animal just because the plant is upstream in the food chain - there is no chain, it is a carbon cycle (the sun plus photosynthesis converts CO2 and H2O to carbohydrates, including cellulose).
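To put a rough number on the ice example above, here is a back-of-envelope Python sketch; the latent heat value is an approximate textbook figure, not something taken from this thread.

```python
# Entropy change when one mole of liquid water freezes at its melting point.
# For a reversible phase change, dS = q_rev / T = -dH_fus / T_melt.
dH_fus = 6010.0   # J/mol, approximate latent heat of fusion of water
T_melt = 273.15   # K
dS_freeze = -dH_fus / T_melt
print(round(dS_freeze, 1), "J/(mol*K)")   # about -22 J/(mol*K): the crystal is more ordered
```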

I'd like to see a reference on the quote below. I'm always looking for a "journal" to unseat Scientific American as the most useless and unscientific journal - please let me know where you found it. Life of an organism is not a spontaneous process. Releasing gas pressure in a cylinder to the atmosphere is a spontaneous process. Look up 2nd law of Thermodynamics for context.
...being that the animal body must have less entropy overall than the plants that provided it.
Finally, this quote is nonsense.
From what I understand of the laws of Entropy
There are no "laws of Entropy" there are laws of thermodynamics. "Entropy x temperature" is one form of energy.

[Attachment: equations relating enthalpy, entropy, and Gibbs free energy (ΔG = ΔH − TΔS)]
 
Last edited:

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
You brought up some good points! I am no expert in thermodynamics. There is no reference for that quote; it is simply my opinion on the matter. On the second item, you are correct, though at the same time I have found multiple hits on the web referring to it as the "Law of Entropy", such as the following link:

https://smartenergyeducation.com/law-of-entropy/

I should have said the Laws of Thermodynamics, since entropy is a measurement of the system. It is interesting how many 'different' definitions for entropy seem to come up on the Internet; the following one is from "Oxford Languages":

[Attachment: "Oxford Languages" dictionary definition of entropy]
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
Thanks for bringing up some excellent points! I stand corrected on much of what I said, and I appreciate you sharing your opinion on the matter.
 


xox

Joined Sep 8, 2017
838
I should have said the Laws of Thermodynamics, since entropy is a measurement of the system. It is interesting how many 'different' definitions for entropy seem to come up on the Internet; the following one is from "Oxford Languages":

[Attachment: "Oxford Languages" dictionary definition of entropy]
The two definitions are simply stating the same thing, just from different perspectives. Total disorder implies a system in equilibrium, which means no work can be performed because no significant pressure differentials are present within the system.

Again, there is also entropy as it relates to information theory, and therein lies an interesting connection. A low-entropy crystalline structure, for example, can be described with much less information than, say, a boiling pot of water; which is to say that if you had to somehow reconstruct the two, the crystal would be much easier to describe mathematically.

It turns out that the complexity of a system is intrinsically tied to probability theory. People such as Claude Shannon saw this connection and from it developed some of the techniques that we use to this very day to compress data, measure the security of passwords, quantify the information content of the data stored in data centers, etc.
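As a small illustration of that connection, here is a Python sketch of Shannon entropy for a byte string (the function name and sample inputs are just for demonstration):

```python
from collections import Counter
from math import log2

def shannon_entropy(data: bytes) -> float:
    """Average information content of the data, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * log2(c / n) for c in counts.values())

print(shannon_entropy(b"aaaaaaaaaaaaaaaa"))   # 0.0   - perfectly ordered, no surprise
print(shannon_entropy(b"e=mc^2"))             # ~2.58 - six distinct symbols
print(shannon_entropy(bytes(range(256))))     # 8.0   - every byte value equally likely
```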

To think that it all started from the study of heat!
 

Thread Starter

dcbingaman

Joined Jun 30, 2021
1,065
Correct me if I have this wrong:
Delta S is the change in entropy, measured in J/K.
Thus T*Delta(S) would be a change in energy, measured in joules.
Delta H is the change in 'enthalpy', which I just looked up: the sum of the internal energy plus the product of pressure and volume, which again must be in joules.
The result is the change in Gibbs free energy, defined 'nicely' here:

https://en.wikipedia.org/wiki/Gibbs_free_energy

Finally, the Gibbs free energy is the change in free energy of the system.
I have seen this formula before but never knew exactly what the units were nor how to apply it to the practical world.

Can someone show a practical example of using this formula? At the moment even after everything I read it still seems nebulous to me.
 
Last edited:

MrSalts

Joined Apr 2, 2020
2,767
The quote from Oxford Languages about entropy is missing the key word, universe. It should read, "the second law of thermodynamics says that the entropy of a universe always increases with time". If I remember correctly, a "universe" can be any closed system that does not allow energy to be supplied from outside the "universe".
 

MrSalts

Joined Apr 2, 2020
2,767
Can someone show a practical example of using this formula? At the moment even after everything I read it still seems nebulous to me
A practical example is determining whether a process is spontaneous (negative Gibbs free energy) or whether it requires energy input to occur (positive Gibbs free energy).

A well-studied example in chemical engineering is the production of ammonia (NH3) from the elements (H2 and N2) - a seemingly impossible reaction in practice, but the Gibbs free energy indicates it should be spontaneous. However, the equation only determines spontaneity as a total energy balance; it does not account for "activation energy". Catalysts lower the activation energy and can allow a difficult reaction like the ammonia reaction to proceed as expected.

The book The Alchemy of Air is a great book about the development and commercialization of the ammonia process (war, Nazis, Jewish scientists, a huge industrial explosion, creative metallurgy, and the guano shortage that started the global ammonia technology race are all well covered).

In the linked example, you'll see that the pressure (via the ideal gas law) decreases by half, from 4 moles of raw materials (a mixture of one mole of N2 and three moles of H2) to a finished product of 2 moles of NH3.
So, as requested, here is the Gibbs Free energy example...
https://www.omnicalculator.com/chemistry/gibbs-energy

You can carry the example forward to determine the temperature where the reaction becomes thermodynamically "non-spontaneous".
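A minimal Python sketch of that calculation, using approximate textbook values for the ammonia reaction (the ΔH and ΔS figures below are assumed, not taken from the linked calculator):

```python
# Ammonia synthesis: N2 + 3 H2 -> 2 NH3
dH = -92.2e3   # J per mole of reaction (2 mol NH3 formed), approximate standard enthalpy change
dS = -198.7    # J/K per mole of reaction, approximate standard entropy change (fewer gas moles)

def delta_g(T):
    """Gibbs free energy change dG = dH - T*dS, ignoring the temperature dependence of dH and dS."""
    return dH - T * dS

print(delta_g(298.15) / 1e3)   # about -33 kJ: spontaneous at room temperature
print(dH / dS)                 # crossover near 464 K; above this, dG > 0 and the reaction is non-spontaneous
```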
 
Last edited:

xox

Joined Sep 8, 2017
838
I respectfully disagree. Theoretically the best we can do is have an isentropic process. In practice a non-lossless process requires an increase in entropy.
You seem to be confusing "what is possible" with "what is observed". Entropy does indeed TEND to increase, but that is not to say that it CANNOT decrease.

Here is a simple example using coin tosses. Flip a coin four times and record the result. Now there are sixteen possible outcomes, listed below along with the number of heads in each configuration.

HHHH : 4
HHHT : 3
HHTH : 3
HHTT : 2
HTHH : 3
HTHT : 2
HTTH : 2
HTTT : 1
THHH : 3
THHT : 2
THTH : 2
THTT : 1
TTHH : 2
TTHT : 1
TTTH : 1
TTTT : 0
So there is a single configuration each for the cases where either zero or four H's are encountered, four configurations each with either one or three H's, and SIX where two heads are present. Each individual configuration is equally likely, but all-heads and all-tails are obviously the RAREST overall outcomes. The most likely outcome of course is 50/50, and so ON AVERAGE that is what is generally observed.
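The same tally can be generated programmatically; here is a small Python sketch that counts microstates per heads-count (easy to scale up to more flips):

```python
from itertools import product
from collections import Counter

# Every equally likely four-flip sequence (the "microstates"),
# grouped by total number of heads (the "macrostates").
sequences = ["".join(seq) for seq in product("HT", repeat=4)]
histogram = Counter(s.count("H") for s in sequences)

for heads in sorted(histogram):
    print(heads, "heads:", histogram[heads], "of", len(sequences), "configurations")
# 0:1, 1:4, 2:6, 3:4, 4:1 - the 50/50 macrostate has the most microstates
```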

Well the same goes for 10^6 coin tosses. You may get one million heads in a row. Granted it is incredibly unlikely, but certainly NOT IMPOSSIBLE. Entropy works in the same way. The only reason one corner of a running engine doesn't develop an ice-cold spot is simply because the chances are EXTREMELY low. But it COULD happen! That was my only point really.
 

MrSalts

Joined Apr 2, 2020
2,767
The only reason one corner of a running engine doesn't develop an ice-cold spot is simply because the chances are EXTREMELY low. But it COULD happen! That was my only point really.
Our high school science teachers are doing society a disservice by oversimplifying certain topics. Thermodynamics and the disorder-vs-order of entropy are completely predictable and measurable, and it is absurd to present the idea that a section of an internal combustion engine could randomly stay cold via statistics as an example of entropy.

A better explanation of entropy would be combustion of liquid octane in oxygen.
2 C8H18 + 25 O2 => 18 H2O + 16 CO2
You can see that entropy increases in this case because the liquid becomes gas (essentially converting 25 moles of gas into 34 moles of gas).

In contrast, burning one mole of molten sodium metal in a chamber containing 0.5 moles of chlorine (Cl2) yields microcrystals (but still ordered crystals) of sodium chloride. Liquid and gas converting to an ordered crystalline solid creates a yellow-hot flame in the cup of sodium, but the result is very low entropy. The size and distribution of the microcrystalline spread inside the chamber represent a minute amount of measurable entropy versus the conversion of 3.01 x 10^23 gas molecules to a crystalline solid. Note: this reaction can create a surprisingly strong vacuum inside the chamber as it proceeds, if the cup of sodium metal is kept above sodium's melting point (about 98 °C) long enough for most of the sodium and chlorine gas to react out.

In both examples above, statistics has little impact on the measurable disorder/order of entropy.
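To put a rough figure on the vacuum mentioned in the sodium example, here is an ideal-gas-law sketch in Python (the 10 L chamber volume is a hypothetical assumption, and 120 °C is used simply as an illustrative chamber temperature):

```python
# Ideal gas law, P = nRT / V, applied to the sealed sodium/chlorine chamber.
R = 8.314      # J/(mol*K)
T = 393.15     # K, about 120 degC (assumed chamber temperature)
V = 0.010      # m^3, a hypothetical 10 L chamber

def pressure(n_gas):
    """Pressure in pascals for n_gas moles of gas in the chamber."""
    return n_gas * R * T / V

print(pressure(0.5) / 1e3)    # ~163 kPa of Cl2 before the reaction
print(pressure(0.01) / 1e3)   # ~3 kPa once nearly all the chlorine is locked into solid NaCl
```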
 