Subject: Re: 2nd law clarifications
From: me22over7 (at) *nospam* gmail.com (MarkE)
Groups: talk.origins
Date: 05. Jan 2025, 06:51:02
Organisation : A noiseless patient Spider
Message-ID : <vld6k5$t7a6$1@dont-email.me>
References : 1 2 3 4 5 6
User-Agent : Mozilla Thunderbird
On 5/01/2025 3:47 am, Rufus Ruffian wrote:
> MarkE wrote:
>> On 4/01/2025 12:38 am, Rufus Ruffian wrote:
>>> Again, how many joules per kelvin are consumed by the loss of
>>> "information"?
>>>
>>>> Does this necessarily mean entropy will increase? It would seem so.
>>>
>>> No. Entropy increases because that's what entropy does. It doesn't care
>>> that remarkable life forms are constructed along the way.
>>
>> Universally, of course. Locally, not necessarily. Would you agree that
>> evolution produces a local decrease in entropy?
>
> No, because it doesn't. My whole point was that you (like most
> creationists) fundamentally and perhaps deliberately misunderstand the
> whole 2nd law concept. Apparently the point whooshed you.
>
> So again, how many joules per kelvin are consumed by the loss (or gain)
> of "information"?
At least 2.9×10^-21 joules per bit (at room temperature, ~300 K).

That is, if Landauer's principle is applicable*. The principle "states that the minimum energy needed to erase one bit of information is proportional to the temperature at which the system is operating. Specifically, the energy needed for this computational task is given by

E ≥ kB·T·ln(2)

where
kB is the Boltzmann constant and
T is the temperature in Kelvin"

https://en.wikipedia.org/wiki/Landauer%27s_principle

* Not certain if/how it applies here; nevertheless I am intrigued by the relationship between information and energy quantified by Landauer's principle.
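As a rough sanity check on the joules-per-bit figure above, Landauer's bound kB·T·ln(2) can be evaluated directly. A minimal Python sketch; the room-temperature value T = 300 K is my assumption, since the post doesn't fix a temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def landauer_limit(T):
    """Minimum energy in joules to erase one bit at temperature T (kelvin)."""
    return k_B * T * math.log(2)  # math.log is the natural logarithm

E = landauer_limit(300.0)  # assumed room temperature
print(f"{E:.2e} J per bit")  # ~2.87e-21 J, i.e. the ~2.9e-21 figure quoted
```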
This topic is difficult and, IMO, sometimes misunderstood by both sides, unintentionally or otherwise. My own understanding is incomplete; I'm happy to have an open discussion.
RELATIONSHIP TO PREBIOTIC CHEMISTRY
"Essentially, to remain consistent with the second law of thermodynamics, self organizing systems that are characterized by lower entropy values than equilibrium must dissipate energy so as to increase entropy in the external environment.[32] One consequence of this is that low entropy or high chemical potential chemical intermediates cannot build up to very high levels if the reaction leading to their formation is not coupled to another chemical reaction that releases energy. These reactions often take the form of redox couples, which must have been provided by the environment at the time of the origin of life."
https://en.wikipedia.org/wiki/Entropy_and_life

That is, an ensemble of organised pre/proto-life molecules (e.g. a protocell) is a localised low-entropy state. To be sure, the surrounding environment pays the price for this with an even greater increase in entropy.
CONFIGURATION ENTROPY
"Configuration entropy can be calculated using the Boltzmann entropy equation:

S = kB·ln(W)

where:
S is the entropy,
kB is the Boltzmann constant, and
W is the number of possible microstates (configurations) consistent with the macrostate.

Microstates and macrostates:
Microstate: a specific arrangement of the system's components.
Macrostate: the overall state of the system, described by observable properties like temperature, pressure, or composition.

A macrostate with a larger number of possible microstates has higher configuration entropy." (ChatGPT-4o; also https://en.wikipedia.org/wiki/Configuration_entropy)
For an ensemble of molecules that could form a living entity, the "nonliving macrostate" comprises many more possible microstates (Wn) than the "living macrostate" (Wl). E.g., if say Wn/Wl = 10^10, then the local configuration entropy *decrease* in going from nonliving to living is

Sn - Sl = kB·(ln(Wn) - ln(Wl))
        = kB·ln(Wn/Wl)
        = kB·ln(10^10)
        ≈ 3.2×10^-22 joules per kelvin
This could, incidentally, be used to argue for the improbability of the spontaneous formation of life (not that such a claim is being made here).
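The configuration-entropy arithmetic above can be checked in a few lines of Python. The ratio Wn/Wl = 10^10 is the illustrative assumption from the text, not a measured value:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def config_entropy_decrease(ratio):
    """Local entropy decrease Sn - Sl = k_B * ln(Wn / Wl), given the ratio Wn/Wl."""
    return k_B * math.log(ratio)

dS = config_entropy_decrease(1e10)  # illustrative 10^10 ratio from the text
print(f"{dS:.1e} J/K")  # ~3.2e-22 J/K, matching the figure above
```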