Energy, Entropy and Information

The concept of entropy was born in the second half of the 19th century in a new scientific field, thermodynamics, which aimed to describe changes in the form taken by another new physical quantity, energy. Entropy was introduced to account for the irreversible features of some of these changes.

Energy itself is a difficult concept that has no definition, except the one given by a conservation principle (see R. Feynman). It is therefore not surprising that a notion derived from it is itself obscure. But here the confusion is even greater and a threshold is crossed: entropy is in fact linked to a subjective and anthropocentric quantity, namely the amount of information that we lack in order to completely describe the system under consideration.

This introduction of subjectivity into the “hard sciences” is already astonishing. Added to this is the fact that the concept of entropy is invoked in practically every scientific field: from cosmology (“the entropy of the universe tends to a maximum”, R. Clausius) to the life sciences (“A living organism continually increases its entropy and thus tends to approach the dangerous state of maximum entropy, which is death”, E. Schrödinger), by way of data compression, coding and computer science.

Recently (since the Covid-19 interlude), I have put in my two cents on this matter.

The fundamental difference between Boolean logic and thermodynamic irreversibilities, or, why Landauer's result cannot be a physical principle

Landauer’s “principle” claims that erasing one bit of information necessarily dissipates at least k_BT ln2 of heat into the surroundings, making a logically irreversible Boolean operation also thermodynamically irreversible. It is commonly accepted that this result is a fundamental principle of physics that definitively establishes the link between information and energy. Here, we show that this result cannot be general. In fact it comes (1) from a confusion between logical and thermodynamic irreversibilities, and between logical and thermodynamic states, which is reminiscent of the classic Gibbs paradox about the joining of two volumes of the same gas, and (2) from two unnecessary constraints imposed on the erasure procedure.
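For reference, the bound at issue is usually stated as follows (standard textbook form, not quoted from the paper): erasing one bit lowers the entropy attributed to the memory by k_B ln2, and the second law applied to the memory plus a surrounding reservoir at temperature T is then taken to require

\[ \Delta S_{\mathrm{memory}} = -k_B \ln 2, \qquad \Delta S_{\mathrm{env}} \ge k_B \ln 2 \;\Rightarrow\; Q_{\mathrm{diss}} = T\,\Delta S_{\mathrm{env}} \ge k_B T \ln 2 \quad \text{per erased bit.} \]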

Thermostatistics, information, subjectivity: why is this association so disturbing?

Although information theory resolves inconsistencies of the traditional approach to thermostatistics (known in the form of famous enigmas), its place in the corresponding literature is not the one it deserves. This is interpreted as being due mainly to epistemological rather than scientific reasons: the subjectivity introduced into physics is perceived as a problem. This paper attempts to expose and clarify where exactly this subjectivity lies: in the representation of reality and in probabilistic inference.

On the supposed mass of entropy and that of information

In the theory of special relativity, energy can be found in two forms: kinetic energy and rest mass. The potential energy of a body is actually stored in the form of rest mass, as is interaction energy; temperature is not. Information acquired about a dynamical system can potentially be used to extract useful work from it. Hence the “mass-energy-information equivalence principle” that has recently been proposed. In this paper, it is first recalled that for a thermodynamic system made of non-interacting entities at constant temperature, the internal energy is also constant.
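For context, my reading of the proposal under discussion (not text from this paper) is that it combines E = mc^2 with the Landauer energy k_B T ln2 assigned to one bit, yielding a putative rest mass per stored bit:

\[ m_{\mathrm{bit}} \;=\; \frac{k_B T \ln 2}{c^{2}} . \]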

Thermodynamical versus logical irreversibility: a concrete objection to Landauer's principle

Landauer’s principle states that the logical irreversibility of an operation, such as erasing one bit, necessarily implies its thermodynamical irreversibility, whatever its physical implementation. In this paper, a very simple counterexample of a physical implementation is given (one that uses a two-to-one relation between logical and thermodynamic states) that allows one to erase one bit in a thermodynamically quasistatic manner (i.e. one that may tend to be reversible if slowed down enough).
Figure: a two-to-one implementation of a bit, inspired by my cycling practice; the two bit-values correspond to one single thermodynamic state.
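A minimal sketch of why such an implementation escapes the usual bound, as I read the abstract (the details are in the paper): since both logical values are carried by one and the same thermodynamic state, resetting the bit leaves the thermodynamic entropy of the device unchanged, and the second law then imposes no minimum heat flow to the surroundings,

\[ \Delta S_{\mathrm{device}} = 0 \;\Rightarrow\; \Delta S_{\mathrm{env}} \ge -\Delta S_{\mathrm{device}} = 0 , \]

in contrast with the one-to-one encoding assumed in the usual derivation of the k_B T ln2 bound.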

Plea for the use of the exact Stirling formula in statistical mechanics

In statistical mechanics, what is generally called the Stirling approximation for N! is actually an approximation of Stirling's formula. In this article, it is shown that the term that is dropped is in fact the one that takes fluctuations into account. The use of Stirling's exact formula forces us to reintroduce them into the already proposed solutions of well-known puzzles such as the extensivity paradox or the Gibbs paradox of joining two volumes of identical gas.
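For context, the two expressions being contrasted are, in standard form (not quoted from the article):

\[ \ln N! \;=\; N\ln N - N + \tfrac{1}{2}\ln(2\pi N) + O(1/N) \qquad \text{(Stirling's formula)}, \]
\[ \ln N! \;\approx\; N\ln N - N \qquad \text{(the usual Stirling approximation)}. \]

The dropped term, \tfrac{1}{2}\ln(2\pi N), is the one the article identifies with fluctuations.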

A short derivation of Boltzmann distribution and Gibbs entropy formula from the fundamental postulate

Introducing the Boltzmann distribution very early in a statistical thermodynamics course (in the spirit of Feynman) has many didactic advantages, in particular that of easily deriving the Gibbs entropy formula. In this note, a short derivation is proposed from the fundamental postulate of statistical mechanics and basic calculations accessible to undergraduate students.
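For the reader's convenience, the two results referred to, in their standard form (my rendering, not quoted from the note), are

\[ p_i \;=\; \frac{e^{-E_i/k_B T}}{Z}, \qquad Z = \sum_j e^{-E_j/k_B T} \qquad \text{(Boltzmann distribution)}, \]
\[ S \;=\; -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs entropy formula)}, \]

both obtained starting from the fundamental postulate that all accessible microstates of an isolated system are equally probable.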

What entropy really is: the contribution of information theory

Even today, the concept of entropy is perceived by many as quite obscure. The main difficulty is analyzed as being fundamentally due to the subjectivity and anthropocentrism of the concept, which prevent us from having enough distance to embrace it. However, it is pointed out that the incoherence of certain presentations and certain preconceived ideas do not help. They are of three kinds: 1. axiomatic thermodynamics; 2.