Anthropocene or entropocene?

“Anthropy” and “entropy” are pronounced the same in French, and a bit differently, though still similarly, in English [8]. One relates to Man (Homo sapiens), the other designates a thermodynamic quantity. No connection? Indeed, none, until we had to name the new geological era [26] that we humans are slowly creating (pollution, destruction of living conditions, etc.).

This article is a translation of this one, originally in French.

I. Deregulatocene, capitalocene and others
II.1. Entropy: a measure of disorder?
II.2. Negentropy?
II.3. Entropy vs complexity
III.1. Entropy: to avoid at all costs?
III.2. Can we consume entropy sparingly?
IV. The entropocene

I. Deregulatocene, capitalocene and others

Its definition therefore varies widely, as does its start date: some scientists are consequently reluctant to baptize this new era [27]. Especially since it would simply be too short under the current definition [10]: geologically, the lifespan of Homo sapiens is a few thousand centuries [28], and its self-destruction in just a few centuries could be perceived more as a “crisis” between two eras [10].
The suffix -cene is therefore debatable, but the definition of this new era seems to have already broken through in many academic circles [30]. Outside the natural sciences, it is rather “anthropos” (“man” in Greek) that is the subject of debate: are the damages inflicted on the Earth system really attributable in equal measure to the populations of the countries of the South, to the “losers” of globalization, to the victims (past and future) of imperialism, or to the indigenous peoples and hunter-gatherers still living at various latitudes? [11, 29]

The “Capitalocene” has been proposed as a replacement [12], contrasting capitalism with the tribal, communal and then feudal systems that preceded it; but the communist system, without capital accumulation, proved just as destructive of nature.
So what, then? The “Megalomaniacocene” [11], the age when the madness of a few was imposed on the majority? Or perhaps the “Deregulatocene”, the moment when “the capacities for progressive acquisition of a particular species were no longer regulated by the interactions of this species with the whole biosphere”? [21] This approach suggests not blaming mental structures, “abstractions” (like capitalism), because they would merely be a means of evading our responsibility, and would obscure important evolutionary mechanisms. However, this vision obscures the fact that certain human behaviors can be more harmful than others, environmentally speaking (“if everyone is guilty, no one is guilty”).
Finally, against or in addition to the anthropocene, the “entropocene” has been proposed. But to detail its strengths and weaknesses, we must first explain what entropy is.


II.1. Entropy: a measure of disorder?

(Left) A parody of the Monopoly game (found online) presenting C.P. Snow’s formulation of the laws of thermodynamics. (Right) A more detailed explanation of each point (adapted from [16]).

Entropy could even be seen as a measure of uncertainty, of ignorance, of the unpredictable [20]. The ways of arranging particles in a physical system have also been used (by analogy) in information theory, making it possible to define the entropy of a binary dataset. However, even though entropy has historically been extended to domains other than thermodynamics, a “semantic cleaning” tends to dissociate it more and more from “disorder” (thus narrowing the definition). We could then define this quantity as what prevents a system from changing (or transforming) spontaneously [46] (see Peter Atkins).
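As a small illustration of that definition for a binary dataset (the example strings below are invented, not from the article), Shannon's entropy H = −Σ pᵢ log₂ pᵢ measures how unpredictable each symbol is:

```python
from collections import Counter
from math import log2

def shannon_entropy(data: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(data)
    n = len(data)
    if len(counts) <= 1:
        return 0.0  # a single repeated symbol is perfectly predictable
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("00000000"))  # 0.0 -> fully ordered, no uncertainty
print(shannon_entropy("00110101"))  # 1.0 -> balanced 0s and 1s, maximal uncertainty
```

Note that a perfectly alternating string such as "01010101" also scores 1 bit/symbol here: this per-symbol measure ignores ordering, which is precisely why entropy and complexity diverge (see section II.3).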

II.2. Negentropy?

II.3. Entropy vs complexity

This is also true in thermodynamics: a fully mixed cappuccino is the most disordered and the most entropic, but not the most complex (since milk and coffee are the same everywhere). The most complex state is the intermediate one, where the turbulence of the mixing process creates complex, fractal structures [22], which are rather pretty (see below).

MinutePhysics — Where Does Complexity Come From? (Big Picture Ep. 3/5) — YouTube

The information required to describe the {cappuccino} system grows hand in hand with entropy: very simple at the start (coffee at the bottom, milk on top), larger in the middle (islets of coffee in the milk, and vice versa…), and maximal at the end of mixing (no recognizable redundancy, just a uniform coffee-milk blend).
Information, even though it carries a negative sign in Shannon’s definition, is indeed a genuine entropy in the thermodynamic sense: it is the result of increased disorder. The problem is that we do not necessarily have a good mental picture of “disorder”. Increasing the disorder of a system (and therefore its information) can also mean making it useful and intelligible to us, assembling things together so that new properties are born by emergence. So information seems to be a regression of entropy (a “negentropy”) only when it becomes non-random, complex enough to be useful, and beautiful. At least for us humans. A machine with infinite computing power (like Maxwell’s demon) could probably apprehend the universe by measuring the state of each particle composing it, but we are forced to find laws, mathematical and physical formulas, to condense information as much as possible and make it intelligible. In doing so, we increase complexity, it makes sense to us, and we find it beautiful [40].

However, we are not looking for maximum complexity, but for a complexity that could be called “optimal”: we first collect data (e.g. the positions of the planets), then we identify patterns, regularities that can be translated into physical formulas (e.g. Kepler’s laws), and finally we may deduce a general law (e.g. Newton’s law of universal gravitation) [33]. During this process, we greatly reduce the information (in the physical sense, hence the entropy), and reach a small degree of complexity. But these general laws are rarely applied as such in practice: in Newton’s case, for example (the second law of dynamics), a more complex but more practical formula is often preferred for each particular case (e.g. Bernoulli’s equation for certain fluids), slightly increasing the complexity. To arrive at these beautiful formulas (and thus reduce entropy in the information sense), an external energy supply is also needed.
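To make the condensation step concrete, here is a minimal sketch (the orbital values are rounded textbook figures, not taken from the article): a table of planetary observations collapses into Kepler's third law, T² ≈ a³, with T in years and a in astronomical units.

```python
# Approximate orbital data: semi-major axis a (AU), period T (years).
planets = {
    "Mercury": (0.387, 0.241),
    "Venus":   (0.723, 0.615),
    "Earth":   (1.000, 1.000),
    "Mars":    (1.524, 1.881),
    "Jupiter": (5.203, 11.862),
}

# Kepler's third law: T^2 / a^3 is (nearly) the same constant for every
# planet, so one short relation replaces the whole table of measurements.
for name, (a, T) in planets.items():
    print(f"{name:8s} T^2/a^3 = {T**2 / a**3:.3f}")
```

Ten measured numbers reduce to a single constant plus one relation: this is exactly the information reduction described above.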
Information (in the human sense) is thus more entropic than a uniform state (of zero entropy, without information), but not too much so (to remain intelligible, it must not tend toward too much disorder). Negentropy therefore seems to take as its baseline the state of maximum entropy, not the state of zero entropy.
To further grasp that “information (in the physical sense) = entropy”, note that the information required to describe the Universe has been growing since the Big Bang (when it was condensed into a “point”, so zero information described it). Thus, information-entropy is indeed growing. But information (in the human-readable sense) may decrease once maximum complexity has been reached. Mathematically, this complexity is Kolmogorov’s (or Solomonoff’s [19]); it is a purely formal definition, but one that fits rather well when it comes to rigorously pinning down what we mean by “complexity”. **
In summary, information in the human sense increases when complex structures are assembled from uniform systems or, on the contrary, from random ones. In the second case, however, information in the physical sense (and therefore entropy) decreases (see table below). This conceptual contradiction comes from the fact that we have many definitions of “information”, sometimes contradictory ones (see [22] p. 280), whereas information in the physical sense (Shannon’s) is well-defined.

increase or decrease of useful information when entropy increases
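Kolmogorov complexity is uncomputable in general, but the size of a losslessly compressed description gives a practical upper bound. A minimal sketch of the table's idea (the test strings, and the use of zlib as a complexity proxy, are my assumptions, not the article's):

```python
import random
import zlib

def description_size(data: bytes) -> int:
    """Compressed size in bytes: a crude upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

uniform = b"0" * 10_000                       # ordered: near-zero entropy
structured = (b"01" * 50 + b"10" * 50) * 50   # repeated pattern: intermediate
random.seed(0)
noise = bytes(random.getrandbits(8) for _ in range(10_000))  # near-maximal entropy

# The random string needs the longest description, the uniform one the
# shortest; "useful" structure sits in between.
for name, s in [("uniform", uniform), ("structured", structured), ("noise", noise)]:
    print(f"{name:10s} {description_size(s):5d} bytes")
```

The ordering (uniform < structured < noise) mirrors the table: maximal physical information (the noise) is the hardest to describe, yet the least useful to a human reader.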

Negentropy is also a rather vague concept, because it does not have the same meaning depending on whether we consider information in a purely physical or in a human sense. We will summarize it as follows: an external energy supply can be used to order things so that new properties emerge (e.g. life), and we can call this new information “negentropy”. By abuse of language, the energy supply itself is sometimes called negentropy. This information may have increased the entropy of the system (creation of pure physical information), or may have decreased it (in which case entropy is still created outside the system in the form of thermodynamic dissipation); see the table above.

III.1. Entropy: to avoid at all costs?

Digital systems, which are a priori “negentropic”, thus multiply (useful) information while rejecting entropy in other forms: CO2, and a dispersion of materials, especially metals (present in very small quantities in the devices). So, contrary to the idea that ICT will be “the solution to the environmental problems that our industrial era has imposed on our planet” [5], it must be understood that, even if the energy source (the Sun) is external to the Earth system, the entropy resulting from the process of information creation remains in the system (in the form of pollution: CO2 + material dispersion). In addition, ICT is a facilitator of the economy, so it also facilitates CO2 emissions [6]; CO2 is a very stable and simple molecule, dispersed and without preferential direction (it is a gas), therefore quite entropic (and “anthropic” for some, hence the problem!). We are unable to evacuate the (thermodynamic) entropy of the digital world, even though it creates useful information for humans (and thus potentially increases local order). More broadly, we cannot eliminate the entropy excess that we create in the form of CO2 or heat.

It is indeed certain that we would care little about increasing entropy outside our planet, even exponentially, as long as our Earth system (the only place suitable for our species, which has evolved there for millions of years) does not degrade. We create a large “entropy surplus” because it involves a transformation and dispersion of materials (fossil resources, ores) that is almost instantaneous compared to their formation time (10 million years) and to their total re-transformation time (e.g. up to 10,000 years for CO2).
We may have an evolutionary bug: with the mastery of fire, we used to reject entropy mainly in the form of heat (and a compensable amount of CO2), derived from an external source (the Sun), which the ecosystems could easily compensate. But since the fossil-fuel era, the release of entropy has been too fast, and we are not accustomed (evolutionarily) to an environment that cannot keep pace in eliminating it.

We should also mention that entropy defines another important principle: irreversibility. In general, the larger (more macroscopic) a system, the less reversible it is, and therefore the more entropy it creates. To increase your speed of movement, you need a machine (a human body, a vehicle) whose energy consumption increases with size, and which therefore produces more and more entropy: friction in a fluid (dissipation in the form of heat) grows first in proportion to speed, then to its square, once a certain threshold (the onset of “turbulence”) is exceeded. We can therefore think that limiting the creation of entropy means being more sustainable, having fewer losses. Photosynthesis, for example, is a quasi-reversible process that minimizes losses.
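A small numerical sketch of that scaling (the coefficients and threshold are invented for illustration): below a critical speed the drag force grows linearly with v, above it quadratically, so the dissipated power P = F·v jumps from v² to v³ and entropy production accelerates.

```python
def drag_force(v: float, k_lam: float = 1.0, k_turb: float = 0.5,
               v_crit: float = 10.0) -> float:
    """Illustrative piecewise drag model: linear (laminar) below v_crit,
    quadratic (turbulent) above it. Coefficients are arbitrary."""
    return k_lam * v if v < v_crit else k_turb * v**2

def dissipated_power(v: float) -> float:
    """Power lost to friction, i.e. the rate at which entropy is rejected
    to the environment (up to a factor of temperature)."""
    return drag_force(v) * v

# In the turbulent regime, doubling the speed multiplies losses by 8 (v^3):
print(dissipated_power(40.0) / dissipated_power(20.0))  # 8.0
# In the laminar regime, it only multiplies them by 4 (v^2):
print(dissipated_power(8.0) / dissipated_power(4.0))    # 4.0
```

The steeper turbulent scaling is why large, fast machines are the least reversible systems in the paragraph's sense.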
But irreversibility is still necessary, because it defines the “arrow of time” [22] (the fact that processes take place in one direction and not in the other). A process is irreversible because it produces entropy, and this opens up possible future states of the Universe*** (since information in the physical sense, and uncertainty, increase): it thus avoids absolute determinism of every subsequent state [31, 45]****. We may owe our Free Will (i.e. an ability to act and think freely and independently) to entropy.
From a more philosophical angle, one could say that one aspect of the entropocene is precisely an increase in Free Will through a great expenditure of useful energy: to exercise your Free Will ever more (buying a big polluting car, flying to Cancun every year, eating whatever you want all the time, etc.), you always have to create more entropy. Once again, we must be careful not to abuse this: entropy first creates irreversibility in the form of physical information, which allows Free Will and multiplies the possible futures; but beyond a certain threshold, futures that are interesting for the human species become rare, because unpredictability and physical uncertainty become too high (the quality of the information having deteriorated).

III.2. Can we consume entropy sparingly? (the maximum entropy production principle)

(Left) Energy dissipation as a function of the evolution of structures (Eric Chaisson), showing an increasing rate of entropy production. (Right) Entropy production over the lifetime of a living being (hypothesis). Fig. 8.1 of [39].

But this principle actually applies to systems far from equilibrium [36]. In times of abundance, when energy flows are possible, we tend to maximize entropy rejection and energy dissipation. When the flows are blocked, or close to equilibrium (shortages), we rather minimize them (see the right-hand curve above) [39]. Equivalently, minimizing energy expenditure close to equilibrium is equivalent to maximizing dissipation fluxes very far from equilibrium [34]. A process would thus always dissipate as much energy as its capacities, and the energy at its disposal, allow (returning to the earlier example, the turbulent flow of a fluid dissipates more than a laminar flow because “it can afford it”, having a higher speed or a lower viscosity [35]).
F. Roddier goes even further, asserting that this “maximum energy dissipation principle” will be more or less detrimental to our societies depending on the amount of energy we have at our disposal: with an abundant source such as nuclear fusion, it would probably be imperative to move to a system encompassing more than planet Earth, to avoid a very rapid dissipation of energy within a closed system (the Earth), i.e. total war [36].

IV. The entropocene, B. Stiegler

[Addendum: one month after writing these lines, I came across the very good paper by J.H. Robbins (in English), who also introduces these ideas, especially the Entropocene! In particular the idea that technologies, negentropic and ordered, would have found in us the reservoir into which “to throw” their entropy [44].]

I must add a small disclaimer here: it is absolutely not my goal to state a theory of everything for ecology, a sort of unwelcome scientism that ignores the human and social sciences. The goal is only to expose some mechanisms drawn from basic sciences such as physics, which can provide insights and highlight perspectives that deserve to be explored, without claiming to summarize the complexity of the underlying anthropological, sociological and political stakes, and above all without falling into fatalism (a characteristic of mechanistic dogma to which thermodynamics cannot subscribe!).


“The subject I wanted to embrace is immense; for it includes most of the feelings and ideas to which the new state of the world gives birth. Such a subject certainly exceeds my strength; in treating it, I have not managed to satisfy myself.”
(Alexis de Tocqueville)

* to be honest, there are many others: Occidentalocene, Machinocene, Chimicocene, Industrialocene, Énergitocene, Molysmocène, Pyrocène
** Unless we should talk about sophistication? [19]
*** “Disorder is growing, and that’s precisely what permits complexity to appear and endure for a long time” [22]
**** This freedom comes from an uncertainty about the evolution of everything, which can undoubtedly be linked to quantum mechanics [45]: it postulates precisely (via the Copenhagen interpretation) that the state of particles is not defined until we measure it. These measurements create information, and may be behind the creation of entropy [31].

Bonus (extension):

[in French, there is a play-on-word with “en trop”, and an expansion with “entre-soi”, see the French original article’s end]

Another interesting avenue is the proliferation of information in recent years. “Today, we know everything, but we do nothing.” We have multiplied information [42] (even for dating [41]) and the information channels. We are therefore saturated with it: the brain can no longer follow, and it retreats into simplicity. “Fake news”? It lets you multiply information in little time, since there is no need to verify it or to study facts. For an energy budget of N × M, one can write N × M articles of complexity 1 (fake news), or N articles of complexity M (elaborate ones). We can consider this duel between complexity and disorder as a marker of the entropocene.


[2] J. Diamond, “The Worst Mistake in the History of the Human Race” (1987)

[5] P. Jacquet, “Birth of information theory” (translated from French)

[6] The Shift Project, “Lean ICT”

[7] (FR) Patricia Crifo, Michèle Debonneuil, Alain Grandjean, « Croissance verte » (November 2009)

[9] Bennett et al., “The broiler chicken as a signal of a human reconfigured biosphere”, Royal Society Open Science (2018)

[10] G. Visconti, “Anthropocene: another academic invention?”, Rend. Fis. Acc. Lincei (2014) 25: 38

[11] R. Dzombak, “Should the age of humans have a geologic name?” (2019)

[12] “Anthropocene or Capitalocene? Nature, History, and the Crisis of Capitalism” (2016)

[13] (FR) Vincent Mignerot, « Anthropocène, Capitalocène ou… Dérégulocène ? » (2019)

[14] (FR) « Ni anthropocène, ni capitalocène : le problème, c’est le mégalocène »

[15] Nicolas Hulot, Pablo Servigne & Alexia Soyeux at the Climax Festival (2019)

[16] “Thermodynamics Quotes”

[17] (FR) Pierre de Menten de Horne, « Dictionnaire de chimie : une approche étymologique et historique » (2015), ISBN 978-2804181758

[18] Erwin Schrödinger (foreword by Roger Penrose), “What is Life?”, Cambridge University Press (1992)

[19] (FR) Lê Nguyên Hoang, « La formule du savoir » (2018)

[20] M. Mohr, “The Mystery of Entropy: How to Measure Unpredictability in Machine Learning” (2019)

[21] (FR) V. Mignerot et L. Semal, « Anthropocène, Capitalocène ou… Dérégulocène ? »

[22] S. Carroll, “The Big Picture: On the Origins of Life, Meaning, and the Universe Itself” (2016)

[23] “The Cosmic Genesis of Technology” (2009)

[24] (FR) B. Stiegler, « De l’entropie à la néguentropie »

[25] (FR) B. Stiegler, « Sortir de l’anthropocène », Multitudes (2015)

[26] CrashCourse, “The Anthropocene and the Near Future: Crash Course Big History #9”

[27] P. De Wever and S. Finney, “The Anthropocene: a Geological or Societal Subject?”, in Biodiversity and Evolution, Elsevier (2018), pp. 251–264

[28] R.M. Bonnet, L. Woltjer, “Surviving 1000 Centuries: Can We Do It?”, Springer (2008)

[29] The Guardian, “The Anthropocene epoch: scientists declare dawn of human-influenced age” (2016)

[30] Nature News, “Humans versus Earth: the quest to define the Anthropocene”

[31] Veritasium, “What is NOT Random?”

[33] John Harte, “Maximum Entropy and Ecology: A Theory of Abundance, Distribution, and Energetics”, OSEE (2011)

[34] Zupanovic et al., “The Maximum Entropy Production Principle and Linear Irreversible Processes”, Entropy (2011)

[35] Martyushev & Seleznev, “The restrictions of the Maximum Entropy Production Principle”, Physica A: Statistical Mechanics and its Applications 410 (2014): 17–21

[36] (FR) F. Roddier, « Le syndrome de la reine rouge » (2012)

[39] Ichiro Aoki, “Entropy Principle for the Development of Complex Biotic Systems: Organisms, Ecosystems, the Earth”, Elsevier Insights (2012)

[40] The desire to describe physics using beautiful mathematical equations is perhaps a problem, see

[41] HuffPost, “7 Ways To Deal With Dating Burnout” (2013)

[42] Lifehack, “How to Stop Information Overload and Get More Done” (2019)

[43] J. England, “Dissipative adaptation in driven self-assembly”, Nature Nanotech 10, 919–923 (2015)

[44] J.H. Robbins, “The Entropocene” (2017)

[45] Ettore Majorana, “The value of statistical laws in physics and social sciences”, translated from Scientia, vol. 36 (1942), pp. 58–66, by R.N. Mantegna, Quantitative Finance 5 (2005) 133–140

[46] Peter Atkins

[47] Here we obviously think of quotes like that of N. Georgescu-Roegen: “It is as if the human species were determined to have a short but exciting life. Let the less ambitious species have a long but uneventful existence.” in The Entropy Law and the Economic Problem (1971).

[48] H.T. Odum, “Environment, Power, and Society for the Twenty-First Century: The Hierarchy of Energy” (2007)

[49] J.E.J. Schmitz, “The Second Law of Life: Energy, Technology, and the Future of Earth As We Know It” (2007)

Low-tech, ecosystem-based solutions, evolutionary biology, energy/climate…
