Anthropocene or entropocene?
“Anthropy” and “entropy” are pronounced identically in French, and only slightly differently in English [8]. One relates to Man (Homo sapiens); the other designates a thermodynamic quantity. No connection? None, until we had to name the new geological era [26] that we humans are slowly creating (pollution, destruction of living conditions, etc.).
This article is a translation of this one, originally in French.
Outline
I. Deregulatocene, capitalocene and others
II.1. Entropy: a measure of disorder?
II.2. Negentropy?
II.3. Entropy vs complexity
III.1. Entropy: to avoid at all costs?
III.2. Can we consume entropy sparingly?
IV. The entropocene
Conclusion
I. Deregulatocene, capitalocene and others
While some would like to name it after a specific place, or to move towards a more “future-oriented” era (the “Alienocene” of F. Neyrat, or the “Chthulucene” of D. Haraway, from “khthonios”, “of the Earth”), others have proposed the “Plantationocene”, for “plantations in the Americas as a key stage in the transition to a new geological era” [3], or even the “Myxocene” [1] (from “muxa”, “slime”) for dead zones infested with jellyfish and toxic algae*. The Anthropocene itself could be defined in several different ways: the first atomic bomb, the great extinctions, the control of fire [29], the accumulation of plastics, or even… chicken bones [9].
Its definition therefore varies widely, as does its start date, and some scientists are reluctant to baptize this new era [27]. Especially since it will simply be too short for the current definition [10]: geologically, the lifespan of Homo sapiens is a few thousand centuries [28], and its self-destruction in just a few centuries might rather be perceived as a “crisis” between two eras [10].
The suffix –cene is therefore debatable, but the definition of this new era already seems to have broken through in many academic circles [30]. Outside the natural sciences, it is rather “anthropos” (“man” in Greek) that is the subject of debate: are the damages inflicted on the Earth system really attributable in equal measure to the populations of the countries of the South, to the “losers” of globalization, to the victims (past and future) of imperialism, or to the indigenous peoples and hunter-gatherers still existing at various latitudes? [11, 29]
The “Capitalocene” was proposed as a replacement [12], opposing capitalism to the tribal, communal and then feudal systems; but the communist system, without capital accumulation, proved just as destructive of nature.
So what? The “Megalomaniacocene” [11], the age when the madness of a few was imposed on the majority? Or perhaps the “Deregulatocene”, the moment when “the capacities for the progressive acquisition of a particular species were no longer regulated by interactions of this species with the whole biosphere”? [21] This approach suggests not accusing mental structures, “abstractions” (like capitalism), because they would merely be a means of discarding our responsibility, and would obscure important evolutionary mechanisms. However, this vision obscures the fact that certain human behaviors can be more harmful than others, environmentally speaking (“if everyone is guilty, no one is guilty”).
Finally, against or in addition to the Anthropocene, the “entropocene” has been proposed. But to detail its strengths and weaknesses, we must first explain what entropy is.
II.1. Entropy: a measure of disorder?
Entropy is a thermodynamic quantity that quantifies the number of different ways we can arrange elementary particles while ultimately obtaining the same properties on a large scale. This concept has been extensively discussed, because it underlies the laws of thermodynamics: they indicate that, without an additional contribution from an external medium, the universe (and everything in it) tends irreversibly to return to equilibrium (i.e. death), having dissipated its useful energy.
Entropy can even be seen as a measure of uncertainty, of ignorance, of the unpredictable [20]. The counting of ways to arrange the particles of a physical system has also been used (by analogy) in information theory, making it possible to define the entropy of a binary dataset. However, even though entropy has historically been extended to domains other than thermodynamics, a “semantic cleanup” now tends to dissociate it from “disorder” (thus narrowing the definition). We could then define this quantity as what prevents a system from changing (or transforming) spontaneously [46] (see Peter Atkins).
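For reference, the two definitions just alluded to have standard textbook forms (the symbols W, k_B and p_i are not used elsewhere in this article):

```latex
% Boltzmann (thermodynamics): entropy grows with the number W of microscopic
% arrangements compatible with the same macroscopic state (k_B: Boltzmann constant)
S = k_B \ln W

% Shannon (information theory): entropy of a source whose symbols occur
% with probabilities p_i; maximal when all symbols are equally likely
H = -\sum_i p_i \log_2 p_i
```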
II.2. Negentropy?
Definition: the evolution of a system presenting an increasing degree of organization. Synonym: negative entropy.
The Frenchman L. Brillouin, drawing on the work of the American C. Shannon, named “negentropy” the quantity of information that enables physical systems to structure themselves by evacuating their entropy [4]. Similarly, the Austrian E. Schrödinger (of cat fame) noted that life, by self-organizing, “fights” against the 2nd law of thermodynamics (which states that global entropy must increase) by “absorbing negentropy” [18]. But it does so only on a local scale, and thanks to the continued support of the Sun’s nuclear reactions and of Earth’s geological activity. The notion of negentropy is therefore necessarily limited in time or space, or only applies to an open system, because global entropy increases anyway. In this vein, J. C. Maxwell had conceptualized in 1867 “Maxwell’s demon”, a superhuman entity that could separate the molecules of a mixture according to their properties (speed, mass, etc.): this “demon” could thus re-separate the coffee from the milk of your cappuccino, reducing the entropy of the system {cappuccino}. But the information used to reduce the entropy of this (spatially limited) system had to be constituted using an external energy source (by measurements and calculations on a computer, for example), therefore with a creation of entropy (in the form of heat and dissipation) greater than the “negentropy” created in the cappuccino. In any case, the 2nd law of thermodynamics still applies.
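To make the cappuccino example concrete, here is a minimal sketch of the entropy the demon would have to undo, assuming an idealized two-component mixture with an illustrative amount of liquid (real coffee and milk are of course not an ideal solution):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(n_moles: float, x: float) -> float:
    """Ideal entropy of mixing (J/K) of a two-component system,
    where x is the mole fraction of one component (here: coffee)."""
    if x <= 0.0 or x >= 1.0:
        return 0.0  # a pure, fully separated phase has no mixing entropy
    return -n_moles * R * (x * math.log(x) + (1 - x) * math.log(1 - x))

print(mixing_entropy(1.0, 0.5))  # fully mixed 50/50, 1 mol: ~5.76 J/K
print(mixing_entropy(1.0, 0.0))  # separated coffee and milk: 0 J/K
```

To unmix the cup, the demon would have to export at least this much entropy elsewhere, which is the point of the paragraph above.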
II.3. Entropy vs complexity
The entropy of an image is maximal if the image is completely random [31] (therefore maximally disordered); it can then be described very simply (“there are 0s and 1s everywhere”), yet it contains the maximum amount of information (in Shannon’s sense, i.e. in the physical sense). At the opposite extreme, a single-tint image (e.g. all white) contains the minimum possible information, therefore minimal entropy. The intermediate case is the most interesting, because it contains all kinds of complex patterns (see also here for some examples).
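A minimal sketch of this comparison, on toy binary “images” (per-pixel Shannon entropy; note that this naive pixel-wise measure gives a regular checkerboard the same score as pure noise, which is exactly why complexity, discussed next, is a different notion from entropy):

```python
import numpy as np

def shannon_entropy(bits: np.ndarray) -> float:
    """Per-pixel Shannon entropy (bits) of a binary image."""
    p = bits.mean()  # fraction of 1-pixels
    h = 0.0
    for q in (p, 1.0 - p):
        if q > 0:
            h -= q * np.log2(q)
    return h

rng = np.random.default_rng(0)
uniform = np.zeros((64, 64), dtype=int)         # single tint (all "white")
pattern = np.indices((64, 64)).sum(axis=0) % 2  # checkerboard
noise = rng.integers(0, 2, size=(64, 64))       # completely random

for name, img in (("uniform", uniform), ("pattern", pattern), ("noise", noise)):
    print(f"{name}: {shannon_entropy(img):.3f} bits/pixel")
# uniform: 0.000, pattern: 1.000, noise: ~1.000
```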
This is also true in thermodynamics: the fully mixed cappuccino is the most disordered and the most entropic, but not the most complex (because there is the same coffee-milk everywhere). The most complex is the intermediate case, where the turbulence of the mixing process creates complex, fractal structures [22], which are rather pretty.
The information required to describe the system {cappuccino} grows hand in hand with entropy: very simple at the start (bottom: coffee, top: milk), larger in the middle (coffee in islets of milk, and vice versa), and maximal at the end of the mixing (no recognizable redundancy, just a uniform coffee-milk).
Information, even if it carries a negative sign in Shannon’s definition, is indeed a real entropy in the thermodynamic sense: it is the result of increased disorder. The problem is that we do not necessarily have a good representation of “disorder”. Increasing the disorder of a system (and therefore its information) can also mean making it useful and intelligible to us, assembling things together so that other properties are born, by emergence. So information seems to be a regression of entropy (a “negentropy”) only when it becomes non-random, when it becomes something complex enough to be useful, and beautiful. At least for us humans. A machine with infinite computing power (like “Maxwell’s demon”) could probably apprehend the universe by measuring the state of each particle that composes it, but we are forced to find laws, mathematical and physical formulas, to condense information as much as possible and make it intelligible. In doing so, we increase complexity, it makes sense to us, and we find it beautiful [40].
However, we are not looking for maximum complexity, but for a complexity that could be described as “optimal”: we first collect data (e.g. the positions of the planets), then we identify patterns, regularities that can be translated into physical formulas (e.g. Kepler’s laws), and finally we may deduce a general law (e.g. Newton’s universal gravitation) [33]. During this process, we greatly reduce the information (in the physical sense, therefore the entropy), and we reach a small degree of complexity. But these general laws are rarely applied as such in practice: in Newton’s case, for example (2nd law of dynamics), one often prefers a more complex but more practical formula for each particular case (Bernoulli’s law for certain fluids, for instance), slightly increasing the complexity. And to conceptualize these beautiful formulas (and thus reduce entropy in the informational sense), an external energy supply is also needed.
Information (in the human sense) is thus more entropic than a uniform state (of zero entropy, without information), but not too much so (to remain intelligible, it must not tend towards too much disorder). Negentropy therefore seems to take as its baseline the state of maximum entropy, not the state of zero entropy.
To further see that “information (in the physical sense) = entropy”, note that the information required to describe the Universe has been growing since the Big Bang (when it was condensed into a “point”, so that almost zero information sufficed to describe it). Information-entropy is thus indeed growing. But the (human-readable) information may decrease once maximum complexity has been reached. Mathematically, this complexity is Kolmogorov’s (or Solomonoff’s [19]); it is a purely formal definition, but one that fits rather well when it comes to rigorously capturing what we mean by “complexity”**.
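Kolmogorov complexity itself is uncomputable, but compressed size is a standard rough upper-bound proxy for it; a minimal sketch (zlib as the compressor is an arbitrary choice):

```python
import os
import zlib

def description_length(data: bytes) -> int:
    """zlib-compressed size: a crude upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, level=9))

uniform = b"\x00" * 4096      # "all white": a tiny program describes it
pattern = b"\x00\xff" * 2048  # regular structure: still highly compressible
noise = os.urandom(4096)      # random data: essentially incompressible

for name, d in (("uniform", uniform), ("pattern", pattern), ("noise", noise)):
    print(f"{name}: {description_length(d)} bytes")
# uniform and pattern shrink to a few dozen bytes; noise stays near 4096.
```

Unlike the pixel-wise entropy above, this proxy does distinguish the regular pattern from the noise.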
In summary, information in the human sense increases when complex structures are assembled, whether starting from uniform systems or, on the contrary, from random ones. However, in the second case, information in the physical sense (and therefore the entropy) decreases. Summarizing:

– starting from a uniform system (e.g. all white): human-sense information increases, and physical information (entropy) increases too;
– starting from a random system (e.g. noise): human-sense information increases, but physical information (entropy) decreases locally.

This conceptual contradiction comes from the fact that we have many definitions of “information”, sometimes contradictory ones (see [22] p. 280), while information in the physical sense (Shannon’s) is well defined.
Negentropy is thus a rather vague concept, because it does not have the same meaning depending on whether we consider information in a purely physical or in a human sense. We will summarize it as follows: an external energy supply can be used to order things so that new properties emerge (e.g. life), and we can call this new information “negentropy”. By abuse of language, the energy supply itself is sometimes called negentropy. This information may have increased the entropy of the system (creation of pure physical information), or may have decreased it (in which case entropy is still created outside the system, in the form of thermodynamic dissipation); see the summary above.
III.1. Entropy: to avoid at all costs?
Entropy seems to be a rather neutral notion. At first, it allows things to mix and assemble: the appearance of life, for example, which is a certain order (the succession of bases in DNA), yet more disordered than separate heaps of carbon, oxygen and nitrogen atoms (and of every other element). In the end, entropy ends up disorganizing all of that (the mixing becomes too intense), but only after passing the maximum of complexity. To be consumed sparingly, therefore.
Entropy could then be defined by the degree of mixing of a system: at first it creates useful information, then gradually unintelligible information. But this holds only at the global level: a non-isolated system (like the Earth) receives and globally dissipates energy from the Sun in order to reduce, in small islands, its local entropy. It is therefore essential to separate the local from the global: we can have the illusion of lowering entropy at one place on Earth while increasing it globally.
Digital systems, a priori “negentropic”, thus multiply (useful) information while rejecting entropy in another form: CO2, and a dispersion of materials, especially metals (present in very small quantities in the devices). So, contrary to the idea that ICT will be “the solution to the environmental problems that our industrial era has imposed on our planet” [5], it must be understood that, even if the energy source (the Sun) is external to the Earth system, the entropy resulting from the process of information creation remains in the system (in the form of pollution: CO2 and material dispersion). In addition, ICT is a facilitator of the economy, so it also facilitates CO2 emissions [6]. CO2 molecules are very stable and simple, dispersed, and without preferential direction (it is a gas), therefore quite entropic (and “anthropic” for some; here is the problem!). The digital world creates useful information for humans (thus potentially increasing local order), yet we are unable to evacuate its thermodynamic entropy. More broadly, we cannot eliminate the entropy excess that we create in the form of CO2, or of heat…
Indeed, we would give little importance to increasing entropy outside our planet, even exponentially, as long as our Earth system (the only suitable place for our species, which has evolved there for millions of years) does not degrade. We create a large “entropy surplus” because it involves a transformation and dispersion of materials (fossil resources, ores) that is almost instantaneous compared to their formation time (on the order of 10 million years), and compared to their total re-transformation time (e.g. up to 10,000 years for CO2).
We may suffer from an evolutionary bug: with the mastery of fire, we used to reject entropy mainly in the form of heat (and a compensable amount of CO2), drawn from an external source (the Sun), which ecosystems could easily compensate. But since the fossil-fuel era, the release of entropy has been too fast, and we are not accustomed (evolutionarily) to an environment that cannot keep pace in eliminating it.
Entropy also underlies another important principle: irreversibility. In general, the larger (more macroscopic) systems are, the less reversible they are, and therefore the more entropy they create. To increase your speed of travel, you need a machine (a human body, a vehicle) whose energy consumption increases with size, and which therefore produces more and more entropy: friction in a fluid (dissipation in the form of heat) thus grows first in proportion to speed, then to its square, once a certain threshold (called “turbulence”) is exceeded. We can therefore think that limiting the creation of entropy means being more sustainable, having fewer losses. Photosynthesis, for example, is a quasi-reversible process, which minimizes losses.
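Returning to the friction example, here is a small numeric sketch of that scaling, with idealized drag laws and purely illustrative coefficients (chosen so the two regimes meet at the threshold; real flows depend on geometry and viscosity). Since dissipated power is force times speed, losses grow like v² in the laminar regime and like v³ in the turbulent one:

```python
def drag_force(v: float, k_lam: float = 0.5, k_turb: float = 0.05,
               v_crit: float = 10.0) -> float:
    """Drag force (N): linear in v below the turbulence threshold, quadratic above.
    Coefficients are illustrative, not measured values."""
    if v < v_crit:
        return k_lam * v       # laminar: F proportional to v
    return k_turb * v ** 2     # turbulent: F proportional to v^2

for v in (1.0, 5.0, 10.0, 20.0, 40.0):
    f = drag_force(v)
    print(f"v = {v:5.1f} m/s   F = {f:7.1f} N   dissipated power = {f * v:9.1f} W")
```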
But irreversibility is still necessary, because it defines the “arrow of time” [22] (the fact that processes take place in one direction and not the other). A process is irreversible because it produces entropy, and this opens up possible future states of the Universe*** (because information in the physical sense, and uncertainty, increase): it thus avoids absolute determinism of any subsequent state [31, 45]****. We may owe our Free Will (i.e. an ability to act and think freely and independently) to entropy.
In a more philosophical view, one could say that one aspect of the entropocene is precisely an increase in Free Will through a great expenditure of useful energy: to exercise your Free Will ever more (buying a big polluting car, flying to Cancun every year, eating whatever you want all the time, etc.), you always have to create more entropy. Once again, we must be careful not to abuse it: entropy first creates irreversibility in the form of physical information, which allows Free Will and multiplies the possible futures; but beyond a certain threshold, the futures that are interesting for the human species become rare, because unpredictability and physical uncertainty become too high (the quality of the information having deteriorated).
III.2. Can we consume entropy sparingly? (the maximum entropy production principle)
A principle increasingly accepted in the natural sciences states that an organism maximizes the flow of energy passing through it, as this allows it to produce a maximum of entropy. Even if this maximum entropy production principle is not unanimously accepted in the scientific community (both as to its demonstration and its field of application), it nevertheless has some solid foundations and is considered useful, since it explains many phenomena [35, 43, 48].
Increasingly complex systems would therefore dissipate more and more energy, probably because they create more and more information. There is a correlation, but is there real causation? Are systems doomed to dissipate as much energy as possible [47], or is energy dissipation just an emergent characteristic of these systems [48]? The arrow of time seems, in any case, to favor the emergence of the most dissipative possible systems [32].
But this principle actually applies to systems far from equilibrium [36]. In times of abundance, when large energy flows are possible, systems tend to maximize the rejection of entropy and the dissipation of energy. When the flows are blocked, or close to equilibrium (shortages), they rather minimize it [39]. Minimizing energy expenditure close to equilibrium turns out to be equivalent to maximizing dissipation fluxes very far from equilibrium [34]. A process would thus always dissipate the maximum of energy allowed by its capacities and by the energy at its disposal: taking up the preceding example, the turbulent flow of a fluid dissipates more than a laminar flow because “it can afford it” (having a higher speed, or a lower viscosity [35]).
F. Roddier goes even further, asserting that this “maximum energy dissipation principle” could become detrimental to our societies depending on the amount of energy at our disposal: with an abundant source like nuclear fusion, it would probably become imperative to move to a system encompassing more than planet Earth, to avoid an extremely rapid dissipation of energy within a closed system (the Earth), i.e. a total war [36].
IV. The entropocene (B. Stiegler)
Bernard Stiegler takes up the principle of negentropy [24, 25] and extends it to automation. According to him, what is automated becomes closed, so there is destruction of knowledge (in the sense of the French savoir-vivre, savoir-faire and savoir-concevoir: knowing how to live, how to do, and how to conceptualize). In addition, as in thermodynamics, entropy increases without limit in a closed system: automation would therefore be the “engine” of the entropocene, and it would be necessary to return to “knowledge”, which is negentropic because it is open [23].
We now know that negentropy works mostly thanks to a large supply of external energy, and that entropy is created anyway (2nd law of thermodynamics). As for me, I think that the current globalized system considers itself a system closed off from the Earth system, and therefore rejects its entropy into the latter without limit. For this to stop, humans must first understand that they are fully integrated into, and coupled with, their environment.
[addendum: one month after writing these lines, I came across the very good paper by J. H. Robbins (in English), who also introduces these ideas, especially the Entropocene! In particular, the idea that technologies, negentropic and ordered, would have found in us the reservoir into which “to throw” their entropy [44].]
I must put a little disclaimer here: it is absolutely not my goal to state a theory of everything in ecology, a sort of unwelcome scientism that ignores the human and social sciences. The goal is only to expose some mechanisms drawn from basic sciences like physics, which can provide insights and highlight perspectives that deserve to be explored, without claiming to summarize the complexity of the underlying anthropological, sociological and political stakes, and especially without falling into fatalism (a characteristic of mechanistic dogma, to which thermodynamics cannot subscribe!).
Conclusion
We have seen that defining a new geological era, the “anthropo-cene”, raises problems with both its prefix “anthropo-” and its suffix “-cene”. However, it still seems useful to consider this era scientifically, to put a name on the environmental crisis that we are fueling.
We then defined entropy and showed that it is in fact a fairly neutral notion, which goes against the usual caricature of an “increase of chaos” associated with it: entropy enables the arrow of time, and probably Free Will. Besides characterizing the mixing of bodies, its definition also extends to the field of information, resolving a paradox: information in the physical sense is indeed an entropy strictly speaking; it is information in the human sense (intelligible, useful) which can be a “negentropy”. Disorder can therefore decrease by adding information if we start from a disordered, random, mixed system (but it increases if we start from a simple system). This is explained by the notion of complexity, which tends to increase with entropy, but then decreases after a certain threshold: information in the human sense requires a certain complexity to make sense. Planet Earth is the most complex and complete (the only one with craters, volcanoes, a magnetic field, a moon and an atmosphere [28]), which has allowed the emergence of Life. Complexity is, for us, associated with beauty.
Information in any form is therefore associated with entropy, which can be rejected outside the system in the form of various physical pollutions (e.g. CO2, dilution of pollutants, etc.). The whole issue is therefore to define the system properly, knowing that no system on Earth is ever completely isolated, so as not to reject an entropy (that we consider eliminated) “outside”, when in fact it remains within our own close system. The numerous fires ravaging the world today are also indicators of an intense production of entropy.
Finally, an increasingly used principle stipulates that natural systems self-organize to produce ever more entropy, but only in proportion to the useful energy they can process. The entropocene is thus perhaps, paradoxically, the moment when we can use too much cheap energy, when we can exercise our Free Will too freely, oblivious to the consequences, or rather not evolutionarily equipped to perceive them. Nor are we evolutionarily equipped to process the (huge) amount of information that we now create.
Our era therefore does not only mark an excessive production of entropy, but perhaps rather the lack of meaning of this production. For this reason, and for all those we have mentioned, we are indeed in a major crisis (on the geological time scale), and it may be appropriate to call it the “entropocene”.
“The subject I wanted to embrace is immense; for it includes most of the feelings and ideas to which the new state of the world gives birth. Such a subject certainly exceeds my strength; in treating it, I have not managed to satisfy myself.”
(Alexis de Tocqueville)
* to be honest, there are many others: Occidentalocene, Machinocene, Chimicocene, Industrialocene, Énergitocene, Molysmocène, Pyrocène
** Unless we should talk about sophistication? [19]
*** “‘Disorder’ is growing, and that’s precisely what permits complexity to appear and endure for a long time” [22]
**** This freedom comes from an uncertainty as to the evolution of everything, which can undoubtedly be linked to quantum mechanics [45]: it precisely postulates (via the Copenhagen interpretation) that the state of particles is not defined until we measure it. These measurements create information, and may be behind the creation of entropy [31].
Bonus (extension):
Entropy: a word coined by Clausius in 1865 from the Greek en- (“in”) and tropê (“turning”: “change, revolution”, “direction, mode”), from an Indo-European root trek-/trok-, “to twist”. The ending (-ie in French, -y in English) was chosen by analogy with energy. [17]
We could try to broaden “entropo-” to “enterprise” as well, that is to say the moment when certain Homo sapiens created imaginary structures (abstractions) whose strategy is to provide goods or services. We find the principle of “direction”, but mainly that of “transformation”.
This would set the beginning of the “entropocene” well before that of the “Anthropocene”, because agriculture (the worst mistake in the history of humanity, according to the famous phrase of J. Diamond [2]) can, for example, be seen from this point of view, in contrast to a hunter-gatherer lifestyle. Some will probably prefer the start of the cognitive revolution. In any case, “enterprise” does not equal capitalism (nor the entropocene the capitalocene), because enterprise does not presuppose infinite accumulation or private property (in the broad sense), and collectivist modes can also be considered enterprises (in the broad sense). Unfortunately, purists will rightly point out that the prefix is entropo- and not entrepri-. “Enterprise” comes from “to undertake”, literally “to take in hand”: there is a bit of a “change of direction” in it, but the root is still not entirely equivalent.
[in French, there is a play-on-word with “en trop”, and an expansion with “entre-soi”, see the French original article’s end]
Another interesting avenue is the proliferation of information in recent years. “Today, we know everything, but we do nothing.” We have multiplied information [42] (even for dating [41]) and the information channels. We are therefore saturated by it: the brain can no longer follow, and it retreats into simplicity. “Fake news”? It allows information to be multiplied in little time, since there is no need to verify it or to study facts. For N × M units of energy, one can write N × M articles of complexity 1 (fake news), or N articles of complexity M (elaborated ones). We can consider this duel between complexity and disorder as a marker of the entropocene.
References:
[1] : T. E. Hill, “Myxocene” (2015), https://www.goodreads.com/book/show/27406604-myxocene
[2] : J. Diamond, “The Worst Mistake in the History of the Human Race” (1987) http://www.ditext.com/diamond/mistake.html
[3] : https://edgeeffects.net/haraway-tsing-plantationocene/
[4] https://en.wikipedia.org/wiki/Negentropy
[5] P. Jacquet, “Birth of information theory” (from French) https://translate.google.ca/translate?sl=fr&tl=en&u=https%3A%2F%2Fjournals.openedition.org%2Fbibnum%2F568
[6] The Shift Project, “Lean ICT”, https://theshiftproject.org/en/lean-ict-2/
[7] (FR) Patricia Crifo, Michele Debonneuil, Alain Grandjean, « Croissance verte », Novembre 2009.
[8] https://sites.google.com/site/etymologielatingrec/home/a/anthropie
[9] : Bennett et al., “The broiler chicken as a signal of a human reconfigured biosphere”, Royal Society Open Science (2018), https://royalsocietypublishing.org/doi/10.1098/rsos.180325
[10] Visconti, G., “Anthropocene: another academic invention?”, Rend. Fis. Acc. Lincei (2014) 25: 38, https://link.springer.com/article/10.1007/s12210-014-0317-x
[11] R. Dzombak, “Should the age of humans have a geologic name?” (2019), https://massivesci.com/articles/marks-of-the-anthropocene-climate-change-global-warming/
[12] “Anthropocene or Capitalocene?: Nature, History, and the Crisis of Capitalism” (2016), https://tinyurl.com/vgkbh3c
[13] (FR) Vincent Mignerot, “Anthropocène, Capitalocène ou… Dérégulocène ?” (2019), https://vincent-mignerot.fr/anthropocene-capitalocene-ou-deregulocene/
[14] (FR) “Ni anthropocène, ni capitalocène : le problème, c’est le mégalocène”, https://www.partage-le.com/2018/04/30/9279/
[15] Nicolas Hulot & Pablo Servigne & Alexia Soyeux au Climax Festival (2019) https://youtu.be/73smx5C8YHw?t=1141
[16] “Thermodynamics Quotes”, https://todayinsci.com/QuotationsCategories/T_Cat/Thermodynamics-Quotations.htm
[17] (FR) Pierre de Menten de Horne, « Dictionnaire de chimie: Une approche étymologique et historique », (2015) ISBN 978–2804181758, https://tinyurl.com/wo9luqw
[18] Erwin Schrödinger (foreword by Roger Penrose), “What is Life?”, Cambridge University Press (1992), https://tinyurl.com/w829z54
[19] (FR) Lê Nguyen Hoang, “La formule du Savoir” (2018), https://www.goodreads.com/book/show/40529387-la-formule-du-savoir
[20] M. Mohr, “The Mystery of Entropy: How to Measure Unpredictability in Machine Learning” (2019), https://www.inovex.de/blog/the-mystery-of-entropy-how-to-measure-unpredictability-in-machine-learning/
[21] (FR) V. Mignerot et L. Semal, « Anthropocène, Capitalocène ou… Dérégulocène ? » https://www.youtube.com/watch?v=GVXt2o6AGGY
[22] S. Carroll, “The Big Picture: On the Origins of Life, Meaning, and the Universe Itself” (2016), https://tinyurl.com/s98ux4v
[23] “The Cosmic Genesis of Technology” (2009), https://kk.org/thetechnium/the-cosmic-gene/
[24] (FR) B. Stiegler, « De l’entropie à la néguentropie », https://www.youtube.com/watch?v=PW6fA9NjNQI
[25] (FR) B. Stiegler « Sortir de l’anthropocène », Multitudes (2015) https://www.multitudes.net/sortir-de-lanthropocene/
[26] CrashCourse, “The Anthropocene and the Near Future: Crash Course Big History #9”, https://www.youtube.com/watch?v=3WpaLt_Blr4
[27] De Wever, Patrick, and Stan Finney, “The Anthropocene: a Geological or Societal Subject?”, Biodiversity and Evolution, Elsevier (2018), 251–264, https://www.sciencedirect.com/science/article/pii/B9781785482779500140
[28] Bonnet, R.-M., Woltjer, L., “Surviving 1000 Centuries: Can We Do It?”, Springer (2008), https://www.springer.com/gp/book/9780387746333
[29] The Guardian, “The Anthropocene epoch: scientists declare dawn of human-influenced age” (2016) https://www.theguardian.com/environment/2016/aug/29/declare-anthropocene-epoch-experts-urge-geological-congress-human-impact-earth
[30] Nature news, “Humans versus Earth: the quest to define the Anthropocene”, https://www.nature.com/articles/d41586-019-02381-2
[31] Veritasium, “What is NOT random”, https://www.youtube.com/watch?v=sMb00lz-IfE
[32] https://en.wikipedia.org/wiki/Maximum_entropy_thermodynamics
[33] John Harte, “Maximum Entropy and Ecology: A Theory of Abundance, Distribution, and Energetics”, OSEE (2011), https://books.google.ca/books?id=uKEke6zrfBYC
[34] Zupanovic, “The Maximum Entropy Production Principle and Linear Irreversible Processes”, Entropy (2011), https://arxiv.org/abs/1003.3680
[35] Martyushev & Seleznev, “The restrictions of the Maximum Entropy Production Principle” Physica A: Statistical Mechanics and its Applications 410 (2014): 17–21, https://arxiv.org/abs/1311.2068
[36] (FR) F. Roddier, « Le syndrome de la reine rouge » (2012), https://www.institutmomentum.org/wp-content/uploads/2013/10/le-syndr%C3%B4me-de-la-reine-rouge.pdf
[39] Ichiro Aoki, “Entropy Principle for the Development of Complex Biotic Systems — Organisms, Ecosystems, the Earth”, Elsevier Insights (2012), https://tinyurl.com/rjtpjcb
[40] the desire to describe physics using beautiful mathematical equations is perhaps a problem, see https://blog.usejournal.com/has-physics-gone-astray-60cfca8ad62c
[41] Huffpost, “7 Ways To Deal With Dating Burnout” (2013), https://www.huffpost.com/entry/tk-ways-to-deal-with-dating-burnout_b_4059561
[42] Lifehack, “How to Stop Information Overload and Get More Done” (2019), https://www.lifehack.org/articles/productivity/how-to-fight-information-overload.html
[43] England, J. Dissipative adaptation in driven self-assembly. Nature Nanotech 10, 919–923 (2015), https://www.nature.com/articles/nnano.2015.250
[44] J.H. Robbins, “The Entropocene” (2017), http://journals.isss.org/index.php/proceedings61st/article/view/3232
[45] Ettore Majorana, “The value of statistical laws in physics and social sciences”, translated from Scientia, vol. 36 (1942), pp. 58–66, by R. N. Mantegna, Quantitative Finance 5 (2005) 133–140, https://link.springer.com/chapter/10.1007/978-3-540-48095-2_11
[46] https://www.cs.mcgill.ca/~rwest/wikispeedia/wpcd/wp/e/Entropy.htm, Peter Atkins
[47] Here we obviously think of quotes like those of N. Georgescu-Roegen: “It is as if the human species were determined to have a short but exciting life. Let the less ambitious species have a long but uneventful existence.” in The Entropy Law and the Economic Problem (1971).
[48] H. T. Odum, “Environment, Power, and Society for the Twenty-First Century: The Hierarchy of Energy” (2007), https://www.jstor.org/stable/10.7312/odum12886
[49] J. E.J. Schmitz, “The Second Law of Life — Energy, Technology, and the Future of Earth As We Know It” (2007)