Definition of Negentropy


Negentropy is a force that keeps chaos away.

The concept of negentropy is not part of the dictionary of the Royal Spanish Academy (RAE). However, it is frequently used to refer to a specific type of entropy.

Entropy

Now, what is entropy? The RAE provides two definitions, both used in the field of physics. Entropy can refer to the measure of the disorder that a system presents, or to the thermodynamic magnitude that indicates the part of the energy that cannot be used to perform work, expressed as the quotient between the heat that an element gives off and its absolute temperature. It is also possible to understand it as the tendency of systems to modify themselves according to their structure.
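The thermodynamic definition above corresponds to a classic textbook formula; as a minimal sketch of that standard relation (the numbers below are illustrative, not taken from this text):

```latex
\Delta S = \frac{Q}{T}
```

For example, a heat exchange of Q = 600 J at an absolute temperature of T = 300 K corresponds to an entropy change of magnitude 600/300 = 2 J/K.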

Negative entropy

Returning to the idea of negentropy: this is the name given to negative entropy, also referred to as syntropy or negantropy. It is the entropy that a system draws in from its surroundings in order to keep its own entropy low.

It is important to mention that, according to systems theory, negentropy is a force that allows chaos to be warded off. While entropy pushes a system toward disorder, which is its most probable state, negentropy aims to keep disorder at bay.

Negentropy appears as a tendency that allows a system to adapt its structure to the levels of organization of its subsystems. In this way the system tries to stabilize itself amid the chaos and, therefore, guarantee its own survival.

Although chaos never disappears entirely, the system can resort to negentropy to confine it within certain bounds. An open system is one that favors the entry of negative entropy in order to reach a new equilibrium.

Distance from normality

In the fields of statistics and information theory, the concept of negentropy appears as a measure of distance from normality: it quantifies how far a signal's distribution is from a Gaussian with the same variance. Negentropy is always non-negative, it is invariant under linear changes of coordinates, and it vanishes only if the signal is Gaussian.
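These properties can be illustrated numerically. The sketch below uses Hyvärinen's well-known approximation of negentropy, J(y) ≈ (E[G(y)] − E[G(ν)])² with G(u) = log cosh(u) and ν a standard Gaussian variable; the function name and sample sizes are our own choices, not anything prescribed by this text.

```python
import numpy as np

def negentropy_approx(x, seed=0, n_ref=200_000):
    # Hyvarinen's approximation: J(y) ~ (E[G(y)] - E[G(nu)])^2,
    # with G(u) = log cosh(u) and nu a standard Gaussian reference.
    y = (x - x.mean()) / x.std()                              # standardize the signal
    nu = np.random.default_rng(seed).standard_normal(n_ref)   # Gaussian reference sample
    g = lambda u: np.log(np.cosh(u))
    return float((g(y).mean() - g(nu).mean()) ** 2)

rng = np.random.default_rng(42)
gaussian = rng.standard_normal(50_000)   # Gaussian signal: negentropy near zero
uniform = rng.uniform(-1.0, 1.0, 50_000) # non-Gaussian signal: clearly positive
print(negentropy_approx(gaussian), negentropy_approx(uniform))
```

Because the approximation is a squared difference, it is non-negative by construction, and standardizing the input makes it insensitive to shifts and rescalings; the Gaussian sample scores close to zero while the uniform sample does not.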

We must also mention the use of negentropy in signal processing, the discipline dedicated to the study and development of techniques to treat, analyze and classify signals, grounded in applied mathematics, statistics and information theory. Here negentropy serves as a measure of non-Gaussianity in independent component analysis (ICA), a method that separates a multivariate signal into additive subcomponents that are statistically independent and non-Gaussian.
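A toy separation in the spirit of ICA can be sketched in a few lines. The sources, mixing matrix and sample size below are hypothetical demo choices, and for simplicity non-Gaussianity is measured by absolute excess kurtosis, a cruder proxy than negentropy itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Two independent, non-Gaussian demo sources.
s1 = np.sign(rng.standard_normal(n))               # binary (+/-1) source
s2 = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), n)   # uniform source, unit variance
S = np.vstack([s1, s2])

A = np.array([[1.0, 0.6],
              [0.4, 1.0]])   # "unknown" mixing matrix
X = A @ S                    # observed mixtures

# Step 1: whiten the mixtures (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Step 2: after whitening, unmixing reduces to a rotation; choose the
# angle that maximizes non-Gaussianity of the outputs.
def non_gaussianity(theta):
    c, s = np.cos(theta), np.sin(theta)
    Y = np.array([[c, -s], [s, c]]) @ Z
    return sum(abs((y ** 4).mean() - 3.0) for y in Y)

best = max(np.linspace(0.0, np.pi / 2, 500), key=non_gaussianity)
c, s = np.cos(best), np.sin(best)
Y = np.array([[c, -s], [s, c]]) @ Z   # recovered sources (up to order, sign, scale)
```

The recovered rows of Y match the original sources up to permutation, sign and scale, which is the standard indeterminacy of ICA; production implementations (e.g. FastICA) use negentropy approximations instead of a grid search over angles.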

Negentropy can be understood intuitively as the information we save when representing a variable that is not completely random: because we already know something about it, fewer bits are needed to represent it efficiently.
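This compression intuition can be made concrete with Shannon entropy, which gives the minimum average number of bits per symbol; the two example distributions below are our own illustration:

```python
import math

def shannon_entropy(p):
    # Shannon entropy in bits of a discrete probability distribution p.
    return -sum(q * math.log2(q) for q in p if q > 0)

fair = [0.5, 0.5]    # a completely random bit needs a full 1 bit per symbol
biased = [0.9, 0.1]  # a predictable bit can be coded in well under 1 bit per symbol
print(shannon_entropy(fair), shannon_entropy(biased))
```

The biased source scores roughly 0.47 bits per symbol: the less random the variable, the shorter the description it admits, which is exactly the gap the negentropy intuition points at.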


Negentropy is also used in statistics and signal processing.

Leon Brillouin

The physicist Léon Nicolas Brillouin was born in France in 1889. He studied at the University of Paris and the University of Munich and was a member of the American National Academy of Sciences. Among his many works and contributions to science, he published Science and Information Theory in 1959, a work in which he presents a study of the relationships between these fields and also touches on negentropy.

In it, Brillouin adopts a point of view typical of physics to relate informational entropy, developed by Shannon, to statistical entropy, developed by Boltzmann, pointing out that language and information represent a negentropic factor, given that through them it is possible to counteract entropy.