Definition of Entropy

Entropy measures the chaos of a system.

The notion of entropy comes from a Greek word that can be translated as "turning" or "transformation" (used in a figurative sense).

In the 19th century, Rudolf Clausius coined the concept in physics to refer to a measure of the disorder observable in the molecules of a gas. Since then, the concept has been used with various meanings in multiple sciences, such as physics, chemistry, computer science, mathematics and linguistics.

Some definitions of entropy

Entropy can refer to the thermodynamic physical quantity that measures the unusable part of the energy contained in a system: that portion of the energy cannot be harnessed to produce work.
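As a brief, hedged illustration (the formula is not given in the text above, but it is the standard one behind Clausius's definition), the entropy change produced by a reversible heat exchange at absolute temperature T is:

$$ dS = \frac{\delta Q_{\mathrm{rev}}}{T} $$

where \delta Q_{\mathrm{rev}} is the heat absorbed by the system in a reversible process.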

Entropy is also understood as a measure of the disorder of a system. In this sense, it is associated with the system's degree of homogeneity.

The entropy of formation of a chemical compound is established from the entropies of its constituent elements. The greater the entropy of formation, the more favorable the compound's formation will be.
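A standard way to see why a larger entropy change favors formation, added here as a hedged aside, is through the Gibbs free energy:

$$ \Delta G = \Delta H - T\,\Delta S $$

A process is thermodynamically favored when \Delta G < 0, so at a given temperature a larger positive \Delta S pushes \Delta G toward negative values.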

Information entropy is also known as Shannon entropy.

The concept and information

In information theory, entropy measures the uncertainty in a set of possible messages, of which only one will be received; equivalently, it quantifies the information needed to reduce or eliminate that uncertainty.
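Formally (a standard definition, written out here for concreteness), the Shannon entropy of a source X that emits each symbol x with probability p(x) is:

$$ H(X) = -\sum_{x} p(x)\,\log_2 p(x) $$

measured in bits when the logarithm is taken in base 2.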

Another way to understand entropy is as the average amount of information carried by the transmitted symbols. Words like "the" or "that" are among the most frequent symbols in a text, yet they are the ones that provide the least information. A message carries the most relevant information, and reaches maximum entropy, when all of its symbols are equally probable.
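A minimal sketch in Python, using only the standard library, makes this concrete; the helper name and the toy messages are illustrative assumptions, not taken from the text:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = sum(p * log2(1/p))."""
    counts = Counter(message)
    total = len(message)
    return sum((c / total) * log2(total / c) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 bits: a single repeated symbol carries no information
print(shannon_entropy("abab"))  # 1.0 bit:  two equally probable symbols
print(shannon_entropy("abcd"))  # 2.0 bits: four equally probable symbols, the maximum for four
```

As the last two lines show, the entropy is highest when every symbol is equally probable.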

Boltzmann's entropy formula shows the link between entropy and the number of ways in which the molecules or atoms of a thermodynamic system can be arranged.
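Written out for reference, the formula relates the entropy S to the number W of microscopic arrangements compatible with the system's macroscopic state, with k_B the Boltzmann constant:

$$ S = k_B \ln W $$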

Entropy in the field of linguistics

The way information is organized and conveyed in a discourse is one of the most relevant and most researchable topics in linguistics, and entropy makes a deeper analysis of communication possible.

In the case of written communication, the problem is simple to analyze (the basic units, the letters, are well defined): the message can be decoded accurately, and both its literal and figurative meanings can be understood. In spoken language, however, things change somewhat and complications appear.

It is not easy to determine the fundamental elements of the code in oral discourse: words sound different depending on who pronounces them and, likewise, can have different meanings. It is not enough, therefore, to classify them into vowel and consonant phonemes, because this would not reveal how the information is organized; if the vowel phonemes are deleted, for example, the message can no longer be understood.

Understanding the code

According to a study carried out at the University of Wisconsin-Madison, a good way to isolate and understand the oral code is through the spectral decomposition of sound signals. This technique seeks to understand how the cochlea filters and analyzes the sound that reaches it; the cochlea is the part of the ear that transforms sounds into electrical signals and sends them directly to the brain.

To carry out this experiment, a measure known as cochlea-scaled spectral entropy (CSE) was used, which relates a signal to the one that precedes it and determines how well a signal can be predicted from the previous one.

The results showed that the more similar two signals are, the easier it is to predict the second, which means that the information it contributes is almost zero. Conversely, the more the signals differ from each other, the more information the second one provides, so eliminating it has considerable consequences for the understanding of the speech.
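As a hedged, simplified sketch (plain spectral entropy of an audio frame, not the exact cochlea-scaled measure used in the study), the Shannon entropy of a normalized power spectrum can be computed as below; a flat, noise-like spectrum yields high entropy, while a pure tone yields low entropy. The function name and the toy signals are illustrative assumptions:

```python
import numpy as np

def spectral_entropy(frame: np.ndarray) -> float:
    """Shannon entropy (in bits) of the frame's normalized power spectrum."""
    power = np.abs(np.fft.rfft(frame)) ** 2   # power in each frequency bin
    p = power / power.sum()                   # normalize into a probability distribution
    p = p[p > 0]                              # drop empty bins before taking logarithms
    return float(-(p * np.log2(p)).sum())

t = np.linspace(0, 1, 1024, endpoint=False)
print(spectral_entropy(np.sin(2 * np.pi * 50 * t)))                  # low: energy concentrated in one bin
print(spectral_entropy(np.random.default_rng(0).normal(size=1024)))  # high: energy spread across bins
```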

Entropy and the principles of thermodynamics

The principles of thermodynamics are four laws that define the fundamental physical quantities used to characterize a thermodynamic system. These laws describe how such systems behave under certain circumstances.

The zeroth law of thermodynamics states that when two systems are each, independently, in thermal equilibrium with a third system, they must also be in thermal equilibrium with each other. This postulate is important for the definition of temperature.

The first law of thermodynamics, for its part, states that a closed system can exchange energy with its surroundings in the form of heat and work, and that these exchanges change the internal energy it stores.
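In symbols (a standard statement, using the convention that W is the work done by the system on its surroundings), the first law reads:

$$ \Delta U = Q - W $$

where \Delta U is the change in the system's internal energy and Q the heat it receives.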

The second law of thermodynamics maintains that the entropy of the universe always shows a tendency to increase.
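In formula form, counting both the system and its surroundings:

$$ \Delta S_{\text{universe}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \geq 0 $$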

Finally, the third law of thermodynamics states that the entropy of a system approaches a constant value as its temperature approaches absolute zero. It should be noted that, at this point, the entropy of a system is usually close to zero, except in solids that are not crystalline.
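Expressed compactly:

$$ \lim_{T \to 0} S = S_0 $$

with S_0 = 0 for a perfectly crystalline solid, which is why non-crystalline solids are the exception noted above.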