Entropy is a notion that comes from a Greek word that can be translated as "transformation" or, figuratively, "a turning back."
In the 19th century, Clausius coined the term in physics to refer to a measure of the disorder observed in the molecules of a gas. Since then the concept has been used, with different meanings, in many fields, including physics, chemistry, computing, mathematics, and linguistics.
Some definitions are:
In thermodynamics, entropy is the physical quantity that measures the unusable part of the energy contained in a system, that is, the part of the energy that cannot be used to produce work.
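As a loose numeric illustration of the thermodynamic definition, the Clausius relation gives the entropy change for a reversible isothermal heat exchange as ΔS = Q / T. The function name and the figures below are illustrative choices, not taken from any specific source:

```python
def entropy_change(heat_joules, temp_kelvin):
    """Clausius relation for a reversible isothermal process:
    delta-S = Q / T, returned in joules per kelvin."""
    return heat_joules / temp_kelvin

# Illustrative numbers: 1000 J of heat absorbed reversibly at 300 K.
print(entropy_change(1000.0, 300.0))  # about 3.33 J/K
```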
Entropy is also understood as a measure of a system's disorder; in this sense, it is associated with the system's degree of homogeneity.
The entropy of formation of a chemical compound is established from the entropies of each of its constituent elements. The higher the entropy of formation, the more favorable the compound's formation.
In information theory, entropy is the measure of the uncertainty that exists before a set of possible messages, of which only one will actually be received. It is a measure of the information needed to reduce or eliminate that uncertainty.
Another way to understand entropy is as the average amount of information carried by the transmitted symbols. Words such as "the" or "what" are among the most frequent symbols in a text, yet they are the ones that convey the least information. A message carries the most relevant information, and maximum entropy, when all of its symbols are equally probable.
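The claims above follow from Shannon's formula H = -Σ p·log₂(p). A minimal sketch (the function name and the example distributions are my own, chosen only to illustrate the point):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
    Zero-probability symbols contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Uniform distribution over 4 symbols: maximum entropy, exactly 2 bits.
print(shannon_entropy([0.25] * 4))

# Skewed distribution (one very frequent symbol, like "the"):
# lower entropy, i.e., less information per symbol on average.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))
```

The uniform case always yields the maximum, log₂(n) bits for n symbols, which is why equiprobable symbols correspond to maximum entropy.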
Entropy in the field of linguistics
The way information is organized and distributed in speech is one of the most relevant research topics in linguistics, and entropy makes a deeper analysis of communication possible.
In written communication the problem is relatively easy to analyze, since the basic units (the letters) are well defined: the message can be decoded accurately, and both literal and figurative meanings can be understood. In oral language, however, things change somewhat and some complications appear.
It is not easy to determine the fundamental elements of the code in oral discourse: words sound different depending on who utters them and, likewise, can carry different meanings. It is therefore not enough to classify them into vowel and consonant phonemes, because that would not reveal how the information is organized; for example, if the vowel phonemes are suppressed, it is no longer possible to understand the message.
According to a study carried out at the University of Wisconsin-Madison, a good way to isolate and understand the oral code is through the spectral decomposition of sound signals. This technique attempts to model how the cochlea filters and analyzes what reaches it; the cochlea is the part of the ear that transforms sounds into electrical signals and sends them directly to the brain.
The experiment used a measure known as "cochlear scale spectral entropy" (CSE), which relates a signal to the one preceding it; that is, it estimates how well a signal can be predicted from the previous one.
The results showed that the more similar two consecutive signals are, the easier it is to predict the second; this means that the information we obtain from the second is almost nil. Conversely, the more the signals differ, the more information the second one provides, so removing it causes considerable damage to the comprehension of the speech.
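This is not the study's actual CSE implementation, but the underlying idea of spectral entropy can be sketched: decompose a signal into its frequency components, treat the normalized power spectrum as a probability distribution, and compute its Shannon entropy. A predictable, concentrated signal (a pure tone) scores low; a spread-out, unpredictable signal (noise) scores high. All names and parameters below are illustrative assumptions:

```python
import math
import random

def power_spectrum(signal):
    # Naive DFT magnitude-squared; fine for short illustrative signals.
    n = len(signal)
    spectrum = []
    for k in range(n // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        im = -sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        spectrum.append(re * re + im * im)
    return spectrum

def spectral_entropy(signal):
    # Normalize the power spectrum into a probability distribution,
    # then take its Shannon entropy in bits.
    power = power_spectrum(signal)
    total = sum(power)
    probs = [p / total for p in power if p > 0]
    return -sum(p * math.log2(p) for p in probs)

# A pure tone concentrates power in one frequency bin -> low entropy;
# random noise spreads power across bins -> higher entropy.
tone = [math.sin(2 * math.pi * 4 * t / 64) for t in range(64)]
random.seed(0)
noise = [random.uniform(-1, 1) for _ in range(64)]
print(spectral_entropy(tone) < spectral_entropy(noise))  # True
```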