Entropy
The entropy of an object is a measure of the amount of energy which is unavailable to do work. Entropy is also a measure of the number of possible arrangements the atoms in a system can have. In this sense, entropy is a measure of uncertainty or randomness. The higher the entropy of an object, the more uncertain we are about the states of the atoms making up that object, because there are more possible states to choose from. The second law of thermodynamics says that it takes work to make the entropy of an object or system smaller; without work, entropy can never become smaller – you could say that everything slowly goes to disorder (higher entropy).
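The idea of "counting possible arrangements" can be written as a short formula. Below is Boltzmann's statistical definition of entropy, a standard result from statistical mechanics rather than something stated in this article; the symbols are the conventional ones.

```latex
% Boltzmann's statistical definition of entropy (standard textbook form):
%   S   - entropy of the system
%   k_B - Boltzmann's constant
%   W   - number of microscopic arrangements (microstates) consistent
%         with the macroscopic state of the system
S = k_B \ln W
```

The more arrangements W there are, the larger the logarithm, and therefore the higher the entropy, which matches the idea that higher entropy means more possible states.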
The word entropy came from the study of heat and energy in the period 1850 to 1900. Some very useful mathematical ideas about probability calculations emerged from the study of entropy. These ideas are now used in information theory, statistical mechanics, chemistry and other areas of study.
Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the spreading out of energy until it is evenly distributed. The meaning of entropy is different in different fields. It can mean:
- Information entropy, which is a measure of the information communicated by systems that are affected by data noise (a small worked example follows this list).
- Thermodynamic entropy, which is part of the science of heat energy. It is a measure of how organized or disorganized energy is in a system of atoms or molecules.
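As a small sketch of information entropy, the Python example below computes the Shannon entropy of a probability distribution. The function name `shannon_entropy` and the example probabilities are invented for this illustration; the underlying formula, H = -Σ p·log₂(p), is the standard one.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy (in bits) of a discrete probability distribution.

    Uses the standard formula H = -sum(p * log2(p)), skipping zero
    probabilities because they contribute nothing to the sum.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin has two equally likely outcomes: maximum uncertainty, 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is easier to predict, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # about 0.47

# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0
```

The more unpredictable the outcomes are, the higher the entropy of the message; this is the information-theory counterpart of having more possible arrangements.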
Media
- Rudolf Clausius (1822–1888), originator of the concept of entropy.
- During steady-state continuous operation, an entropy balance applied to an open system accounts for changes in the system's entropy related to heat flow and mass flow across the system boundary (a standard form of this balance is sketched after this list).
- Slow-motion video of a glass cup smashing on a concrete floor. In the very short time of the breaking process, the entropy of the mass making up the glass cup rises sharply, as the matter and energy of the glass disperse.
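As a hedged illustration of the entropy balance mentioned in the caption above, one standard textbook form for an open system operating at steady state is sketched below; the symbols are the conventional ones and are not taken from the article.

```latex
% Steady-state entropy balance for an open system (control volume):
%   \dot{Q}_j     - heat transfer rate across the part of the boundary at temperature T_j
%   \dot{m}_i s_i - entropy carried in by mass flow at each inlet
%   \dot{m}_e s_e - entropy carried out by mass flow at each exit
%   \dot{S}_{gen} - entropy generated inside the system (always >= 0)
0 = \sum_j \frac{\dot{Q}_j}{T_j}
  + \sum_{\mathrm{in}} \dot{m}_i s_i
  - \sum_{\mathrm{out}} \dot{m}_e s_e
  + \dot{S}_{\mathrm{gen}}
```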