Entropy
Entropy is an extensive state quantity defined in thermodynamics and statistical mechanics. In thermodynamics it was originally introduced as an indicator of irreversibility under adiabatic conditions; statistical mechanics later gave it a meaning as a physical quantity representing the microscopic "disorder" of a system. It was further pointed out that entropy is related to the information that can be obtained from a system, and the concept has been applied to information theory as well. Some, such as the physicist E. T. Jaynes, argue that entropy in physics should itself be regarded as an application of information theory.

Entropy has the dimensions of energy divided by temperature, and its SI unit is the joule per kelvin (J/K). Heat capacity is another quantity with the same dimensions. Entropy is generally denoted by the symbol S.

The Boltzmann formula of statistical mechanics, S = k log W, is well known, where W is the number of microscopic states the system can take under fixed energy (and amount of substance, volume, etc.), and the proportionality coefficient k is called the Boltzmann constant. ...
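As an illustrative sketch (not part of the original article), the snippet below evaluates the Boltzmann formula numerically. The value of W is a hypothetical example, and the Gibbs/Shannon form is included only to illustrate the information-theoretic connection mentioned above: for W equally probable microstates it reduces to the Boltzmann formula.

```python
import math

# Exact SI-defined value of the Boltzmann constant, in J/K
k_B = 1.380649e-23

def boltzmann_entropy(W: int) -> float:
    """Boltzmann formula S = k log W for W equally likely microstates."""
    return k_B * math.log(W)

def gibbs_entropy(probs) -> float:
    """Gibbs/Shannon form S = -k * sum(p_i log p_i).

    For a uniform distribution p_i = 1/W over W microstates this
    reduces to k log W, i.e. the Boltzmann formula above.
    """
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

W = 10**6  # hypothetical number of accessible microstates
print(boltzmann_entropy(W))        # S = k log W
print(gibbs_entropy([1 / W] * W))  # same value for the uniform distribution
```

Both print statements output the same entropy (about 1.9e-22 J/K for this W), which is the consistency the two formulas are expected to show when all microstates are equally probable.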