Glossary

Entropy

A measure of disorder or uncertainty in a system. In information theory, entropy quantifies the average information content of a message, which makes it central to understanding both physical and informational systems. *First introduced: Ch. 6*
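The information-theoretic definition can be sketched as Shannon entropy, H = −Σ pᵢ log₂ pᵢ, measured in bits. A minimal illustration (the function name `shannon_entropy` is chosen here for clarity, not taken from the text):

```python
import math

def shannon_entropy(probs):
    """Average information content, in bits, of a message whose
    symbols occur with the given probabilities."""
    # Terms with p == 0 contribute nothing (lim p*log2(p) -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries one full bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))  # → 1.0

# A heavily biased coin is more predictable, so its entropy is lower:
print(shannon_entropy([0.9, 0.1]))
```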
