Glossary

Information entropy

In information theory, the average amount of information (surprise) contained in a message, measured in bits: H = −Σpᵢlog₂(pᵢ), where pᵢ is the probability of the i-th symbol. Musical information entropy quantifies the predictability of pitch, rhythm, and harmonic sequences: highly predictable music has low entropy; highly random music has high entropy.
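The formula above can be sketched in a few lines of Python. This is an illustrative example, not part of the glossary source: it estimates symbol probabilities from frequency counts in a note sequence and applies H = −Σpᵢlog₂(pᵢ).

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Average information per symbol, in bits: H = -sum(p_i * log2(p_i))."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A repetitive (predictable) melody vs. a varied one (note names are illustrative):
predictable = ["C", "C", "C", "C", "C", "C", "C", "G"]
varied      = ["C", "D", "E", "F", "G", "A", "B", "C"]

print(shannon_entropy(predictable))  # low entropy: one note dominates
print(shannon_entropy(varied))       # higher entropy: notes are near-uniform
```

The repetitive melody scores well under 1 bit per note, while the varied one approaches log₂(8) = 3 bits, matching the intuition that predictable music carries less surprise per event.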