entropy

Definition

Unpredictable information. Often used as a secret or as input to a key generation algorithm.
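As a sketch of entropy serving as input to key generation, the snippet below draws unpredictable bytes from the operating system's entropy source using Python's standard-library `secrets` module; treating those bytes directly as key material is an illustrative assumption, not a prescribed scheme.

```python
import secrets

# Draw 32 bytes (256 bits) of unpredictable data from the OS entropy pool.
# These bytes are suitable as secret key material for a 256-bit key.
key_material = secrets.token_bytes(32)

print(len(key_material) * 8)  # 256 bits of key material
```

`secrets` is preferred over the general-purpose `random` module here because it is designed for cryptographic use.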

More on Wikipedia

Entropy

The term entropy also describes the degree of unpredictability of a message, measured in bits. The strength of the randomness determines how difficult it would be for someone else to reproduce the same large random number; this property is called collision resistance.