Computes the Shannon entropy of a probability distribution. The entropy quantifies the expected amount of information contained in the distribution.
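
For a distribution p, the Shannon entropy is H(p) = -sum_i p_i * log_base(p_i). A minimal sketch of this computation (an illustration of the formula, not the package's implementation; the helper name shannon_entropy is hypothetical):

shannon_entropy <- function(p, base = 2) {
  p <- p[p > 0]                  # drop zero-probability outcomes (0 * log 0 is taken as 0)
  -sum(p * log(p, base = base))  # expected information in the chosen log base
}
shannon_entropy(c(0.5, 0.5))     # 1 bit for a fair coin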

entropy(x, base = 2)

Arguments

x

a probability distribution, given as a numeric vector of probabilities

base

base of the logarithm (default = 2)

Value

a numeric value: the entropy of x in the given logarithm base

Examples

data(Pi)
entropy(Pi)
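
A further usage sketch with an explicit distribution (the vector p below is illustrative and not part of the package):

p <- c(0.5, 0.25, 0.25)     # an illustrative probability distribution
entropy(p)                  # entropy in bits (base 2 by default)
entropy(p, base = exp(1))   # the same entropy in nats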