Multiinformation, sometimes also called dependence tightness or informational content, is the Kullback-Leibler divergence of a distribution with respect to the product of its one-dimensional marginals.

IC(x, base = 2)

Arguments

x

probability distribution \(\pi\)

base

base of the logarithm

Value

numeric

Details

$$I(\pi(K)) = \sum_{x\in X}\pi(x)\log\frac{\pi(x)}{\prod_{i\in K}\pi(x_i)}$$

It is nonnegative, finite, and equals 0 if and only if all variables are mutually independent under the distribution \(\pi\).
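The definition above can be computed directly from a joint probability table. The following is a minimal sketch in Python (not the package's R implementation): it forms the product of the one-dimensional marginals and sums \(\pi(x)\log(\pi(x)/\prod_i\pi(x_i))\). The arrays `pi` and `dep` are illustrative inputs, not the package's `Pi` dataset.

```python
import numpy as np

def multiinformation(joint, base=2):
    # Marginals of a 2-D joint distribution (illustrative 2-variable case)
    p0 = joint.sum(axis=1)       # marginal of the first variable
    p1 = joint.sum(axis=0)       # marginal of the second variable
    prod = np.outer(p0, p1)      # product of one-dimensional marginals
    mask = joint > 0             # convention: 0 * log 0 = 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / prod[mask]))
                 / np.log(base))

# Independent uniform pair: multiinformation is 0
pi = np.array([[0.25, 0.25],
               [0.25, 0.25]])
print(multiinformation(pi))   # → 0.0

# Perfectly correlated pair: multiinformation is 1 bit (base 2)
dep = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(multiinformation(dep))  # → 1.0
```

The second example also illustrates the "if and only if" claim: any dependence makes the joint differ from the product of its marginals, so the divergence becomes strictly positive.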

Examples

data(Pi)
IC(Pi)