Mutual information (MI), also known as information gain, of two disjoint sets of random variables is a measure of the mutual dependence between the two sets. More specifically, it quantifies the "amount of information" obtained about one set of variables by observing the other set.
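
For discrete variables with joint distribution \(\pi\), the standard definition is

\[ MI(K; L) = \sum_{k,l} \pi(k, l) \log_b \frac{\pi(k, l)}{\pi(k)\,\pi(l)}, \]

where \(\pi(k)\) and \(\pi(l)\) are the marginal distributions over K and L and \(b\) is the logarithm base. (The help page does not state a formula, so treating this as the quantity computed here is an assumption.)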

MI(x, K, L, base = 2)

Arguments

x

probability distribution \(\pi\)

K

set of variables

L

set of variables (disjoint with K)

base

base of the logarithm

Value

numeric: the mutual information between K and L (in bits when base = 2)

Details

The higher the value, the stronger the dependence between the two disjoint sets of variables. MI is non-negative and equals zero exactly when the two sets are independent.
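
As a sketch of how such a value arises, the helper below computes MI from a joint probability matrix using the standard formula. Both mi_manual and the matrix representation are hypothetical illustrations, not part of this package's API:

# Hypothetical helper: MI of a joint probability matrix, with rows
# indexed by outcomes of K and columns by outcomes of L.
mi_manual <- function(joint, base = 2) {
  pK <- rowSums(joint)                  # marginal distribution over K
  pL <- colSums(joint)                  # marginal distribution over L
  indep <- outer(pK, pL)                # joint distribution under independence
  terms <- joint * log(joint / indep, base = base)
  sum(terms[joint > 0])                 # convention: 0 * log(0) = 0
}

mi_manual(matrix(0.25, nrow = 2, ncol = 2))  # independent variables: returns 0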

Examples

data(Pi)
MI(Pi, K = "A", L = "B")
#> [1] 0.005802149
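
Changing base only rescales the result by a constant factor (a change of logarithm base); for instance, the same quantity in natural-log units (nats):

MI(Pi, K = "A", L = "B", base = exp(1))
# same value as above multiplied by log(2)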