MI.Rd
Mutual information (MI), also known as information gain, of two disjoint sets of random variables is a measure of the mutual dependence between the two groups of variables. More specifically, it quantifies the "amount of information" obtained about one set of variables by observing the other set.
MI(x, K, L, base = 2)
x    | probability distribution π
K    | set of variables
L    | set of variables (disjoint with K)
base | base of the logarithm (default 2)
numeric
The higher the value, the stronger the dependence between the two disjoint sets of variables.
#> [1] 0.005802149
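For illustration only (this is not the package's implementation), the quantity computed by `MI` can be sketched in Python directly from the definition MI(K; L) = Σ π(k, l) · log_base(π(k, l) / (π(k) π(l))), where π is the joint distribution over the two variable groups. The joint tables below are made-up examples:

```python
import numpy as np

def mutual_information(joint, base=2):
    """Mutual information of the row group and column group of a joint table."""
    joint = np.asarray(joint, dtype=float)
    pk = joint.sum(axis=1, keepdims=True)   # marginal over the row variables (K)
    pl = joint.sum(axis=0, keepdims=True)   # marginal over the column variables (L)
    mask = joint > 0                        # skip zero cells: 0 * log 0 is taken as 0
    ratio = joint[mask] / (pk @ pl)[mask]   # pi(k,l) / (pi(k) * pi(l))
    return float(np.sum(joint[mask] * np.log(ratio)) / np.log(base))

# Independent groups: the joint factorizes, so MI is 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0

# Dependent groups: MI is strictly positive.
print(mutual_information([[0.4, 0.1], [0.1, 0.4]]))
```

A larger value indicates stronger dependence, consistent with the Value section above; MI is 0 exactly when the two groups are independent.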