Analogously to mutual information, one can also compute conditional mutual information: given three pairwise disjoint groups of variables and a joint probability distribution over them, conditional mutual information measures the dependence between the first two groups conditionally on the third. The higher its value, the stronger the conditional dependence between the respective groups of variables; a value of zero means the groups are conditionally independent.
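As a rough illustration of the quantity being computed, the following sketch evaluates I(K; L | M) directly from a joint probability table via the identity I(X; Y | Z) = sum p(x,y,z) log2( p(x,y,z) p(z) / (p(x,z) p(y,z)) ). The array `p` and the variable names are illustrative only, not part of this package's API.

```r
# Illustrative joint distribution over three binary variables X, Y, Z,
# stored as a 2 x 2 x 2 array; here uniform, so X and Y are
# conditionally independent given Z and the result should be 0.
p <- array(1/8, dim = c(2, 2, 2))

cmi <- 0
for (x in 1:2) for (y in 1:2) for (z in 1:2) {
  pz   <- sum(p[, , z])     # marginal p(z)
  pxz  <- sum(p[x, , z])    # marginal p(x, z)
  pyz  <- sum(p[, y, z])    # marginal p(y, z)
  pxyz <- p[x, y, z]        # joint p(x, y, z)
  if (pxyz > 0)
    cmi <- cmi + pxyz * log2(pxyz * pz / (pxz * pyz))
}
cmi
#> [1] 0
```

Replacing `log2` with a logarithm in another base corresponds to the `base` argument below.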

conditionalMI(x, K, L, M, base = 2)

Arguments

x

a joint probability distribution over the variables

K

set of variables

L

set of variables (disjoint with K)

M

set of variables (disjoint with K, L)

base

base of the logarithm

Value

a numeric value, the conditional mutual information of K and L given M

Examples

data(coins)
conditionalMI(coins, K = "X", L = "Y", M = "Z")
#> [1] 2