Function File: mutualinformation (xy)
Computes the mutual information of the given channel transition matrix. By definition, the mutual information between X and Y is

     I(X; Y) = SUM over x,y of P(x,y) * log2( P(x,y) / (P(x) * P(y)) )
             = relative_entropy( P(x,y) || P(x) * P(y) )
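
For illustration, the sum above can be evaluated directly once the joint probabilities are known. The following is a minimal Octave sketch, not the package's actual implementation; the helper name mutualinformation_sketch is illustrative, and it assumes its argument already holds the joint probabilities P(x,y) rather than the conditional transition probabilities the package function expects.

     ## Minimal sketch: mutual information of a joint probability matrix.
     ## Assumes xy(i,j) = P(x = i, y = j) and sum (xy(:)) == 1.
     function I = mutualinformation_sketch (xy)
       px = sum (xy, 2);          # marginal P(x), column vector
       py = sum (xy, 1);          # marginal P(y), row vector
       pxpy = px * py;            # product distribution P(x) * P(y)
       nz = xy > 0;               # treat 0 * log2(0 / q) as 0
       I = sum (xy(nz) .* log2 (xy(nz) ./ pxpy(nz)));
     endfunction

For example, a binary symmetric channel with crossover probability 0.25 and a uniform input has the joint matrix [0.375 0.125; 0.125 0.375], for which the sketch returns about 0.1887 bits (1 minus the binary entropy of 0.25).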
Mutual information is the amount of information one variable carries about the other; equivalently, it is the reduction in uncertainty about one variable given knowledge of the other. It is a symmetric function of X and Y.

See also: entropy, conditionalentropy