Mutual Information
The mutual information between two discrete random variables X and Y is defined to be

I(X;Y) = sum_(x in X) sum_(y in Y) P(x,y) log_2[P(x,y)/(P(x)P(y))]
(1)

bits.
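As a minimal illustration (not part of the original entry), equation (1) can be evaluated directly from a joint probability table; the binary distribution below is hypothetical, chosen only so the sum is easy to check by hand.

import math

# Hypothetical joint distribution P(x,y) for binary X and Y
# (an illustrative example, not from the entry).
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions P(x) and P(y), obtained by summing out
# the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Equation (1): I(X;Y) = sum_(x,y) P(x,y) log_2[P(x,y)/(P(x)P(y))],
# skipping zero-probability cells where the summand is taken as 0.
I = sum(p * math.log2(p / (px[x] * py[y]))
        for (x, y), p in joint.items() if p > 0)
print(round(I, 4))  # 0.2781 bits for this table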
Additional properties are

I(X;Y) = I(Y;X),
(2)

I(X;Y) >= 0,
(3)

and

I(X;Y) = H(X) + H(Y) - H(X,Y),
(4)

where H(X) is the entropy of the random variable X and H(X,Y) is the joint entropy of these variables.
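As a second minimal sketch (again not from the entry), identity (4) can be verified numerically on the same hypothetical table: H(X) + H(Y) - H(X,Y) must agree with the value computed from equation (1).

import math

# Same hypothetical joint distribution as above.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    # Shannon entropy in bits: H = -sum p log_2 p over nonzero entries.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginals P(x) and P(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Mutual information via equation (1).
I = sum(p * math.log2(p / (px[x] * py[y]))
        for (x, y), p in joint.items() if p > 0)

# Equation (4): H(X) + H(Y) - H(X,Y) equals I(X;Y).
print(round(entropy(px) + entropy(py) - entropy(joint), 4))  # 0.2781
print(round(I, 4))                                           # 0.2781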
See also
Entropy

This entry contributed by Erik G. Miller
Cite this as:
Miller, Erik G. "Mutual Information." From MathWorld--A Wolfram Resource, created by Eric W. Weisstein. https://mathworld.wolfram.com/MutualInformation.html