Association·arcadia
Mutual information equals KL divergence of joint vs independent distributions
The mutual information between two random variables equals the Kullback-Leibler (KL) divergence between their joint distribution and the product of their marginals.
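In symbols (a standard rendering of the identity; the notation is an addition, not from the source record): for random variables $A$ and $B$ with joint distribution $P_{A,B}$ and marginals $P_A$, $P_B$,

$$I(A;B) \;=\; D_{\mathrm{KL}}\!\left(P_{A,B} \,\middle\|\, P_A \otimes P_B\right) \;=\; \sum_{a,b} P_{A,B}(a,b)\,\log \frac{P_{A,B}(a,b)}{P_A(a)\,P_B(b)}$$

(discrete case; the continuous case replaces the sum with an integral).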
Confidence
90%
active
Evidence Quote
“the mutual information between A and B is the information lost by assuming that A and B are distributed independently”
Relationship
Mutual information equals Kullback-Leibler divergence (relative entropy)
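A minimal numerical sketch of the claim in Python, assuming a toy 2x2 joint distribution (the values in p_joint and the helper h are illustrative assumptions, not taken from the source):

import numpy as np

# Toy joint distribution P(A, B) over two binary variables (illustrative values).
p_joint = np.array([[0.3, 0.1],
                    [0.2, 0.4]])

# Marginals P(A) and P(B), and their product (the distribution under an
# independence assumption).
p_a = p_joint.sum(axis=1)
p_b = p_joint.sum(axis=0)
p_indep = np.outer(p_a, p_b)

# KL divergence D(P_joint || P_A x P_B), in nats.
kl = np.sum(p_joint * np.log(p_joint / p_indep))

# Mutual information via entropies: I(A;B) = H(A) + H(B) - H(A,B).
def h(p):
    p = p[p > 0]  # ignore zero-probability cells
    return -np.sum(p * np.log(p))

mi = h(p_a) + h(p_b) - h(p_joint.ravel())

print(kl, mi)  # both ~= 0.0863 nats on this example

The two quantities agree, which also makes the evidence quote concrete: the divergence measures exactly how much information is lost by replacing the joint distribution with the product of its marginals.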