Association·arcadia
Chain rule defines joint entropy as sum of conditional entropies
Claim that the chain rule for entropy states that the joint entropy of a set of variables equals the sum of their conditional entropies, i.e., H(A_1, ..., A_n) = Σ_{i=1}^{n} H(A_i | A_1, ..., A_{i-1})
Confidence
90%
active
Evidence Quote
“the joint entropy of a set of variables is the sum of their conditional entropies”
Relationship
Joint entropy equals the sum of conditional entropies given by the chain rule
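The claimed decomposition can be checked numerically. A minimal sketch for the two-variable case, H(X, Y) = H(X) + H(Y | X), using a made-up joint distribution (the distribution values are illustrative assumptions, not from the source):

```python
import math
from collections import Counter

def entropy(dist):
    # Shannon entropy in bits of a probability distribution
    # given as a mapping from outcomes to probabilities.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution over two binary variables (X, Y).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Joint entropy H(X, Y).
h_xy = entropy(joint)

# Marginal distribution of X.
px = Counter()
for (x, y), p in joint.items():
    px[x] += p
h_x = entropy(px)

# Conditional entropy H(Y | X) = sum_x p(x) * H(Y | X = x).
h_y_given_x = 0.0
for xv, p_x in px.items():
    cond = {y: p / p_x for (x, y), p in joint.items() if x == xv}
    h_y_given_x += p_x * entropy(cond)

# Chain rule: joint entropy equals marginal plus conditional entropy.
assert abs(h_xy - (h_x + h_y_given_x)) < 1e-9
```

The same pattern extends to n variables by conditioning each A_i on all preceding variables, as in the claim above.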