
Joint entropy of multiple phenotypes

The joint entropy (uncertainty) of a set of phenotype variables, H(X₁, …, Xₙ): the amount of information required to encode all phenotypes together. It is always less than or equal to the sum of the marginal entropies, H(X₁, …, Xₙ) ≤ H(X₁) + … + H(Xₙ), with equality if and only if the phenotypes are independent.
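The subadditivity property above can be checked numerically. The sketch below (a minimal illustration, not from the source; the sample data and function names are hypothetical) estimates joint and marginal entropies from observations of two binary phenotypes and confirms H(X, Y) ≤ H(X) + H(Y):

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy (in bits) of an empirical distribution given by counts."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical sample: two binary phenotypes observed on the same individuals.
samples = [(0, 0), (0, 1), (1, 0), (1, 1), (0, 0), (1, 1), (0, 0), (1, 1)]

joint = entropy(Counter(samples))              # H(X, Y)
h_x = entropy(Counter(x for x, _ in samples))  # marginal H(X)
h_y = entropy(Counter(y for _, y in samples))  # marginal H(Y)

# Subadditivity: H(X, Y) <= H(X) + H(Y); strict here because the
# two phenotypes co-occur non-independently in this sample.
print(f"H(X,Y) = {joint:.3f}, H(X) + H(Y) = {h_x + h_y:.3f}")
```

Here the phenotypes are correlated, so the joint entropy (≈1.811 bits) falls strictly below the sum of marginals (2 bits); with independent phenotypes the two quantities would coincide.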

Confidence
80%
active

Source

Can information theory improve genetic analysis of complex traits?