Factor·arcadia
Joint entropy
A measure of the combined uncertainty of two or more random variables; it is at most the sum of their individual entropies, with equality if and only if the variables are independent.
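For reference, the standard definition for two discrete random variables X and Y, together with the subadditivity bound the definition refers to (notation is generic, not taken from the source pub):

H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y) \leq H(X) + H(Y),

with equality exactly when p(x, y) = p(x)\,p(y) for all x and y, i.e., when X and Y are independent.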
Confidence: 90%
Status: active
Source: Can information theory improve genetic analysis of complex traits?