
Kullback-Leibler divergence (relative entropy)

A measure of how one probability distribution differs from a second, reference distribution: the expected extra information needed to encode samples drawn from the true distribution using a code optimized for the other. Central to information theory, it is always non-negative and equals zero if and only if the two distributions are identical. It is not symmetric, so it is not a true distance metric.
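
For discrete distributions P and Q, this is conventionally written as:

\[
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
\]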

Confidence
80%

Status
active

Source

Can information theory improve genetic analysis of complex traits?