KL Divergence
Linear Digressions

Kullback-Leibler divergence, or KL divergence, measures the information lost when you approximate one distribution with another. It comes to us originally from information theory, but today underp…
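As a quick illustration of the idea in the episode, here is a minimal sketch (not from the show itself) of the discrete KL divergence, D(P || Q) = Σ p(x) log(p(x)/q(x)), using a made-up coin-flip example:

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(P || Q) in nats.

    p and q are lists of probabilities over the same outcomes.
    Terms where p(x) == 0 contribute nothing, by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Information lost when approximating a biased coin P with a fair coin Q:
p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # positive: the approximation loses information
print(kl_divergence(p, p))  # zero: no loss when the distributions match
```

Note that KL divergence is not symmetric: D(P || Q) generally differs from D(Q || P), which is why it is a divergence rather than a true distance.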