Kullback–Leibler divergence (Q255166)

measure of how one probability distribution differs from a second, reference probability distribution
  • information divergence
  • information gain
  • relative entropy
  • KL divergence
  • KLIC
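
For reference, the standard definition of the quantity described above, for discrete distributions P and Q on the same sample space (a general formula for the concept, not taken from this item's statements), is

D_{KL}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

with the sum replaced by an integral over densities in the continuous case. The divergence is non-negative and equals zero exactly when P and Q coincide.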

Statements


Identifiers