KL Divergence - How to tell how different two distributions are

Correction (10:26): The probabilities shown at this point are wrong. The correct ones are:
- Die 1: 0.4^4 * 0.2^2 * 0.1^1 * 0.1^1 * 0.2^2
- Die 2: 0.4^4 * 0.1^2 * 0.2^1 * 0.2^1 * 0.1^2
- Die 3: 0.1^4 * 0.2^2 * 0.4^1 * 0.2^1 * 0.1^2

Kullback-Leibler (KL) divergence is a way to measure how far apart two probability distributions are. In this video, we learn KL divergence in a simple way, using a probability game with dice.

Shannon entropy and information gain: https://www.youtube.com/watch?v=9r7FIXEAGvs&t=1066s&pp=ygUic2hhbm5vbiBlbnRyb3B5IGluZm9ybWF0aW9uIHRoZW9yeQ%3D%3D

Grokking Machine Learning book: www.manning.com/books/grokking-machine-learning
40% discount code: serranoyt
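For reference, a minimal Python sketch of the dice game described above. It assumes (based only on the corrected products, not on the video itself) that each die has five faces whose probabilities are the bases of the exponents, and that the exponents 4, 2, 1, 1, 2 are the counts of each face in a 10-roll sample. It computes the probability of that sample under each die and the KL divergence between the distributions.

```python
import numpy as np

# Face probabilities read off the corrected products above (the bases).
# This reading is an assumption; each row sums to 1.
die_1 = np.array([0.4, 0.2, 0.1, 0.1, 0.2])
die_2 = np.array([0.4, 0.1, 0.2, 0.2, 0.1])
die_3 = np.array([0.1, 0.2, 0.4, 0.2, 0.1])

# Observed counts of each face (the exponents), i.e. a 10-roll sample.
counts = np.array([4, 2, 1, 1, 2])

# Probability of the observed sample under each die,
# e.g. 0.4^4 * 0.2^2 * 0.1^1 * 0.1^1 * 0.2^2 for Die 1.
for name, die in [("Die 1", die_1), ("Die 2", die_2), ("Die 3", die_3)]:
    print(name, np.prod(die ** counts))

# KL divergence: D(p || q) = sum_i p_i * log(p_i / q_i).
def kl_divergence(p, q):
    return np.sum(p * np.log(p / q))

# The empirical distribution of the sample (counts / 10) happens to
# equal die_1, so these values measure how far Die 2 and Die 3 are
# from the distribution that best explains the sample.
print(kl_divergence(die_1, die_2))
print(kl_divergence(die_1, die_3))
```

Note that KL divergence is not symmetric: kl_divergence(die_1, die_2) generally differs from kl_divergence(die_2, die_1).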