
What is the primary goal of the "Kullback-Leibler (KL) divergence" in information theory and statistics?

A. To measure the difference between two probability distributions

B. To calculate the mean squared error

C. To determine the sample size

D. To perform data imputation

Answer: Option A
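
Explanation: The KL divergence D_KL(P || Q) quantifies how one probability distribution P diverges from a reference distribution Q; it is non-negative, equals zero only when the two distributions coincide, and is not symmetric, so it is not a true distance metric. Below is a minimal sketch, assuming NumPy and two discrete distributions given as arrays (the function name kl_divergence and the example values are illustrative only), of how the quantity D_KL(P || Q) = sum_i p_i * log(p_i / q_i) is typically computed.

    # Minimal sketch: KL divergence between two discrete distributions (assumes NumPy).
    import numpy as np

    def kl_divergence(p, q):
        """Return D_KL(p || q) in nats for discrete distributions p and q."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        # Sum only over outcomes where p > 0 (the 0 * log 0 terms contribute nothing).
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    # Hypothetical example values: the divergence is asymmetric and zero when p == q.
    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    print(kl_divergence(p, q))  # small positive value
    print(kl_divergence(p, p))  # 0.0
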

