For `calculate_kl`, I am assuming `mu_q` refers to the variational posterior and `mu_p` refers to the prior.
```python
def calculate_kl(mu_q, sig_q, mu_p, sig_p):
    kl = 0.5 * (2 * torch.log(sig_p / sig_q) - 1 + (sig_q / sig_p).pow(2) + ((mu_p - mu_q) / sig_p).pow(2)).sum()
    return kl
```
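For reference, here is a quick check of my own (not part of the repo): this closed form agrees with `KL(q ‖ p)` as computed by `torch.distributions`, with the posterior passed first and the prior second, and swapping the arguments gives a different value because KL divergence is asymmetric.

```python
import torch
from torch.distributions import Normal, kl_divergence

def calculate_kl(mu_q, sig_q, mu_p, sig_p):
    # Closed-form KL(N(mu_q, sig_q) || N(mu_p, sig_p)), as in metrics.py
    return 0.5 * (2 * torch.log(sig_p / sig_q) - 1
                  + (sig_q / sig_p).pow(2)
                  + ((mu_p - mu_q) / sig_p).pow(2)).sum()

mu_q, sig_q = torch.tensor([0.3]), torch.tensor([0.8])  # posterior q
mu_p, sig_p = torch.tensor([0.0]), torch.tensor([1.0])  # prior p

ours = calculate_kl(mu_q, sig_q, mu_p, sig_p)
ref = kl_divergence(Normal(mu_q, sig_q), Normal(mu_p, sig_p)).sum()
swapped = calculate_kl(mu_p, sig_p, mu_q, sig_q)

print(torch.allclose(ours, ref))     # True: matches KL(q || p)
print(torch.allclose(swapped, ref))  # False: swapped arguments give KL(p || q)
```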
However, in the following code, `prior_mu` is passed as `mu_q`.
```python
def kl_loss(self):
    kl = KL_DIV(self.prior_mu, self.prior_sigma, self.W_mu, self.W_sigma)
    if self.use_bias:
        kl += KL_DIV(self.prior_mu, self.prior_sigma, self.bias_mu, self.bias_sigma)
    return kl
```
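If the intent is the usual ELBO term `KL(q ‖ p)` with `q` the weight posterior, I would have expected the posterior arguments first. Below is a sketch of that call order, under my assumption that `KL_DIV` shares `calculate_kl`'s signature; `ToyBayesianLinear` is a hypothetical stand-in I wrote just to make the snippet runnable, not the repo's `BBBLinear`.

```python
import torch

def KL_DIV(mu_q, sig_q, mu_p, sig_p):
    # Same closed form as calculate_kl in metrics.py
    return 0.5 * (2 * torch.log(sig_p / sig_q) - 1
                  + (sig_q / sig_p).pow(2)
                  + ((mu_p - mu_q) / sig_p).pow(2)).sum()

class ToyBayesianLinear:
    """Minimal hypothetical stand-in, just enough to show the call order."""
    def __init__(self):
        self.prior_mu, self.prior_sigma = torch.tensor(0.0), torch.tensor(1.0)
        self.W_mu = torch.randn(3, 2)
        self.W_sigma = torch.rand(3, 2) + 0.1
        self.bias_mu = torch.randn(3)
        self.bias_sigma = torch.rand(3) + 0.1
        self.use_bias = True

    def kl_loss(self):
        # Posterior (W_mu, W_sigma) first, prior second -> KL(q || p)
        kl = KL_DIV(self.W_mu, self.W_sigma, self.prior_mu, self.prior_sigma)
        if self.use_bias:
            kl += KL_DIV(self.bias_mu, self.bias_sigma,
                         self.prior_mu, self.prior_sigma)
        return kl

layer = ToyBayesianLinear()
print(layer.kl_loss())  # a non-negative scalar, since KL >= 0
```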
Could you check this?
(Snippets referenced above: `PyTorch-BayesianCNN/metrics.py`, lines 27 to 29 at `d93bad5`; `PyTorch-BayesianCNN/layers/BBB_LRT/BBBLinear.py`, lines 75 to 79 at `d93bad5`.)