Commit 2a7eb155 authored by Franck Dary's avatar Franck Dary

Corrected an error in the computation of entropy (changed dot product to Hadamard product)

parent dd21899d
@@ -7,9 +7,9 @@ float NeuralNetworkImpl::entropy(torch::Tensor probabilities)
   if (probabilities.dim() != 1)
     util::myThrow("Invalid probabilities tensor");
-  probabilities = probabilities.unsqueeze(0);
-  auto logProbs = torch::clamp(torch::log(torch::transpose(probabilities, 0, 1)), -10.0, 10.0);
-  logProbs.index({torch::isnan(logProbs)}) = 0.0;
-  return - torch::tensordot(probabilities, logProbs, {0,1}, {0,1}).item<float>();
+  probabilities = torch::clamp(probabilities.unsqueeze(0), 0.00000000001, 1.0);
+  float entropy = torch::sum(probabilities * torch::log(probabilities)).item<float>();
+  return entropy;
 }