[opengm] 125/386: bugfix: GradientAccumulator is not supposed to initialize the gradient to zero

Ghislain Vaillant ghisvail-guest at moszumanska.debian.org
Wed Aug 31 08:35:50 UTC 2016


This is an automated email from the git hooks/post-receive script.

ghisvail-guest pushed a commit to branch debian/master
in repository opengm.

commit d3b1ab9837b36e1207421bb831a40363a0dc8554
Author: Jan Funke <funke at ini.ch>
Date:   Thu Dec 18 15:40:04 2014 +0100

    bugfix: GradientAccumulator is not supposed to initialize the gradient to zero
---
 include/opengm/learning/gradient-accumulator.hxx | 6 +-----
 1 file changed, 1 insertion(+), 5 deletions(-)

diff --git a/include/opengm/learning/gradient-accumulator.hxx b/include/opengm/learning/gradient-accumulator.hxx
index 746718c..92f4c0f 100644
--- a/include/opengm/learning/gradient-accumulator.hxx
+++ b/include/opengm/learning/gradient-accumulator.hxx
@@ -40,11 +40,7 @@ public:
 	GradientAccumulator(ModelWeights& gradient, const ConfigurationType& configuration, Mode mode = Add) :
 		_gradient(gradient),
 		_configuration(configuration),
-		_mode(mode) {
-
-		for (size_t i = 0; i < gradient.numberOfWeights(); i++)
-			gradient[i] = 0;
-	}
+		_mode(mode) {}
 
 	template <typename Iterator, typename FunctionType>
 	void operator()(Iterator begin, Iterator end, const FunctionType& function) {

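The rationale behind the change, in a minimal self-contained sketch (using simplified stand-ins for opengm's Weights and GradientAccumulator, not the real API): the learning code can construct one accumulator per training sample, all adding into the same gradient buffer, so zeroing the gradient in the constructor would wipe out the contributions already accumulated from earlier samples. Zeroing is the caller's job, done once before the loop:

    #include <cstddef>
    #include <iostream>
    #include <vector>

    // Simplified stand-in for opengm's weight vector.
    struct ModelWeights {
        std::vector<double> w;
        explicit ModelWeights(std::size_t n) : w(n, 0.0) {}
        double& operator[](std::size_t i) { return w[i]; }
        std::size_t numberOfWeights() const { return w.size(); }
    };

    // Simplified stand-in for GradientAccumulator: adds one sample's
    // contribution into a shared gradient. If its constructor reset
    // the gradient to zero, each new sample would discard the sums
    // accumulated by the previous ones.
    class GradientAccumulator {
    public:
        explicit GradientAccumulator(ModelWeights& gradient)
            : _gradient(gradient) {}

        void accumulate(const std::vector<double>& contribution) {
            for (std::size_t i = 0; i < contribution.size(); i++)
                _gradient[i] += contribution[i];
        }

    private:
        ModelWeights& _gradient;
    };

    int main() {
        ModelWeights gradient(2);

        // The caller zeroes the gradient exactly once, up front...
        for (std::size_t i = 0; i < gradient.numberOfWeights(); i++)
            gradient[i] = 0;

        // ...then accumulates over several samples, constructing a
        // fresh accumulator for each one.
        std::vector<std::vector<double> > samples = {{1.0, 2.0}, {3.0, 4.0}};
        for (const auto& s : samples) {
            GradientAccumulator acc(gradient);
            acc.accumulate(s);
        }

        // Prints "4 6": the sum over both samples survives.
        std::cout << gradient[0] << " " << gradient[1] << "\n";
    }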
-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-science/packages/opengm.git