[opengm] 181/386: Embarrassingly simple example: one-variable model, with max likelihood learning

Ghislain Vaillant ghisvail-guest at moszumanska.debian.org
Wed Aug 31 08:37:36 UTC 2016


This is an automated email from the git hooks/post-receive script.

ghisvail-guest pushed a commit to branch debian/master
in repository opengm.

commit f25b5d824231cfe81d2fa655d17bfa72df66ff8f
Author: Carsten Haubold <carstenhaubold at googlemail.com>
Date:   Fri Dec 19 16:40:51 2014 +0100

    Embarrassingly simple example: one-variable model, with max likelihood learning
---
 fubar/max_likelihood_example.py | 29 +++++++++++++++++++++++++++++
 1 file changed, 29 insertions(+)

diff --git a/fubar/max_likelihood_example.py b/fubar/max_likelihood_example.py
new file mode 100644
index 0000000..3b361ce
--- /dev/null
+++ b/fubar/max_likelihood_example.py
@@ -0,0 +1,29 @@
+import opengm
+import opengm.learning as learning
+import numpy
+
+# create a simple model with exactly one variable with two labels
+numWeights = 2
+nLabels = 2
+nVars = 1
+
+# set weight ids and features for both labels
+weightIds = numpy.array([[0, 1],       [0, 1]])
+features = numpy.array( [[0.5, -0.25], [-0.5, -1.25]])
+
+# create a dataset holding numWeights weights and retrieve the weight vector
+dataset = learning.createDataset(numWeights)
+weights = dataset.getWeights()
+
+# set up graphical model
+gm = opengm.gm(numpy.ones(nVars)*nLabels)
+fid = gm.addFunction(learning.lUnaryFunction(weights, nLabels, features, weightIds))
+gm.addFactor(fid, [0])
+
+# add graphical model to dataset with ground truth
+ground_truth = numpy.array([0]).astype(opengm.label_type)
+dataset.pushBackInstance(gm, ground_truth)
+
+# set up learner and run
+learner = learning.maxLikelihoodLearner(dataset)
+learner.learn(infCls=opengm.inference.TrwsExternal,  parameter=opengm.InfParam())
\ No newline at end of file
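
For reference, the objective the learner optimizes for this one-variable model can be sketched in plain numpy, with no opengm dependency. This is a hypothetical stand-in, not opengm's implementation: energies are E(l) = features[l] . w, the label distribution is the softmax of the negative energies, and plain gradient ascent maximizes log p(ground_truth). The feature values and ground truth are copied from the diff above.

```python
import numpy as np

# Features per label, copied from the example above: row l is the feature
# vector whose dot product with the weights gives the energy of label l.
features = np.array([[0.5, -0.25],
                     [-0.5, -1.25]])
ground_truth = 0  # the observed label from the example

# Gradient ascent on log p(ground_truth), where
# p(l) = exp(-E(l)) / sum_k exp(-E(k)) and E(l) = features[l] . w.
w = np.zeros(2)
lr = 0.1
for _ in range(200):
    energies = features @ w
    p = np.exp(-energies) / np.exp(-energies).sum()
    # d log p(gt) / dw = -features[gt] + sum_l p(l) * features[l]
    grad = -features[ground_truth] + p @ features
    w += lr * grad

energies = features @ w
p = np.exp(-energies) / np.exp(-energies).sum()
# After training, the ground-truth label dominates the distribution.
print(p[ground_truth])
```

With this setup the learned weights drive the energy of label 0 below that of label 1, so the model assigns most of the probability mass to the ground-truth label, which is exactly what maximum-likelihood learning on a single labeled instance should do.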

-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-science/packages/opengm.git



More information about the debian-science-commits mailing list