Maximum a posteriori (MAP) estimation of a label series with an HMM, using OpenGM's Python interface.
Preparation
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
import opengm
The data is as follows. We will discretize (integerize) and smooth this time-series data.
Data
d = '17.2 19.7 21.6 21.3 22.1 20.5 16.3 18.4 21.0 16.1 17.5 18.5 18.4 18.3 16.0 21.2 18.8 24.3 23.3 20.5 16.9 22.4 20.1 24.5 24.2 22.7 19.6 23.6 23.3 24.6 25.0 24.3 22.2 22.7 19.5 20.5 17.3 17.2 22.0 20.9 21.5 22.3 24.0 22.4 20.2 15.7 20.4 16.3 17.7 14.3 18.4 16.6 13.9 15.2 14.8 15.0 11.5 13.4 13.5 17.0 15.0 17.5 12.3 11.8 14.5 12.4 12.9 15.8 13.8 11.4 6.5 5.9 7.2 5.6 4.6 7.5 8.9 6.6 3.9 5.7 7.3 6.1 6.8 3.1 2.6 7.9 5.2 2.0 4.0 3.4 5.7 8.1 4.7 5.4 5.9 3.6 2.9 5.7 2.1 1.6 2.3 2.4 1.2 4.2 4.2 2.4 5.6 2.5 3.0 6.1 4.9 7.1 5.0 7.2 5.2 5.1 10.4 8.3 6.9 6.8 7.8 4.2 8.0 3.2 7.9 5.9 9.5 6.4 9.2 11.7 11.6 15.5 16.7'
d = np.array([ float(c) for c in d.split()])
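Before building the model, it helps to check the range of the parsed values, since the number of labels chosen below has to cover them. A minimal sketch of that sanity check, using only the first few values of the series for illustration:

```python
import numpy as np

# A few values from the start of the series above, just to illustrate the check
s = '17.2 19.7 21.6 21.3 22.1'
d = np.array([float(c) for c in s.split()])
print(d.shape, d.min(), d.max())  # the label range must cover these values
```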
Now, build the HMM model and execute inference.
Run!
nNodes = d.shape[0]  # number of nodes (time steps)
nLabels = 20  # number of discrete classes
variableSpace = np.ones(nNodes) * nLabels  # number of labels for each node; all the same here
gm = opengm.gm(variableSpace)

# unary
for i in range(nNodes):
    u = np.array([abs(d[i] - j) for j in range(nLabels)])  # data term: absolute difference from each label
    f = gm.addFunction(u)
    gm.addFactor(f, i)

# pairwise
p = 10  # cost when the classes of adjacent nodes differ (0 if they are the same)
pairwise = (np.ones((nLabels, nLabels)) - np.eye(nLabels)) * p  # pairwise (Potts) term: 0 for the same label, p otherwise
f_pw = gm.addFunction(pairwise)
for i in range(nNodes - 1):
    gm.addFactor(f_pw, [i, i + 1])  # edge connecting adjacent nodes; since it is an HMM, the model is a one-dimensional chain

inf = opengm.inference.DynamicProgramming(gm=gm)  # inference algorithm: dynamic programming suffices on a chain
inf.infer()  # run inference
res = inf.arg()  # retrieve the MAP labeling
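The unary costs built node-by-node in the loop above can also be computed in one vectorized step with NumPy broadcasting. A sketch, using a couple of values from the series purely for illustration:

```python
import numpy as np

# Two values from the series, purely for illustration
d = np.array([17.2, 19.7])
nLabels = 20

# U[i, j] = |d[i] - j|: the same data term as the loop above, built in one step
U = np.abs(d[:, None] - np.arange(nLabels)[None, :])
```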
# plot
plt.plot(d, label="data")
plt.plot(res, label="result")
plt.legend()
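If OpenGM is not available, the same chain MAP problem can be solved with a short Viterbi-style dynamic program in plain NumPy. The sketch below (the function name `viterbi_potts` and its implementation are mine, not OpenGM's) uses the same unary cost |d[i] - j| and Potts pairwise cost p as the model above:

```python
import numpy as np

def viterbi_potts(d, nLabels=20, p=10.0):
    """MAP labeling of a chain with unary cost |d[i]-j| and Potts pairwise cost p."""
    labels = np.arange(nLabels)
    U = np.abs(d[:, None] - labels[None, :])  # unary costs, one row per node
    P = p * (1.0 - np.eye(nLabels))           # Potts costs: 0 on the diagonal, p off it
    cost = U[0].copy()                        # best cost of each label at node 0
    back = np.zeros((len(d), nLabels), dtype=int)
    for i in range(1, len(d)):
        total = cost[:, None] + P             # total[k, j]: come from label k, move to j
        back[i] = np.argmin(total, axis=0)    # best predecessor for each label j
        cost = total[back[i], labels] + U[i]
    res = np.zeros(len(d), dtype=int)
    res[-1] = np.argmin(cost)                 # best final label
    for i in range(len(d) - 1, 0, -1):        # backtrack to recover the full path
        res[i - 1] = back[i][res[i]]
    return res
```

With a large p the result sticks to one label unless the data clearly moves away from it; with p = 0 each node is simply rounded to its nearest label independently.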