PRML §3.3.1: Reproducing the convergence diagram of the parameter distribution in Bayesian linear regression

Working through PRML at an in-house study group, the color plates make you want to reproduce them properly. This time I reproduced the convergence of the posterior distribution over the parameters from §3.3.1. The figure shows:

  1. The likelihood of the parameters given the most recent observation
  2. The prior / posterior distribution of the parameters
  3. Fits drawn with parameters sampled from the prior / posterior distribution

It is interesting to watch how the posterior distribution converges.

Assumptions

- The input variable $x$ and the target variable $t$ are both one-dimensional
- Fit the linear model $y(x, w) = w_0 + w_1 x$
- The distribution over the two model parameters is a bivariate Gaussian
- The noise is Gaussian
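Before any of the computations we need observations. The values below are assumptions taken from PRML's Fig. 3.7 setup: the true function is $f(x) = -0.3 + 0.5x$ with Gaussian noise of standard deviation 0.2. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, a0=-0.3, a1=0.5, noise_std=0.2):
    """Draw n inputs uniformly from [-1, 1] and add Gaussian noise to the targets."""
    x = rng.uniform(-1, 1, n)
    t = a0 + a1 * x + rng.normal(0, noise_std, n)
    return x, t

x, t = make_data(20)
```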

Required calculation parts

Build the design matrix $\Phi$ from the observed data

import numpy as np

def design_matrix(x):
    # Basis functions phi(x) = (1, x), one row per observation
    return np.array([[1, xi] for xi in x])

Function to compute the mean $m_N = \beta S_N \Phi^{T} t$ after observing N data points

def calc_mn(alpha, beta, x, t):
    Phi = design_matrix(x)
    Sn = calc_Sn(alpha, beta, x)
    return beta * Sn.dot(Phi.T).dot(t)

Function to compute the covariance $S_N = (\alpha I + \beta \Phi^{T} \Phi)^{-1}$ after observing N data points

I = np.identity(2)

def calc_Sn(alpha, beta, x):
    Phi = design_matrix(x)
    return np.linalg.inv(alpha*I + beta*Phi.T.dot(Phi))
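Taken together, the two helpers amount to the matrix operations below. A quick self-contained sanity check with illustrative data ($\alpha = 2.0$ and $\beta = 25$ are the values PRML uses for Fig. 3.7):

```python
import numpy as np

alpha, beta = 2.0, 25.0                 # PRML Fig. 3.7 values
x = np.array([-0.5, 0.1, 0.8])          # illustrative observations
t = np.array([-0.55, -0.25, 0.1])

Phi = np.array([[1, xi] for xi in x])   # design matrix
# Posterior covariance S_N and mean m_N, same formulas as above
Sn = np.linalg.inv(alpha * np.identity(2) + beta * Phi.T.dot(Phi))
mn = beta * Sn.dot(Phi.T).dot(t)
```

The covariance comes out 2x2, symmetric, and positive definite, as a Gaussian covariance must be.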

Likelihood of observed data

Since the observations carry Gaussian noise with precision $\beta$, the log-likelihood of a single observation is

$$
\log L(w) = -\frac{\beta}{2} \left( t - w^{T}\phi(x) \right)^2 + \mathrm{const}
$$

We evaluate this over the range of $w$ we want to plot.

def calc_likelihood(beta, t, x, w):
    """Log-likelihood of a single observation (x, t) for parameters w."""
    w = np.array(w)
    phi_x = np.array([1, x])  # basis functions (1, x)
    return -beta / 2 * (t - w.T.dot(phi_x))**2

def plot_likelihood(beta, t, x, title='', ax=None):
    """Plot the log-likelihood of the latest observation over a (w0, w1) grid."""
    w0 = np.linspace(-1, 1, 100)
    w1 = np.linspace(-1, 1, 100)
    W0, W1 = np.meshgrid(w0, w1)
    L = [[calc_likelihood(beta, t, x, [w0i, w1i]) for w1i in w1] for w0i in w0]

    # Transpose so rows run over w1 and columns over w0, matching meshgrid
    ax.pcolor(W0, W1, np.array(L).T, cmap=plt.cm.jet, vmax=0, vmin=-1)
    ax.set_xlabel('$w_0$')
    ax.set_ylabel('$w_1$')
    ax.set_title(title)
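With a single noiseless observation, this log-likelihood attains its maximum of 0 exactly at the generating parameters. A small standalone check (the observation values are illustrative, and the formula is inlined so the snippet runs on its own):

```python
# One observation generated without noise from w = (-0.3, 0.5)
beta = 25.0
x = 0.4
t = -0.3 + 0.5 * x

def log_likelihood(w0, w1):
    # Same formula as the function above, inlined for a standalone check
    return -beta / 2 * (t - (w0 + w1 * x))**2

at_truth = log_likelihood(-0.3, 0.5)   # residual is zero, so this is the maximum
elsewhere = log_likelihood(0.6, -0.4)  # any other point scores strictly lower
```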

Posterior probability

The bivariate Gaussian with the $m_N$ and $S_N$ obtained from the formulas above is the posterior distribution of $w$; with $m_0$ and $S_0$ it is the prior distribution.

import scipy.stats

def plot_probability(mean, cov, title='', ax=None):
    """Plot a bivariate Gaussian density over a (w0, w1) grid."""
    w0 = np.linspace(-1, 1, 100)
    w1 = np.linspace(-1, 1, 100)
    W0, W1 = np.meshgrid(w0, w1)
    P = [[scipy.stats.multivariate_normal.pdf([w0i, w1i], mean, cov) for w1i in w1] for w0i in w0]

    # Transpose so rows run over w1 and columns over w0, matching meshgrid
    ax.pcolor(W0, W1, np.array(P).T, cmap=plt.cm.jet)
    ax.set_xlabel('$w_0$')
    ax.set_ylabel('$w_1$')
    ax.set_title(title)

Data space

From the bivariate Gaussian with the $m_N$ and $S_N$ obtained above, we draw $w$ six times and plot the corresponding lines $y(x, w)$.
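A sketch of that data-space panel. The $m_N$ and $S_N$ here are placeholder values standing in for the ones computed by calc_mn and calc_Sn after observing data:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend so the sketch runs anywhere
import matplotlib.pyplot as plt

# Placeholder posterior parameters; in the notebook these come from
# calc_mn and calc_Sn after observing the data.
mn = np.array([-0.3, 0.5])
Sn = np.array([[0.04, 0.0], [0.0, 0.04]])

rng = np.random.default_rng(0)
ws = rng.multivariate_normal(mn, Sn, size=6)  # six samples of (w0, w1)

xs = np.linspace(-1, 1, 50)
fig, ax = plt.subplots()
for w0, w1 in ws:
    ax.plot(xs, w0 + w1 * xs, color='red')    # one line per sampled w
ax.set_xlabel('$x$')
ax.set_ylabel('$y$')
```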

Result

Before observing any data: index.png

After 1 observation: index2.png

After 8 observations: index3.png

Done. The full code is posted on GitHub: https://github.com/hagino3000/public-ipynb/blob/master/PRML/PRML%203.3.ipynb
