Bayes fitting

Verification of Bayesian linear regression accuracy

Introduction

I studied Bayesian linear regression in PRML (Pattern Recognition and Machine Learning).

Ordinary linear regression overfits when the model gets complex (lots of parameters) ...

And if you add a regularization term, tuning its coefficient is a pain ...

Enter Bayesian linear regression!!

Supposedly it won't overfit even when the model is complex ...

Really??

Let's find out with an experiment!!

Experimental conditions

Ideal curve (green): a sin function; data points are sampled from it with added noise

Bayesian linear regression (red): alpha = 0.005, beta = 10.0

Linear regression (blue)

Linear regression + L2 regularization: λ = 0.001

Two sets of polynomial basis functions are tested: [1, x, x^2, x^3] and [1, x, x^2, ..., x^20] (i.e. M = 3 and M = 20). A minimal sketch of this setup is shown below.
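A minimal sketch of the data generation and design matrix (the noise level, sampling range, and function names here are my own assumptions, not taken from the linked script):

```python
import numpy as np

def make_data(n_samples, noise_std=0.3, seed=0):
    """Sample x uniformly on [0, 1] and return noisy sin(2*pi*x) targets.
    The noise level and seed are assumptions; the post does not state them."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, n_samples)
    t = np.sin(2 * np.pi * x) + rng.normal(0.0, noise_std, n_samples)
    return x, t

def design_matrix(x, M):
    """Polynomial basis [1, x, x^2, ..., x^M] as an (N, M+1) design matrix."""
    return np.vander(x, M + 1, increasing=True)
```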

The full code is here: https://github.com/kenchin110100/machine_learning/blob/master/sampleBAYES.py
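For reference, here is a hedged sketch of the three fits being compared, following the standard PRML posterior formulas S_N^{-1} = alpha*I + beta*Phi^T Phi and m_N = beta * S_N Phi^T t (this is my own reconstruction, not the linked code):

```python
import numpy as np

def bayes_fit(Phi, t, alpha=0.005, beta=10.0):
    """Posterior over the weights (PRML eqs. 3.53-3.54):
    S_N^{-1} = alpha*I + beta*Phi^T Phi,  m_N = beta * S_N Phi^T t."""
    S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

def ols_fit(Phi, t):
    """Ordinary least squares via the pseudo-inverse."""
    return np.linalg.pinv(Phi) @ t

def ridge_fit(Phi, t, lam=0.001):
    """L2-regularized least squares: w = (lam*I + Phi^T Phi)^{-1} Phi^T t."""
    A = lam * np.eye(Phi.shape[1]) + Phi.T @ Phi
    return np.linalg.solve(A, Phi.T @ t)
```

Predicting with the Bayesian fit just means evaluating the basis functions on a grid and taking the posterior mean, e.g. `design_matrix(x_grid, M) @ m_N` with the helpers sketched above.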

Results

Basis functions M = 3

First, M = 3 with 10 data samples: m3sample10.png

Next, with 100 samples: m3sample100.png

With M = 3, there isn't much difference between the methods ...

Basis functions M = 20

Now let's make the model complex by setting M = 20 ...

First, with 10 samples: m20sample10.png

Next, with 100 samples: m20sample100.png

Ohh~~ (L2 regularization holds up pretty well too ...)

Conclusion

The difference really is obvious when the model is complex!!

Considering that with L2 regularization you still have to tune the parameter yourself,

I'm going with Bayes.
