Path: blob/master/notebooks/book2/18/gp_kernel_opt.ipynb
GP kernel parameter optimization / inference
Slightly modified from
https://tinygp.readthedocs.io/en/latest/tutorials/modeling.html
Decouple the train and test parts of the model.
Explicitly use x and y variables by passing them as function arguments rather than using them globally.
Data
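The data cell is not shown here; as a hedged stand-in, the tinygp modeling tutorial that this notebook follows uses noisy one-dimensional training data and a dense test grid, which could be generated along these lines (the sinusoidal form and all constants below are assumptions for illustration):

```python
import numpy as np

def make_data(n_train=50, n_test=100, noise=0.5, seed=42):
    # Hypothetical stand-in for the notebook's data cell: noisy sine
    # observations at random inputs, plus a dense grid for prediction.
    rng = np.random.default_rng(seed)
    x = np.sort(rng.uniform(0.0, 10.0, n_train))
    y = np.sin(x) + noise * rng.normal(size=n_train)
    x_test = np.linspace(0.0, 10.0, n_test)
    return x, y, x_test

x, y, x_test = make_data()
```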
Optimizing hyper-parameters using flax & optax
We find the maximum (marginal) likelihood hyperparameters for the GP model.
To set up our model, we define a custom linen.Module and optimize its parameters as follows:
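The core computation the module wraps is the GP marginal log likelihood as a function of the kernel hyperparameters. As a dependency-light sketch of the same idea, here is the negative marginal log likelihood for an RBF kernel written in plain JAX, minimized with hand-rolled gradient descent standing in for the notebook's optax optimizer (the parameterization and all starting values are assumptions):

```python
import jax
import jax.numpy as jnp

def neg_log_marginal_likelihood(params, x, y):
    # Negative GP marginal log likelihood under an RBF kernel.
    # The unconstrained log-parameters mirror what a linen.Module
    # would register as trainable parameters.
    amp = jnp.exp(params["log_amp"])
    scale = jnp.exp(params["log_scale"])
    noise = jnp.exp(params["log_noise"])
    d2 = (x[:, None] - x[None, :]) ** 2
    K = amp**2 * jnp.exp(-0.5 * d2 / scale**2) + noise**2 * jnp.eye(len(x))
    L = jnp.linalg.cholesky(K)
    alpha = jax.scipy.linalg.cho_solve((L, True), y)
    return 0.5 * y @ alpha + jnp.sum(jnp.log(jnp.diag(L))) + 0.5 * len(x) * jnp.log(2 * jnp.pi)

# Toy data (an assumption; the notebook uses its own data cell).
x = jnp.linspace(0.0, 10.0, 40)
y = jnp.sin(x) + 0.3 * jax.random.normal(jax.random.PRNGKey(0), (40,))

# Plain gradient descent stands in for the optax update loop.
params = {"log_amp": jnp.array(0.0), "log_scale": jnp.array(0.0), "log_noise": jnp.array(-1.0)}
nll0 = neg_log_marginal_likelihood(params, x, y)
grad_fn = jax.jit(jax.grad(neg_log_marginal_likelihood))
for _ in range(300):
    g = grad_fn(params, x, y)
    params = {k: params[k] - 0.01 * g[k] for k in params}
nll1 = neg_log_marginal_likelihood(params, x, y)
```

In the notebook this objective lives inside the linen.Module's `__call__`, so the same gradients flow through `flax` parameter pytrees and an `optax` optimizer state instead of the manual update above.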
Our Module, defined above, also returns the conditional predictions, which we can compare to the true model:
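The conditional predictions come from standard GP conditioning, which is what tinygp's `gp.condition(y, x_test)` computes under the hood. A self-contained sketch of that computation (the data and hyperparameter values are assumptions for illustration):

```python
import jax
import jax.numpy as jnp

def rbf(a, b, amp, scale):
    # RBF (squared-exponential) kernel matrix between two 1-D input sets.
    return amp**2 * jnp.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / scale**2)

def condition(x, y, x_test, amp, scale, noise):
    # Standard GP conditioning: posterior mean and covariance on x_test
    # given noisy observations (x, y).
    K = rbf(x, x, amp, scale) + noise**2 * jnp.eye(len(x))
    L = jnp.linalg.cholesky(K)
    Ks = rbf(x_test, x, amp, scale)
    alpha = jax.scipy.linalg.cho_solve((L, True), y)
    mean = Ks @ alpha
    cov = rbf(x_test, x_test, amp, scale) - Ks @ jax.scipy.linalg.cho_solve((L, True), Ks.T)
    return mean, cov

x = jnp.linspace(0.0, 10.0, 30)
y = jnp.sin(x) + 0.3 * jax.random.normal(jax.random.PRNGKey(1), (30,))
x_test = jnp.linspace(0.0, 10.0, 100)
mean, cov = condition(x, y, x_test, amp=1.0, scale=1.5, noise=0.3)
```

Plotting `mean` with a band from the diagonal of `cov` against the true function gives the comparison described above.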
Inferring hyper-parameters using NUTS in numpyro
We can compute a posterior over the kernel parameters, and hence the posterior predictive over the mean function, using NUTS.
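The notebook does this with numpyro's NUTS sampler. To illustrate the idea without that dependency, here is a simple random-walk Metropolis sampler (explicitly a stand-in for NUTS, not the notebook's method) targeting the posterior over the RBF log length scale under a flat prior, where the unnormalized log posterior is just the GP marginal log likelihood (data and tuning constants are assumptions):

```python
import jax
import jax.numpy as jnp

def log_posterior(log_scale, x, y, amp=1.0, noise=0.3):
    # Unnormalized log posterior over the log length scale (flat prior):
    # the GP marginal log likelihood up to an additive constant.
    scale = jnp.exp(log_scale)
    d2 = (x[:, None] - x[None, :]) ** 2
    K = amp**2 * jnp.exp(-0.5 * d2 / scale**2) + noise**2 * jnp.eye(len(x))
    L = jnp.linalg.cholesky(K)
    alpha = jax.scipy.linalg.cho_solve((L, True), y)
    return -0.5 * y @ alpha - jnp.sum(jnp.log(jnp.diag(L)))

x = jnp.linspace(0.0, 10.0, 25)
y = jnp.sin(x) + 0.3 * jax.random.normal(jax.random.PRNGKey(2), (25,))
logp_fn = jax.jit(log_posterior)

# Random-walk Metropolis: a lightweight substitute for NUTS.
key = jax.random.PRNGKey(0)
state, logp = jnp.array(0.0), logp_fn(jnp.array(0.0), x, y)
samples = []
for _ in range(400):
    key, k1, k2 = jax.random.split(key, 3)
    prop = state + 0.2 * jax.random.normal(k1)
    logp_prop = logp_fn(prop, x, y)
    accept = jnp.log(jax.random.uniform(k2)) < logp_prop - logp
    state = jnp.where(accept, prop, state)
    logp = jnp.where(accept, logp_prop, logp)
    samples.append(state)
samples = jnp.stack(samples)
```

NUTS explores the same target far more efficiently by using gradients of `log_posterior`, which JAX provides for free; that is what makes numpyro (and BlackJAX below) a natural fit here.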
Let's examine the posterior. For that task, let's use ArviZ:
Now, let us sample posterior means for the test data.
And finally, we can plot our posterior inferences of the conditional process, compared to the true model:
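The posterior predictive band comes from computing the conditional mean once per posterior draw of the hyperparameters and summarizing across draws. A compact sketch, using hypothetical length-scale draws as stand-ins for real MCMC output (all values below are assumptions):

```python
import jax
import jax.numpy as jnp

def conditional_mean(x, y, x_test, scale, amp=1.0, noise=0.3):
    # GP posterior mean on x_test for one draw of the length scale.
    k = lambda a, b: amp**2 * jnp.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / scale**2)
    K = k(x, x) + noise**2 * jnp.eye(len(x))
    return k(x_test, x) @ jnp.linalg.solve(K, y)

x = jnp.linspace(0.0, 10.0, 30)
y = jnp.sin(x) + 0.3 * jax.random.normal(jax.random.PRNGKey(3), (30,))
x_test = jnp.linspace(0.0, 10.0, 100)

# Hypothetical posterior draws of the length scale (stand-ins for MCMC samples).
scales = jnp.array([0.8, 1.0, 1.2, 1.5])
means = jax.vmap(lambda s: conditional_mean(x, y, x_test, s))(scales)
lo, mid, hi = jnp.percentile(means, jnp.array([5.0, 50.0, 95.0]), axis=0)
```

Plotting `mid` with the `lo`/`hi` band over `x_test`, alongside the true function, gives the comparison figure described above.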
Inferring hyper-parameters using NUTS in BlackJAX
Inspired by a BlackJAX documentation example.