Posts tagged gaussian processes
Gaussian Processes: HSGP Advanced Usage
- 28 June 2024
The Hilbert Space Gaussian process (HSGP) approximation is a low-rank GP approximation that is particularly well suited to probabilistic programming languages like PyMC. It approximates the GP using a pre-computed, fixed set of basis functions that don't depend on the form of the covariance kernel or its hyperparameters. Because it's a parametric approximation, prediction in PyMC can be done as one would with a linear model, via pm.Data or pm.set_data; you don't need to define the .conditional distribution that non-parametric GPs rely on. This makes it much easier to integrate an HSGP, instead of a GP, into your existing PyMC model. Additionally, unlike many other GP approximations, HSGPs can be used anywhere within a model and with any likelihood function.
Gaussian Processes: HSGP Reference & First Steps
- 10 June 2024
The Hilbert Space Gaussian process (HSGP) approximation is a low-rank GP approximation that is particularly well suited to probabilistic programming languages like PyMC. It approximates the GP using a pre-computed, fixed set of basis functions that don't depend on the form of the covariance kernel or its hyperparameters. Because it's a parametric approximation, prediction in PyMC can be done as one would with a linear model, via pm.Data or pm.set_data; you don't need to define the .conditional distribution that non-parametric GPs rely on. This makes it much easier to integrate an HSGP, instead of a GP, into your existing PyMC model. Additionally, unlike many other GP approximations, HSGPs can be used anywhere within a model and with any likelihood function.
Baby Births Modelling with HSGPs
- 16 January 2024
This notebook provides an example of using the Hilbert Space Gaussian Process (HSGP) technique, introduced in [Solin and Särkkä, 2020], in the context of time series modeling. This technique has proven successful in speeding up models with Gaussian process components.
Gaussian Processes: Latent Variable Implementation
- 06 June 2023
The gp.Latent class is a direct implementation of a Gaussian process without approximation. Given a mean and covariance function, we can place a prior on the function \(f(x)\),

\(f(x) \sim \mathcal{GP}\big(m(x),\, k(x, x')\big)\)
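For a concrete feel, here is a minimal gp.Latent sketch; the data, priors, and the Bernoulli likelihood are assumptions chosen to show that the latent formulation works with non-Gaussian likelihoods:

```python
import numpy as np
import pymc as pm

# Hypothetical binary classification data, for illustration only.
X = np.linspace(0, 10, 50)[:, None]
y = (np.sin(X).flatten() > 0).astype(int)

with pm.Model():
    ell = pm.Gamma("ell", alpha=2, beta=1)
    cov = pm.gp.cov.ExpQuad(input_dim=1, ls=ell)

    # Place the GP prior directly on the latent function f(x).
    gp = pm.gp.Latent(cov_func=cov)
    f = gp.prior("f", X=X)

    # Because f is latent, any likelihood can be attached; here a Bernoulli.
    pm.Bernoulli("y_obs", p=pm.math.sigmoid(f), observed=y)

    idata = pm.sample()
```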
Marginal Likelihood Implementation
- 04 June 2023
The gp.Marginal class implements the more common case of GP regression: the observed data are the sum of a GP and Gaussian noise. gp.Marginal has a marginal_likelihood method, a conditional method, and a predict method. Given a mean and covariance function, the function \(f(x)\) is modeled as,

\(f(x) \sim \mathcal{GP}\big(m(x),\, k(x, x')\big)\)