GPflow Learning Notes

Official documentation: https://gpflow.readthedocs.io/en/master/intro.html

GPflow focuses on variational inference and MCMC;
there is no expectation propagation or Laplace approximation.
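For example (a minimal sketch, assuming the GPflow 2 API; the data and model choices are illustrative), the two inference routes correspond to different model classes:

```python
import numpy as np
import gpflow

# Toy 1-D data (illustrative only).
X = np.random.rand(20, 1)
Y = np.sin(6 * X) + 0.1 * np.random.randn(20, 1)

# Variational route: VGP approximates the posterior over f with a Gaussian q(f).
vgp = gpflow.models.VGP(
    (X, Y),
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
)

# MCMC route: GPMC treats the latent function values as parameters to be
# sampled, e.g. with HMC from TensorFlow Probability.
gpmc = gpflow.models.GPMC(
    (X, Y),
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
)
```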

SVGP paper

Mark van der Wilk, Vincent Dutordoir, S. T. John, Artem Artemev, Vincent Adam, and James Hensman, "A Framework for Interdomain and Multioutput Gaussian Processes", arXiv:2003.01115 [cs, stat], Mar. 2020. https://arxiv.org/abs/2003.01115

5 Software framework

5.3 Implementation of SVGP

ELBO:

$$\mathfrak{L} = \sum_{n=1}^{N} \mathbb{E}_{q(\bm{f}(\mathbf{x}_n))} \left[\log p(\mathbf{y}_n \mid \bm{f}(\mathbf{x}_n))\right] - \mathrm{KL}\left[q(\mathbf{u}) \,\|\, p(\mathbf{u})\right] \tag{56}$$
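A minimal sketch of how this objective is exposed by GPflow's SVGP model (assuming the GPflow 2 API; the data, kernel, and inducing points below are illustrative, not from the paper):

```python
import numpy as np
import gpflow

# Illustrative data and inducing inputs.
N, M = 200, 15
X = np.random.rand(N, 1)
Y = np.sin(10 * X) + 0.1 * np.random.randn(N, 1)
Z = np.linspace(0, 1, M).reshape(-1, 1)  # inducing input locations

model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
    inducing_variable=Z,
    num_data=N,  # rescales a minibatch expectation term to the full sum over N
)

# Eq. (56): expected log-likelihood under q(f(x_n)) minus KL[q(u) || p(u)].
elbo = model.elbo((X, Y))

# Training maximizes the ELBO, i.e. minimizes its negative.
opt = gpflow.optimizers.Scipy()
opt.minimize(model.training_loss_closure((X, Y)), model.trainable_variables)
```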

7 Uncertain inputs

… GP-LVM, where we are interested in learning the joint posterior over two quantities: the distributions of the input locations $\{q_n(\mathbf{x}_n)\}_{n=1}^{N}$ of the GP function, and the GP mapping $f(\cdot)$ itself.
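In GPflow, the Bayesian GP-LVM keeps a variational distribution $q_n(\mathbf{x}_n)$ (a diagonal Gaussian per data point) over each latent input alongside the sparse GP posterior. A minimal sketch, assuming the GPflow 2 BayesianGPLVM interface (data and dimensions are illustrative):

```python
import numpy as np
import gpflow

# Observed high-dimensional outputs; the latent inputs X are unknown (illustrative data).
N, D, Q = 100, 12, 2
Y = np.random.randn(N, D)

# Initial mean and variance of q(x_n) for every data point (PCA initialisation of the means).
X_mean_init = gpflow.utilities.ops.pca_reduce(Y, Q)
X_var_init = np.ones((N, Q))

gplvm = gpflow.models.BayesianGPLVM(
    Y,
    X_data_mean=X_mean_init,
    X_data_var=X_var_init,
    kernel=gpflow.kernels.SquaredExponential(lengthscales=np.ones(Q)),
    num_inducing_variables=20,
)

# Optimising the ELBO jointly updates q(x_n), q(u), and the kernel/likelihood parameters.
gpflow.optimizers.Scipy().minimize(gplvm.training_loss, gplvm.trainable_variables)
```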