Design Principles

GPJax is designed to be a Gaussian process package that provides an accurate representation of the underlying maths. Variable names are chosen to closely match the notation in Rasmussen and Williams (2006)[^1]. Below, we list the notation used in GPJax alongside the mathematical quantity it corresponds to.

Gaussian process notation

| On paper | GPJax code | Description |
| --- | --- | --- |
| $n$ | `n` | Number of train inputs |
| $\boldsymbol{x} = (x_1, \dotsc, x_n)$ | `x` | Train inputs |
| $\boldsymbol{y} = (y_1, \dotsc, y_n)$ | `y` | Train labels |
| $\boldsymbol{t}$ | `t` | Test inputs |
| $f(\cdot)$ | `f` | Latent function modelled as a GP |
| $f(\boldsymbol{x})$ | `fx` | Latent function at inputs $\boldsymbol{x}$ |
| $\boldsymbol{\mu}_{\boldsymbol{x}}$ | `mux` | Prior mean at inputs $\boldsymbol{x}$ |
| $\mathbf{K}_{\boldsymbol{x}\boldsymbol{x}}$ | `Kxx` | Kernel Gram matrix at inputs $\boldsymbol{x}$ |
| $\mathbf{L}_{\boldsymbol{x}}$ | `Lx` | Lower Cholesky factor of $\mathbf{K}_{\boldsymbol{x}\boldsymbol{x}}$ |
| $\mathbf{K}_{\boldsymbol{t}\boldsymbol{x}}$ | `Ktx` | Cross-covariance between inputs $\boldsymbol{t}$ and $\boldsymbol{x}$ |
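
To make the mapping concrete, here is a minimal sketch of the posterior predictive mean computation from Rasmussen and Williams (2006, Algorithm 2.1), written with the variable names above. It uses plain `jax.numpy` rather than GPJax's own API, and the `kernel` helper and noise level are hypothetical stand-ins:

```python
import jax.numpy as jnp
from jax.scipy.linalg import cho_solve, cholesky


# Hypothetical RBF kernel; GPJax provides proper kernel classes.
def kernel(a, b, lengthscale=1.0, variance=1.0):
    return variance * jnp.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)


n = 50                                      # number of train inputs
x = jnp.linspace(-3.0, 3.0, n)              # train inputs
y = jnp.sin(x)                              # train labels
t = jnp.linspace(-3.5, 3.5, 100)            # test inputs
noise = 0.1                                 # observation noise std. dev. (assumed)

mux = jnp.zeros(n)                          # prior mean at inputs x
Kxx = kernel(x, x) + noise**2 * jnp.eye(n)  # Gram matrix at x, plus noise
Lx = cholesky(Kxx, lower=True)              # lower Cholesky factor of Kxx
Ktx = kernel(t, x)                          # cross-covariance between t and x

# Posterior predictive mean at t:
#   mu_t = K_tx (K_xx + noise^2 I)^{-1} (y - mu_x)
mean_t = Ktx @ cho_solve((Lx, True), y - mux)
```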

Sparse Gaussian process notation

| On paper | GPJax code | Description |
| --- | --- | --- |
| $m$ | `m` | Number of inducing inputs |
| $\boldsymbol{z} = (z_1, \dotsc, z_m)$ | `z` | Inducing inputs |
| $\boldsymbol{u} = (u_1, \dotsc, u_m)$ | `u` | Inducing outputs |
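
The inducing quantities extend the same subscript convention, so names such as `Kzz` and `Kxz`, while not listed above, follow naturally. Below is a self-contained sketch in the same hypothetical notation; the projection $\mathbf{K}_{\boldsymbol{x}\boldsymbol{z}} \mathbf{K}_{\boldsymbol{z}\boldsymbol{z}}^{-1} \boldsymbol{u}$ is the standard subset-of-regressors-style approximate mean, not a rendering of GPJax's implementation:

```python
import jax.numpy as jnp
from jax.scipy.linalg import cho_solve, cholesky


# Hypothetical RBF kernel, as in the previous sketch.
def kernel(a, b, lengthscale=1.0, variance=1.0):
    return variance * jnp.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)


x = jnp.linspace(-3.0, 3.0, 50)           # train inputs, as before

m = 10                                    # number of inducing inputs
z = jnp.linspace(-3.0, 3.0, m)            # inducing inputs
u = jnp.sin(z)                            # inducing outputs (illustrative values)

Kzz = kernel(z, z) + 1e-6 * jnp.eye(m)    # Gram matrix at inducing inputs z
Kxz = kernel(x, z)                        # cross-covariance between x and z
Lz = cholesky(Kzz, lower=True)            # lower Cholesky factor of Kzz

# Approximate latent mean at x: f(x) ~ K_xz K_zz^{-1} u
fx_approx = Kxz @ cho_solve((Lz, True), u)
```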

Package style

Prior to building GPJax, its developers benefited greatly from the GPflow and GPyTorch packages, and many of GPJax's design principles are inspired by these excellent precursors. The documentation's design has been greatly inspired by the exceptional Flax docs.


[^1]: Rasmussen, C. E. and Williams, C. K. I. (2006). *Gaussian Processes for Machine Learning*. MIT Press, Cambridge, MA.