Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge-Equivariant Projected Kernels

Abstract

Gaussian processes are commonly used machine learning models, capable of learning unknown functions in a way that represents uncertainty, thereby facilitating the construction of optimal decision-making systems. Motivated by a desire to deploy Gaussian processes in novel areas of science, a rapidly growing line of work has focused on constructively extending these models to handle non-Euclidean domains, including data defined on Riemannian manifolds such as spheres and tori. In this work, we propose techniques that extend this class to model vector fields, which are important in a number of application areas in the physical sciences, in addition to scalar-valued functions. To do so, we formulate Gaussian processes within the differential-geometric language of random sections and propose a general recipe for constructing locally matrix-valued kernels, which induce Gaussian vector fields, i.e. vector-valued Gaussian processes, from scalar-valued Riemannian kernels. We extend standard Gaussian process training methods, such as variational inference, to this setting. This enables vector-valued Gaussian processes on Riemannian manifolds to be trained using standard methods, making them accessible to machine learning practitioners.
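To make the recipe concrete, below is a minimal sketch of a projected kernel on the sphere S^2 embedded in R^3. It assumes a placeholder ambient squared-exponential kernel in place of the intrinsic Riemannian kernels the paper works with, and all function names are illustrative rather than taken from the paper's codebase. A scalar kernel k is promoted to a matrix-valued kernel K(x, y) = P(x) k(x, y) P(y)^T, where P(x) = I - x x^T projects ambient vectors onto the tangent space at x, so that samples of the induced Gaussian process are tangent vector fields.

```python
import numpy as np

def tangent_projection(x):
    # Orthogonal projection P(x) = I - x x^T onto the tangent space of the
    # unit sphere at x (x is assumed to be a unit vector in R^3).
    return np.eye(3) - np.outer(x, x)

def scalar_kernel(x, y, lengthscale=0.5):
    # Placeholder scalar kernel: squared-exponential in the chordal distance.
    # The paper uses intrinsic Riemannian kernels (e.g. Riemannian Matern);
    # this stand-in merely keeps the sketch self-contained.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * lengthscale**2))

def projected_kernel(x, y):
    # Matrix-valued kernel K(x, y) = P(x) [k(x, y) I] P(y)^T, whose samples
    # are vector fields tangent to the sphere.
    return scalar_kernel(x, y) * tangent_projection(x) @ tangent_projection(y)

# Sample a tangent vector field at a handful of points on the sphere.
rng = np.random.default_rng(0)
points = rng.normal(size=(5, 3))
points /= np.linalg.norm(points, axis=1, keepdims=True)

n = len(points)
cov = np.block([[projected_kernel(points[i], points[j]) for j in range(n)]
                for i in range(n)])

# The covariance is singular (each 3x3 block has rank 2), so sample via SVD.
sample = rng.multivariate_normal(np.zeros(3 * n), cov, method="svd").reshape(n, 3)

# Sanity check: each sampled vector is tangent to the sphere, <x, v(x)> ~ 0.
print(np.abs(np.einsum("ij,ij->i", points, sample)).max())
```

Because this construction works with the embedding directly, the resulting kernel does not depend on a choice of local frame on the tangent spaces; training with variational inference then proceeds much as in the scalar-valued case, with the matrix-valued kernel supplying the block covariances.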

Publication
Advances in Neural Information Processing Systems (NeurIPS)