Professor Marc Deisenroth is the Google DeepMind Chair of Machine Learning and Artificial Intelligence at University College London and part of the UNESCO Chair on Artificial Intelligence at UCL. He also holds a visiting faculty position at the University of Johannesburg. Marc co-leads the Sustainability and Machine Learning Group at UCL. His research interests center on data-efficient machine learning, probabilistic modeling, and autonomous decision making, with applications in climate/weather science, nuclear fusion, and robotics.
Marc was Program Chair of EWRL 2012, Workshops Chair of RSS 2013, EXPO Chair at ICML 2020, Tutorials Chair at NeurIPS 2021, and Program Chair at ICLR 2022. He is an elected member of the ICML Board and serves on the Scientific Advisory Boards of the National Oceanography Centre as well as the United Nations University Global AI Network. He received Paper Awards at ICRA 2014, ICCAS 2016, ICML 2020, AISTATS 2021, and FAccT 2023. In 2019, Marc co-organized the Machine Learning Summer School in London.
In 2018, Marc received The President’s Award for Outstanding Early Career Researcher at Imperial College. He is a recipient of a Google Faculty Research Award and a Microsoft PhD Grant.
In 2018, Marc spent four months at the African Institute for Mathematical Sciences (Rwanda), where he taught a course on Foundations of Machine Learning as part of the African Masters in Machine Intelligence. He is co-author of the book Mathematics for Machine Learning, published by Cambridge University Press.
Machine Learning: Data-efficient machine learning, Gaussian processes, reinforcement learning, Bayesian optimization, approximate inference, deep probabilistic models, geo-spatial models
Environment and Sustainability: Data assimilation, data-driven forecasting models, renewables
Robotics and Control: Robot learning, legged locomotion, planning under uncertainty, imitation learning, adaptive control, robust control, learning control, optimal control
Signal Processing: Nonlinear state estimation, Kalman filtering, time-series modeling, dynamical systems, system identification, stochastic information processing
In recent years, machine learning has established itself as a powerful tool for high-resolution weather forecasting. While most current machine learning models focus on deterministic forecasts, accurately capturing the uncertainty in the chaotic weather system calls for probabilistic modeling. We propose a probabilistic weather forecasting model called Graph-EFM, combining a flexible latent-variable formulation with the successful graph-based forecasting framework. The use of a hierarchical graph construction allows for efficient sampling of spatially coherent forecasts. Requiring only a single forward pass per time step, Graph-EFM allows for fast generation of arbitrarily large ensembles. We experiment with the model on both global and limited-area forecasting. Ensemble forecasts from Graph-EFM achieve equivalent or lower errors than comparable deterministic models, with the added benefit of accurately capturing forecast uncertainty.
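The core idea of the latent-variable formulation can be illustrated with a toy sketch: each ensemble member is generated by drawing a fresh latent sample and running one forward pass of a decoder per time step, so ensemble size is limited only by how many latent samples you draw. The decoder below is a stand-in (a random two-weight map), not the graph-based architecture from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def forecast_step(state, z, W_s, W_z):
    # Toy "decoder": the next state depends on the current state and a
    # latent sample z that injects ensemble spread. This is a stand-in
    # for the hierarchical graph-based decoder described in the abstract.
    return np.tanh(state @ W_s + z @ W_z)

d_state, d_latent, n_members, n_steps = 8, 4, 32, 10
W_s = rng.normal(scale=0.5, size=(d_state, d_state))
W_z = rng.normal(scale=0.5, size=(d_latent, d_state))

# All members start from the same analysis (initial condition).
state = np.tile(rng.normal(size=d_state), (n_members, 1))
for _ in range(n_steps):
    z = rng.normal(size=(n_members, d_latent))  # one latent draw per member
    state = forecast_step(state, z, W_s, W_z)   # single forward pass per step

ens_mean = state.mean(axis=0)
ens_spread = state.std(axis=0)   # per-variable forecast uncertainty
print(ens_mean.shape, ens_spread.shape)  # (8,) (8,)
```

Because the latent draws are the only source of randomness, generating a larger ensemble just means batching more members through the same forward pass.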
Global convolutions have shown increasing promise as powerful general-purpose sequence models. However, training long convolutions is challenging, and kernel parameterizations must be able to learn long-range dependencies without overfitting. This work introduces reparameterized multi-resolution convolutions (MRConv), a novel approach to parameterizing global convolutional kernels for long-sequence modeling. By leveraging multi-resolution convolutions, incorporating structural reparameterization and introducing learnable kernel decay, MRConv learns expressive long-range kernels that perform well across various data modalities. Our experiments demonstrate state-of-the-art performance on the Long Range Arena, Sequential CIFAR, and Speech Commands tasks among convolution models and linear-time transformers. Moreover, we report improved performance on ImageNet classification by replacing 2D convolutions with 1D MRConv layers.
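A minimal sketch of the multi-resolution idea: a long global kernel is assembled from several short kernels, each upsampled to cover a longer span and damped by a decay term so distant history is down-weighted, and the convolution itself is applied in the frequency domain. The upsampling scheme, decay form, and sizes here are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def multires_kernel(base_kernels, decays, L):
    # Sum short kernels at increasing resolutions into one long kernel.
    # Level k is nearest-neighbour upsampled by 2**k so coarser levels
    # span longer ranges; an exponential decay (stand-in for the
    # learnable decay in the abstract) suppresses far-past weights.
    kernel = np.zeros(L)
    t = np.arange(L)
    for k, (w, lam) in enumerate(zip(base_kernels, decays)):
        up = np.repeat(w, 2 ** k)[:L]
        contrib = np.zeros(L)
        contrib[: len(up)] = up
        kernel += contrib * np.exp(-lam * t)
    return kernel

def fft_conv(x, kernel):
    # Causal global convolution via FFT in O(L log L); zero-padding to
    # 2L turns the circular convolution into a linear one.
    L = len(x)
    n = 2 * L
    y = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(kernel, n), n)
    return y[:L]

rng = np.random.default_rng(0)
L = 64
bases = [rng.normal(size=8) for _ in range(4)]  # 4 resolution levels
decays = [0.5, 0.1, 0.02, 0.005]                # slower decay at coarser levels
k = multires_kernel(bases, decays, L)
x = rng.normal(size=L)
y = fft_conv(x, k)
print(y.shape)  # (64,)
```

In training, the base kernels and decay rates would be learned parameters; "structural reparameterization" would additionally allow the per-resolution branches to be merged into a single kernel at inference, as sketched by the summation above.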
Unlike traditional rigid robots, soft robots offer more flexibility, compliance, and adaptability. They are also typically cheaper to manufacture and are lighter than their rigid counterparts. However, due to modeling difficulties, real-world applications for soft robots are still limited. This is especially true for applications that would require dynamic or fast motion. In addition, their operating principles and compliance make integrating effective proprioceptive sensors difficult. As such, state estimation and predictions of how the state evolves in time are challenging modeling tasks. Large-scale soft robots (approximately two meters in length), particularly fluid-driven ones, have greater modeling complexity due to increased inertia and the related effects of gravity. Few approaches to soft robot control (learned or model-based) have enabled dynamic motion such as throwing or hammering, since most methods require limiting assumptions about the kinematics, dynamics, or actuation models to make the control problem tractable or performant. To address this issue, we propose using Bayesian optimization to learn policies for dynamic tasks on a large-scale soft robot. This approach optimizes the task objective function directly from commanded pressures, without requiring approximate kinematics or dynamics as an intermediate step. We also present simulated and real-world experiments to illustrate the efficacy of the proposed approach.
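The "optimize the task objective directly from commanded pressures" idea can be sketched as a standard Bayesian optimization loop: treat the robot trial as a black-box function of the pressure command, fit a Gaussian process surrogate to the observed rewards, and pick the next pressure by maximizing expected improvement. The objective below (`task_reward`) is a hypothetical stand-in for a real robot evaluation, and the 1-D pressure parameterization is a simplification.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def task_reward(pressure):
    # Hypothetical stand-in for the real black-box objective
    # (e.g. throw distance as a function of commanded pressure).
    return float(np.exp(-((pressure - 0.65) ** 2) / 0.02))

def expected_improvement(mu, sigma, best):
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

# Candidate commanded pressures (normalized to [0, 1]).
candidates = np.linspace(0, 1, 201).reshape(-1, 1)
X = list(rng.uniform(0, 1, size=3).reshape(-1, 1))  # initial random trials
y = [task_reward(x[0]) for x in X]

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(12):
    # Fit surrogate to trials so far, then pick the next pressure to try.
    gp.fit(np.vstack(X), np.array(y))
    mu, sigma = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, max(y)))]
    X.append(x_next)
    y.append(task_reward(x_next[0]))

best_pressure = float(X[int(np.argmax(y))][0])
print(round(best_pressure, 2))  # best pressure found, near 0.65
```

Because each "evaluation" is a full robot trial, the sample efficiency of the GP surrogate is what makes this practical on hardware; no kinematic or dynamic model of the robot appears anywhere in the loop.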
In this work, we present a new open-source Python programming library for performing efficient interpolation of non-stationary satellite altimetry data, using scalable Gaussian Process (GP) techniques. We showcase the library, GPSat, by using data from the CryoSat-2, Sentinel-3A, and Sentinel-3B radar altimeters to generate complete maps of daily 50 km$^2$-gridded Arctic sea ice radar freeboard. Relative to a previous GP interpolation scheme, we find that GPSat offers a 504$\times$ computational speedup, with less than 4 mm difference on the derived freeboards, on average. We then demonstrate the scalability of GPSat through freeboard interpolation at 5 km$^2$ grid resolution, and Sea-Level Anomalies (SLA) at the resolution of the altimeter footprint. Validation of this novel high-resolution radar freeboard product shows strong agreement with airborne data, with a linear correlation of 0.66. Footprint-level SLA interpolation also shows improvements in predictive skill over linear regression, which is a standard approach used in sea ice altimetry data processing. We suggest that GPSat could overcome the computational bottlenecks faced in many altimetry-based interpolation routines. This could in turn lead to improved observational estimates of ocean topography and sea ice thickness, and further critical understanding of ocean and sea ice variability over short spatio-temporal scales.
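The underlying interpolation task can be sketched in a few lines: fit a GP to scattered, noisy along-track observations and predict a complete gridded field with per-cell uncertainty. This is a generic scikit-learn sketch on synthetic data; GPSat's actual API, local-expert decomposition, and data handling are not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic stand-in for along-track altimetry samples: scattered
# (x, y) locations with a smooth underlying field plus measurement noise.
n_obs = 200
xy = rng.uniform(0, 10, size=(n_obs, 2))
truth = lambda p: np.sin(p[:, 0]) * np.cos(0.5 * p[:, 1])
z = truth(xy) + rng.normal(scale=0.05, size=n_obs)

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=2.0) + WhiteKernel(noise_level=0.05 ** 2),
    normalize_y=True,
)
gp.fit(xy, z)

# Predict a complete gridded field; the GP also returns a per-cell
# standard deviation, i.e. a map of interpolation uncertainty.
gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
mean, std = gp.predict(grid, return_std=True)
print(mean.shape, std.shape)  # (625,) (625,)
```

The computational bottleneck mentioned in the abstract comes from the cubic cost of exact GP inference in the number of observations, which is what the scalable techniques in GPSat are designed to sidestep.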