Research Areas

At the crossroads of optimization, geometry, and statistics.

I exploit mathematical structures to design and analyze optimization algorithms, and construct statistical estimators from high-dimensional data.

There is a variety of problems at the foundation of Data Science whose best solutions require ideas and techniques from multiple mathematical disciplines. Currently, I am particularly interested in combining ideas from mathematical optimization, geometry, and statistics. Below are some research areas I am actively exploring.


Riemannian Optimization

The central problem of interest in Riemannian optimization is the following task:

\[ \min_{x \in \mathcal{M}} f(x) \]

where \(\mathcal{M}\) is a Riemannian manifold and \(f: \mathcal{M} \to \mathbb{R}\) is some cost function. The goal of the optimizer is to provide black-box iterative algorithms that solve this problem with provable guarantees.

The central difficulty of Riemannian optimization is that the manifold \(\mathcal{M}\) is generally a nonlinear space. Even given first-order information of \(f\), i.e. an oracle \(x \mapsto (f(x), \nabla_{\mathcal{M}}f(x))\), one does not have access to exact exponential maps, the operation which turns tangent directions into new iterates. Additionally, some schemes use gradients from nearby points to construct more accurate surrogates of \(f\). In the Riemannian setting, we do not even have access to exact parallel transports, the operation which moves first-order information to a shared space! Even worse, due to the intrinsic curvature of \(\mathcal{M}\), even exact parallel transports incur errors!
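To make these objects concrete, here is a minimal sketch of Riemannian gradient descent on the unit sphere \(S^{n-1}\), written in plain NumPy: the Euclidean gradient is projected onto the tangent space, a step is taken in that direction, and the iterate is pulled back onto the manifold by renormalization, a retraction standing in for the exact exponential map. The cost function (a quadratic form), step size, and function names below are illustrative assumptions, not a specific algorithm from the discussion above.

```python
import numpy as np

# Illustrative sketch: minimize f(x) = x^T A x over the unit sphere S^{n-1}.
# Minimizers are eigenvectors associated with the smallest eigenvalue of A.
def riemannian_gradient_descent(A, x0, step=0.05, iters=500):
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2 * A @ x                  # Euclidean gradient of f at x
        rgrad = egrad - (x @ egrad) * x    # project onto the tangent space T_x S^{n-1}
        x = x - step * rgrad               # move along the tangent direction
        x = x / np.linalg.norm(x)          # retraction: renormalize back onto the sphere
    return x

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B + B.T                                # symmetric test matrix (assumed example data)
x = riemannian_gradient_descent(A, rng.standard_normal(5))
print(x @ A @ x, np.linalg.eigvalsh(A)[0]) # objective value vs. smallest eigenvalue
```

In this sketch the retraction (renormalization) replaces the exponential map, which is exactly the kind of inexactness the analysis of Riemannian algorithms has to account for.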

My Belief: Mathematical optimization uses local surrogates of \(f\) to search; Riemannian geometry patches together local maps of \(\mathcal{M}\) to enable analysis on \(\mathcal{M}\); thus, classical optimization theory should generalize.

Submanifolds of Wasserstein Space

Under construction

Optimization in Wasserstein Space

Under construction