Topics for projects - Ronny Bergmann

My research focuses on optimization on Riemannian manifolds, the corresponding algorithms, and their application to problems arising in data processing. The data under consideration is usually nonlinear or subject to certain constraints; for example, a signal of angles yields nonlinear data, and a GPS signal of a flying airplane yields a signal on the sphere. In imaging, such data appears for example in diffusion tensor MRI (DT MRI), which can be represented as images or volumetric data where at each measurement point we obtain a symmetric positive definite matrix. These matrices also form such a manifold.

This manifold-valued data still suffers from the same problems as classical imaging data, i.e., we might have to fill in missing data (so-called inpainting) or reduce noise in the data (denoising). With a variational model, we obtain a (high-dimensional) optimization problem on the power manifold \(\mathcal N = \mathcal M^{n,m}\), where \(n\) and \(m\) are the width and height of the image and each pixel is a value on the Riemannian manifold \(\mathcal M\). The power manifold itself is again a manifold.

Examples of manifolds are the sphere \(\mathbb S^2\) in \(\mathbb R^3\), the set of rotation matrices (with applications in mechanics), positions and orientations in space (the special Euclidean group), the Stiefel manifold, and many more.

In the following I would like to present a few possible areas and topics. Most of these topics can include programming; my personal favorite language is Julia, using Manopt.jl and Manifolds.jl, but Python (using pymanopt) or Matlab (using manopt) can be used as well.

Optimization algorithms for nonsmooth optimization

Classical variational models yield a function \(f\colon \mathcal M \to \mathbb R\) which is usually nonsmooth but, in the case of image processing, highly structured. For such a function we aim to solve \[ \operatorname*{argmin}_{x\in\mathcal M} f(x). \] Recently, the Chambolle–Pock algorithm, one of the most widely used algorithms in imaging, was generalized to manifolds.
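To give a first feeling for what optimization on a manifold looks like in practice, the following sketch computes the Riemannian center of mass of a few points on the sphere \(\mathbb S^2\) by Riemannian gradient descent. This is a smooth toy cost, not the nonsmooth setting or the algorithm discussed above; the function names are my own choice, and only plain NumPy is assumed.

```python
import numpy as np

def exp_map(p, v):
    """Exponential map on the sphere: follow the geodesic from p in direction v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * v / nv

def log_map(p, q):
    """Logarithmic map on the sphere: tangent vector at p pointing towards q."""
    cos_t = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-12:
        return np.zeros_like(p)
    w = q - cos_t * p
    return theta * w / np.linalg.norm(w)

def riemannian_mean(points, steps=100, tau=1.0):
    """Gradient descent for f(x) = 1/(2n) sum_i d(x, p_i)^2 on the sphere."""
    x = points[0] / np.linalg.norm(points[0])
    for _ in range(steps):
        # mean of the logs is the negative Riemannian gradient of f at x
        grad = np.mean([log_map(x, p) for p in points], axis=0)
        x = exp_map(x, tau * grad)
    return x
```

The iterate stays on the sphere by construction, since each step follows a geodesic; this pattern of "gradient in the tangent space, then exponential map (or a retraction)" is the common template behind the solvers in Manopt.jl, pymanopt, and manopt.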

A possible topic would be to investigate and test this algorithm and apply it, using the ROF model, to denoising problems for manifold-valued data, for example phase-valued data or images from DT MRI.
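As a sketch of what such a model looks like (exact formulations vary in the literature), a common manifold-valued analogue of the ROF model with anisotropic total variation regularization reads \[ \operatorname*{argmin}_{x\in\mathcal N}\; \frac{1}{2}\sum_{i,j} d_{\mathcal M}^2\bigl(x_{i,j}, y_{i,j}\bigr) + \alpha \sum_{i,j} \Bigl( d_{\mathcal M}\bigl(x_{i+1,j}, x_{i,j}\bigr) + d_{\mathcal M}\bigl(x_{i,j+1}, x_{i,j}\bigr) \Bigr), \] where \(y\in\mathcal N\) is the given noisy image, \(d_{\mathcal M}\) denotes the geodesic distance on \(\mathcal M\), and \(\alpha>0\) is a regularization parameter. The distance terms in the regularizer are nonsmooth, which is why algorithms for nonsmooth optimization are needed here.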

A second project in this area is to investigate half-quadratic minimization as an approach to solve these problems. Here the focus would be on an efficient implementation and its benchmarking.

We could also consider specific applications from other areas and look for a suitable algorithm.

Manifold-Valued Image Processing

Within manifold-valued imaging, we are given a possibly noisy or lossy manifold-valued image. Let \(\mathcal G = \{1,\ldots,n \} \times \{1,\ldots,m\}\) denote the pixel grid of an image of size \(n\) by \(m\) pixels. For a manifold-valued image, each pixel \(p_{i,j}\in\mathcal M\) is a value on some manifold. Note that this includes classical images as the special cases \(\mathcal M = \mathbb R\) (gray-value images) and \(\mathcal M = \mathbb R^3\) (color images), but also other color spaces like HSV, where the hue channel is a value on the circle. The classical tasks from image processing also apply here, e.g., inpainting of missing data or denoising.
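A minimal sketch of the hue example: a phase-valued image is just an array of angles, and its (anisotropic) total variation uses the geodesic distance on the circle, i.e., the angular difference wrapped to \([-\pi,\pi]\). The function names are illustrative, not from any of the libraries mentioned above.

```python
import numpy as np

def circle_dist(a, b):
    """Geodesic distance on the circle: angular difference wrapped to [-pi, pi]."""
    return np.abs((a - b + np.pi) % (2 * np.pi) - np.pi)

def tv_energy(img):
    """Anisotropic total variation of a phase-valued image (angles in radians)."""
    horiz = circle_dist(img[:, 1:], img[:, :-1]).sum()
    vert = circle_dist(img[1:, :], img[:-1, :]).sum()
    return horiz + vert
```

Note that `circle_dist` correctly treats angles such as \(3.1\) and \(-3.1\) as close neighbors, which the Euclidean difference would not; this wrap-around is exactly what makes phase-valued data nonlinear.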

Potential topics include deblurring (where the first question is what blur actually means for manifold-valued data), the investigation of algorithms, and their application to real-life data, for example from DT MRI or EBSD.

Nonlinear Data in Machine Learning

The images from the last section can also be used to train certain machine learning models, since the training procedure is closely related to optimization. The most important concepts to discuss are then equivariance and invariance of network layers, and we could consider projects or theses in this area as well.

Constrained Optimization on Manifolds

In addition to an optimization problem being posed on a Riemannian manifold, we might have constraints, i.e., we have three functions on the manifold \(f\colon\mathcal M \to \mathbb R\), \(g\colon \mathcal M \to \mathbb R^m\), and \(h\colon \mathcal M \to \mathbb R^n\), and the optimization problem reads \[ \begin{split} \text{Minimize } \quad &f(x) \text{ over } x \in\mathcal M\\ \text{ such that } \quad &g(x) \leq 0,\\ \text{ and } \quad &h(x) = 0, \end{split} \] where both the inequality and the equality are meant component-wise.
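As a naive illustration (a simple quadratic penalty, not one of the specialized Riemannian constrained solvers from the literature), the following sketch minimizes a linear cost \(\langle a, x\rangle\) on \(\mathbb S^2\) subject to one inequality constraint \(x_3 \leq c\). It uses projected Riemannian gradient descent with normalization back to the sphere as a retraction; all names and parameter values are illustrative.

```python
import numpy as np

def proj_tangent(x, v):
    """Project v onto the tangent space of the sphere at x."""
    return v - np.dot(x, v) * x

def penalized_descent(a, c, rho=100.0, tau=0.01, steps=2000):
    """Minimize <a, x> on the sphere subject to x[2] <= c via a quadratic penalty.

    Gradient descent on F(x) = <a, x> + rho * max(0, x[2] - c)^2,
    with normalization as a retraction back onto the sphere.
    """
    x = np.array([1.0, 0.0, 0.0])  # feasible starting point
    for _ in range(steps):
        viol = max(0.0, x[2] - c)
        egrad = a + np.array([0.0, 0.0, 2.0 * rho * viol])  # Euclidean gradient of F
        rgrad = proj_tangent(x, egrad)                      # Riemannian gradient
        x = x - tau * rgrad
        x = x / np.linalg.norm(x)                           # retract to the sphere
    return x
```

For \(a = (0,0,-1)\) and \(c = 0.5\), the unconstrained minimizer would be the north pole, while the penalized iterates settle near the boundary \(x_3 \approx c\) (slightly above it, since the quadratic penalty only enforces the constraint approximately; driving \(\rho\to\infty\) tightens this).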


Possible topics range from investigating the theory of such problems to developing algorithms, analyzing their convergence, and implementing them.

Prerequisites: TMA4180 Optimisation I

Some knowledge of differential geometry and Riemannian manifolds would be good but is not necessary. We could, for example, do the Specialisation Course as a reading course towards Differential Geometry and Optimisation.

If you are interested in any of the topics or would like to discuss more precise ideas, please send me an email. We can also discuss ideas that you have, or preferences towards implementation or theory, and tailor a topic towards that.

2023-12-13, Ronny Bergmann