Topics for projects - Markus Grasmair
I work mainly on the theory of inverse problems, with a particular view towards applications in mathematical image processing. Below you can find some concrete topics for master projects. Feel free to come with your own proposals, though. In any case, if you are potentially interested in a thesis on inverse problems, please contact me so that we can arrange a meeting.
If you are interested in a bachelor's thesis (bacheloroppgave), send me an e-mail so that we can arrange a meeting and discuss possible topics.
Inverse Problems
Inverse problems are typically concerned with the solution of operator equations (usually involving integral or differential operators) whose solutions are extremely sensitive to noise, be it measurement noise in the data or modelling errors. A classical example is the inversion of the Radon transform, which is the basis of computerised tomography (CT). Another example is deblurring, which is required for obtaining sharp images both at the largest scales in astronomy and at the smallest scales in microscopy. A further class of examples consists of parameter identification problems for PDEs, where one wants to reconstruct some parameter (e.g. the heat source or a spatially varying conductivity) from the solution of the PDE.
Abstractly, an inverse problem can be formulated as the problem of solving an equation \(F(u) = v^\delta\) for \(u\), given some noisy measurement data \(v^\delta\) with noise level \(\delta\). Here \(F \colon U \to V\) is a possibly non-linear mapping between the Hilbert or Banach spaces \(U\) and \(V\). Because of the ill-posedness of the problem (that is, discontinuous dependence of the solution on the data \(v^\delta\)), a direct solution does not make sense. Instead, it is necessary to introduce some type of regularisation in the solution process that is based on prior knowledge of qualitative properties of the true solution \(u^\dagger\) of the problem.
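As a small illustration of this ill-posedness (a hypothetical toy example, not part of the project description), one can discretise the integration operator \((Fu)(t) = \int_0^t u(s)\,\mathrm{d}s\) on \([0,1]\). Inverting it amounts to numerical differentiation, a classic ill-posed problem, and a direct solve amplifies the data noise far beyond the noise level \(\delta\):

```python
import numpy as np

# Hypothetical illustration: discretise the integration operator
# (Fu)(t) = integral of u from 0 to t on [0, 1] by a lower-triangular
# quadrature matrix. Inverting F amounts to numerical differentiation.
n = 100
h = 1.0 / n
F = h * np.tril(np.ones((n, n)))

t = np.linspace(h, 1.0, n)
u_true = np.sin(2 * np.pi * t)                # "true solution" u†
v = F @ u_true                                # exact data

rng = np.random.default_rng(0)
delta = 1e-2
v_noisy = v + delta * rng.standard_normal(n)  # noisy data v^δ

# Direct inversion without regularisation: the inverse of F is (discrete)
# differentiation, so the noise is amplified by roughly 1/h = n.
u_naive = np.linalg.solve(F, v_noisy)
rel_err = np.linalg.norm(u_naive - u_true) / np.linalg.norm(u_true)
print(f"relative error of the naive reconstruction: {rel_err:.2f}")
```

Even though the relative noise in the data is small, the relative error of the naive reconstruction here is of order one, which is exactly the discontinuous data dependence described above.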
Source Conditions for Tikhonov Regularisation
A possible approach for the stable solution of inverse problems is (generalised) Tikhonov regularisation, which consists in the minimisation of a functional of the form \[\mathcal{T}(u) = \frac{1}{2} \lVert Fu - v^\delta \rVert^2 + \alpha\mathcal{R}(u).\] Here, the convex regularisation term \(\mathcal{R}\) encodes a-priori knowledge about the true solution in the sense that \(\mathcal{R}(u)\) is small for likely solutions of the problem, but large or even infinite for unlikely ones. Moreover, the regularisation parameter \(\alpha > 0\) balances the size of the residual with the regularisation term. It is then possible to show that, under quite general conditions, the solution \(u_\alpha^\delta\) of this optimisation problem is an approximation of the true solution \(u^\dagger\) of the inverse problem.
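In the linear case with the classical choice \(\mathcal{R}(u) = \tfrac12 \lVert u \rVert^2\), the minimiser of \(\mathcal{T}\) has the closed form \(u_\alpha^\delta = (F^*F + \alpha I)^{-1} F^* v^\delta\). The following sketch (using the same hypothetical integration-operator toy example as above, which is an assumption for illustration only) shows how this tames the noise amplification:

```python
import numpy as np

# Toy example (an assumption, not from the text): F discretises integration
# on [0, 1]. We compare the naive inverse with classical Tikhonov
# regularisation, R(u) = ½‖u‖², whose minimiser is
# u_α^δ = (FᵀF + αI)⁻¹ Fᵀ v^δ.
n = 100
h = 1.0 / n
F = h * np.tril(np.ones((n, n)))
t = np.linspace(h, 1.0, n)
u_true = np.sin(2 * np.pi * t)

rng = np.random.default_rng(0)
delta = 1e-2
v_noisy = F @ u_true + delta * rng.standard_normal(n)

alpha = 1e-3                                  # regularisation parameter
u_tik = np.linalg.solve(F.T @ F + alpha * np.eye(n), F.T @ v_noisy)
u_naive = np.linalg.solve(F, v_noisy)

def rel_err(u):
    return np.linalg.norm(u - u_true) / np.linalg.norm(u_true)

print(f"naive: {rel_err(u_naive):.2f}, "
      f"Tikhonov (alpha={alpha}): {rel_err(u_tik):.2f}")
```

The choice of \(\alpha\) is itself delicate: too small and the noise dominates, too large and the reconstruction is over-smoothed. Parameter choice rules such as the discrepancy principle address this trade-off.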
The question now arises whether it is possible to obtain estimates of the size of the approximation error. Except in trivial situations, this is only possible if the true solution \(u^\dagger\) satisfies additional conditions. Typically, these are formulated as "source conditions", related to the KKT conditions for the constrained problem \[\min_{u \in U} \mathcal{R}(u) \qquad\text{ s.t. } Fu = Fu^\dagger\] in some interpolation space between \(U\) and \(V\). However, recent results show that some (weak) error estimates are also possible in the "oversmoothing case", where the prior assumption that \(\mathcal{R}(u^\dagger)\) is finite fails to hold. The goal of this thesis is to look at this situation more closely and to investigate to what extent error estimates can still be derived via an analysis of the (failure of the) KKT conditions.
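For orientation, in the linear, convex setting the classical source condition and the corresponding convergence rate (a standard result from the literature, stated here only as background) read:

```latex
% Source condition: the subdifferential of R at the true solution
% contains an element in the range of the adjoint F^*.
\exists\, w \in V : \quad \xi = F^* w \in \partial\mathcal{R}(u^\dagger).

% Under this condition and the a-priori parameter choice
% \alpha \sim \delta, the Bregman distance of the Tikhonov minimiser
% u_\alpha^\delta to u^\dagger satisfies
D_\xi^{\mathcal{R}}\bigl(u_\alpha^\delta, u^\dagger\bigr)
  = \mathcal{R}(u_\alpha^\delta) - \mathcal{R}(u^\dagger)
    - \langle \xi,\, u_\alpha^\delta - u^\dagger \rangle
  = O(\delta).
```

In the oversmoothing case mentioned above, \(\mathcal{R}(u^\dagger) = \infty\), so this subdifferential is empty and the condition necessarily fails; this is precisely the situation the thesis would investigate.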
Prerequisites: Good knowledge of functional analysis (at least the course TMA4230 - Functional Analysis or similar) and the course TMA4180 - Optimisation 1. Knowledge of convex analysis and/or inverse problems is a distinct advantage, but might, if necessary, also be acquired in a specialisation course (fordypningsemne). Knowledge of measure and integration theory (e.g. the course TMA4225 - Foundations of Analysis) is an advantage.