Foundations of Machine Learning

Neural networks for graph structures

In many application areas of ML, data can be represented in the form of graphs. Examples include social networks, shape analysis, and protein or other biomedical data. It is therefore of utmost importance for these application areas to transfer the overwhelming success of deep networks on image data to graph representations.
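
To make the transfer concrete: graph neural networks replace the fixed pixel grid of an image convolution with message passing over each node's neighbors. Below is a minimal NumPy sketch of one graph convolution layer in the style of a GCN, with a toy graph and random features standing in for real data; it is illustrative only, not a description of our methods.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # node degrees (>= 1 due to self-loops)
    D_inv_sqrt = np.diag(deg ** -0.5)         # symmetric normalization
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)    # aggregate neighbors, then ReLU

# toy graph: 4 nodes, 3 input features, 2 output features
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
H = rng.standard_normal((4, 3))               # placeholder node features
W = rng.standard_normal((3, 2))               # placeholder layer weights
print(gcn_layer(A, H, W))
```

Stacking such layers lets information propagate over multi-hop neighborhoods, the graph analogue of a growing receptive field in image CNNs.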

Deep learning for time series analysis

Data often arises in the form of time series, in challenges such as video analysis, weather forecasting, and portfolio optimization. To date, the number of effective network approaches for time series analysis is rather limited. We therefore believe that it is of enormous practical value to develop novel DL approaches for time series analysis.
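
As one illustration of what such approaches can look like, temporal convolutional networks build deep models from dilated causal convolutions, in which each output depends only on current and past inputs. The sketch below implements a single such convolution on a toy series; the data and filter weights are placeholders.

```python
import numpy as np

def causal_conv1d(x, w, dilation=1):
    """Dilated causal convolution: y[t] depends only on x[t], x[t-d], x[t-2d], ...

    Left-pads the series so the output has the same length as the input
    and never looks into the future.
    """
    k = len(w)
    pad = (k - 1) * dilation
    x_pad = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(w[j] * x_pad[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

x = np.sin(np.linspace(0, 6, 50))                        # toy series
y = causal_conv1d(x, w=np.array([0.5, 0.3, 0.2]), dilation=2)
print(y[:5])
```

Increasing the dilation across layers gives an exponentially growing temporal receptive field while keeping the model strictly causal.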

Deep networks, optimal control & global optimization

Optimal control is an important challenge in numerous application areas; one example is the modeling of multi-agent interactions. To date, the relationship between deep networks and optimal control has not been extensively explored, and we expect this topic to have important consequences both for optimal control and for DL.
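
One concrete bridge between the two fields: a residual network x_{k+1} = x_k + h * f(x_k, theta_k) can be read as a forward-Euler discretization of a controlled ODE, so training the weights theta_k amounts to a discrete-time optimal control problem. The sketch below makes this reading explicit with toy tanh dynamics and random parameters; it is illustrative, not a proposed method.

```python
import numpy as np

def f(x, theta):
    """Layer dynamics: a simple tanh network acting as the right-hand side
    of the controlled system. theta = (W, b) plays the role of the control."""
    W, b = theta
    return np.tanh(W @ x + b)

def resnet_forward(x0, thetas, h=0.1):
    """Forward-Euler rollout x_{k+1} = x_k + h * f(x_k, theta_k):
    a residual network read as a discretized controlled ODE."""
    x = x0
    for theta in thetas:
        x = x + h * f(x, theta)
    return x

rng = np.random.default_rng(0)
d, depth = 4, 10
thetas = [(rng.standard_normal((d, d)), rng.standard_normal(d))
          for _ in range(depth)]                 # placeholder per-layer controls
x0 = rng.standard_normal(d)
print(resnet_forward(x0, thetas))                # terminal state; training would
                                                 # choose thetas to minimize a loss on it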

Theory of knowledge graphs

Knowledge graphs underpin many real-world applications, including challenges like semantic video and scene analysis as well as clinical and gene ontologies. We therefore plan to advance the theory of knowledge graphs, and in particular the applicability of DL to this domain.
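
For a flavor of how DL-style representations attach to knowledge graphs, embedding models such as TransE score a (head, relation, tail) triple by treating the relation as a translation in a learned vector space. The sketch below shows only the scoring function on randomly initialized, hypothetical entities ("gene_A", "disease_X", "associated_with"); real models learn these embeddings from the graph.

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility score for a triple (head, relation, tail):
    the relation is modeled as a translation, so h + r should lie close to t.
    A smaller distance (larger score) means a more plausible triple."""
    return -np.linalg.norm(h + r - t)

rng = np.random.default_rng(0)
dim = 8
# hypothetical placeholder entities and relation, randomly initialized
emb = {name: rng.standard_normal(dim)
       for name in ["gene_A", "disease_X", "associated_with"]}

print(transe_score(emb["gene_A"], emb["associated_with"], emb["disease_X"]))
```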

Deep representation learning

In many real-world situations, the problem of sparsely labeled or unlabeled data can be addressed by unsupervised or self-supervised learning. We aim to advance latent representation learning by developing methods that robustly incorporate additional priors from graphs or other covariates, and to leverage this for style transfer in our three key application areas.
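
One widely used self-supervised route to such latent representations is contrastive learning: embeddings of two views of the same sample are pulled together while embeddings of different samples are pushed apart. The NumPy sketch below computes an InfoNCE-style loss on random placeholder embeddings; in practice z1 and z2 would come from an encoder applied to two augmentations of each input.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Contrastive (InfoNCE-style) loss on two views of a batch.

    Row i of z1 and row i of z2 embed two views of the same sample (the
    positive pair); all other rows of z2 serve as negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # unit-normalize
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                      # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)           # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))                   # positives on the diagonal

rng = np.random.default_rng(0)
z1, z2 = rng.standard_normal((16, 32)), rng.standard_normal((16, 32))
print(info_nce_loss(z1, z2))
```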

Novel algorithms for network training

While DL has become increasingly important in countless application areas, to date the dominant strategy for training neural networks is some variant of stochastic gradient descent. Since network training is typically a non-smooth, non-convex optimization problem, we believe it is of immense value to explore more sophisticated optimization methods that may be better suited to this problem class.
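
One family of methods designed for non-smooth objectives is proximal gradient descent, which splits the objective into a smooth part handled by a gradient step and a non-smooth part handled in closed form via its proximal operator. The sketch below applies this scheme (ISTA) to a toy L1-regularized least-squares problem rather than to network training itself; it is meant only to illustrate the mechanism.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the (non-smooth) L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, b, lam=0.1, step=None, iters=200):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1:
    a gradient step on the smooth part, then the prox of the L1 term,
    instead of running plain (sub)gradient descent on the whole objective."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1 / Lipschitz constant of grad
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
print(ista(A, b))                                 # sparse solution vector
```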