Mathematical Optimization for Data Analysis

Description: Optimization problems and methods play a central role in data science and machine learning. The aim of this lecture is to introduce the mathematical foundations needed to apply specific optimization techniques in data analysis. The course is designed for students who are interested in a mathematical approach to this topic. Students from mathematics and financial mathematics, as well as from other departments (e.g., the Department of Computer and Information Science or the Department of Physics), are therefore very welcome.

Note: This lecture has been designed in coordination with Jun.-Prof. Dr. Tobias Sutter (Department of Computer and Information Science, University of Konstanz). Participants are therefore also encouraged to attend the lecture Optimization for Data Science.

Table of Contents: The lecture covers the following topics (a short illustrative sketch of topic 2 follows the list):

  1. Introduction

  2. Gradient Method Using Momentum

  3. Stochastic Gradient

  4. First-Order Methods for Constrained Optimization

  5. Nonsmooth Functions and Subgradients

  6. Nonsmooth Optimization Methods

  7. Duality and Algorithms
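
To give a flavour of the listed methods, here is a minimal, self-contained Python sketch of the gradient method with momentum (topic 2), applied to a toy quadratic function. The matrix A, the vector b, the step size, and the momentum coefficient are illustrative choices and are not taken from the lecture material.

    import numpy as np

    # Toy quadratic f(x) = 0.5 * x^T A x - b^T x with unique minimizer A^{-1} b.
    A = np.array([[3.0, 0.5],
                  [0.5, 1.0]])   # symmetric positive definite (illustrative)
    b = np.array([1.0, -2.0])

    def grad(x):
        # Gradient of the quadratic: A x - b
        return A @ x - b

    x = np.zeros(2)          # starting point
    v = np.zeros(2)          # momentum ("velocity") term
    alpha, beta = 0.1, 0.9   # step size and momentum coefficient (illustrative)

    for _ in range(200):
        v = beta * v - alpha * grad(x)   # heavy-ball style momentum update
        x = x + v

    print("iterate:        ", x)
    print("exact minimizer:", np.linalg.solve(A, b))

Setting beta = 0 recovers the plain gradient method; the stochastic gradient method (topic 3) would instead replace grad(x) by an unbiased estimate computed from a random subsample of the data.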

Required knowledge: The lecture builds primarily on the lecture Optimization I. Successful participation in the lecture Optimization II is recommended, but not strictly necessary. Knowledge of analysis (Analysis I & II) and linear algebra (Linear Algebra I & II), as well as basic measure theory (e.g., Analysis III) and stochastics (e.g., Stochastic Processes), is expected.

ECTS credits: 5 (2 hours of lectures plus 1 hour of exercises per week)

Type of exam: oral or written exam (depending on the number of students)

Literature: The lecture is partly based on the following book: Stephen J. Wright and Benjamin Recht, Optimization for Data Analysis, Cambridge University Press, 2022. DOI: 10.1017/9781009004282