General information
Lecturer
Prerequisites
The course is an advanced Master's course, and sufficient mathematical maturity is recommended. The course awards 6 credits.
Time and place
Monday 14:00-15:30, PWR 09 - room V 9.22
Thursday 11:30-13:00, PWR 07 - room V 7.41
Content
The course provides an in-depth treatment of classical and modern concepts in convex optimization that are relevant to control, decision-making, and machine-learning problems. The course is organized around the following four topics:
- Fundamentals of convex analysis
- Operator-splitting methods
- Distributed optimization
- Online convex optimization
After an introductory part covering classic and foundational concepts in convex optimization (convex sets and functions; Lagrangian and Fenchel duality; optimality conditions; gradient methods), we will focus on three state-of-the-art topics in convex optimization.
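To give a flavor of the introductory part, the following is a minimal sketch of a gradient method on a strongly convex quadratic. The problem data (Q, b) are hypothetical illustrative choices, not course material; the step size 1/L comes from the Lipschitz constant of the gradient.

```python
import numpy as np

# Minimal gradient-descent sketch on the strongly convex quadratic
# f(x) = 0.5 x^T Q x - b^T x (Q and b are hypothetical illustrative data).
Q = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])

def grad(x):
    return Q @ x - b  # gradient of f

L = np.linalg.eigvalsh(Q).max()  # Lipschitz constant of the gradient
x = np.zeros(2)
for _ in range(200):
    x = x - (1.0 / L) * grad(x)  # constant step size 1/L

x_star = np.linalg.solve(Q, b)  # closed-form minimizer, for comparison
```

With step size 1/L the iterates contract toward the unique minimizer at a linear rate, one of the basic convergence results covered in the introductory part.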
Operator-splitting methods are first-order methods based on monotone operator theory that are particularly well suited to non-smooth and large-scale problems, which often arise in control and learning applications.
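As a taste of the splitting idea, here is a sketch of the proximal-gradient (forward-backward splitting) method on a lasso problem, with hypothetical synthetic data: the smooth least-squares term is handled by a gradient ("forward") step and the non-smooth l1 term by its proximal operator ("backward" step), which is soft-thresholding.

```python
import numpy as np

# Forward-backward splitting sketch for the lasso problem
#   min_x 0.5 * ||A x - y||^2 + lam * ||x||_1
# (A, x_true, y, lam are hypothetical illustrative data).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.0])  # sparse ground truth
y = A @ x_true
lam = 0.1

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (handles the non-smooth part).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth gradient
x = np.zeros(5)
for _ in range(500):
    g = A.T @ (A @ x - y)                    # forward (gradient) step
    x = soft_threshold(x - g / L, lam / L)   # backward (proximal) step
```

The split matters because the l1 term is non-smooth, so plain gradient descent does not apply, while its proximal operator is cheap and explicit.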
Distributed optimization is a central paradigm for the development of networked infrastructures (e.g. smart cities, swarm robotics), where decisions must be made using only local computation and communication.
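A minimal illustration of the "local communication only" principle is consensus averaging, a building block of many distributed algorithms: agents on a ring graph repeatedly average with their neighbors and all converge to the network-wide mean. The agent values and ring topology below are hypothetical.

```python
import numpy as np

# Consensus-averaging sketch: each agent on a ring holds a private value
# and repeatedly averages it with its two neighbors only; all agents
# converge to the global average (values are hypothetical data).
values = np.array([4.0, 8.0, 15.0, 16.0, 23.0])  # one value per agent
n = len(values)

# Doubly stochastic mixing matrix for a ring: uniform weights over
# each agent and its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 1.0 / 3.0
    W[i, (i - 1) % n] = 1.0 / 3.0
    W[i, (i + 1) % n] = 1.0 / 3.0

x = values.copy()
for _ in range(300):
    x = W @ x  # one round of purely local communication
```

No agent ever sees the full vector of values, yet the network computes a global quantity; this is the mechanism underlying many distributed optimization schemes.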
Online convex optimization is a paradigm for sequential decision-making problems in which an agent must make decisions by solving a series of optimization problems online; this requires real-time-capable computations and the ability to act in the face of uncertainty.
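The canonical algorithm in this setting is online gradient descent: at each round the agent commits to a decision, the loss is revealed only afterwards, and a projected gradient step follows. The sketch below uses hypothetical quadratic losses f_t(x) = (x - z_t)^2 on the interval [-1, 1]; with step sizes eta_t = D/(G*sqrt(t)), the standard analysis bounds the regret against the best fixed decision by (3/2) G D sqrt(T).

```python
import numpy as np

# Online gradient descent (OGD) sketch: at round t the agent commits to
# x_t in [-1, 1] BEFORE the loss f_t(x) = (x - z_t)^2 is revealed.
# The target sequence z_t is hypothetical illustrative data.
rng = np.random.default_rng(1)
z = rng.uniform(-1.0, 1.0, size=100)  # losses revealed one per round

D, G = 2.0, 4.0  # diameter of [-1, 1] and gradient bound on it
x = 0.0
total_loss = 0.0
for t, z_t in enumerate(z, start=1):
    total_loss += (x - z_t) ** 2       # loss incurred at round t
    eta = D / (G * np.sqrt(t))         # step size from the OGD analysis
    x = x - eta * 2.0 * (x - z_t)      # gradient step
    x = min(1.0, max(-1.0, x))         # projection back onto [-1, 1]

u_star = z.mean()  # best fixed decision in hindsight for these losses
regret = total_loss - np.sum((u_star - z) ** 2)
```

The point of the regret comparison is that the agent, acting with no foresight, performs almost as well as the best single decision chosen with full knowledge of the loss sequence.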
The emphasis of the course is on methodological aspects, such as:
- the design principles behind the algorithms;
- the properties of the methods and the mathematical tools required to prove them;
- the most important features of state-of-the-art algorithms used in applications;
- the informed selection of the most suitable algorithm based on the problem's properties.
Information
The course is given in English and consists of a mix of lectures (where new material is explained) and tutorials (where the material presented in the lectures is applied through exercises).
The course features five graded homework assignments. They are optional but highly recommended: they are very useful preparation for the exam and, if completed sufficiently well, they earn a bonus point on the final grade.
Literature
- S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
- J.-B. Hiriart-Urruty and C. Lemaréchal. Fundamentals of Convex Analysis. Springer, Berlin, 2001.
- J. Nocedal and S. J. Wright. Numerical Optimization. Springer, New York, 2006.
- H. H. Bauschke and P. L. Combettes. Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, New York, 2011.
- A. Beck. First-Order Methods in Optimization. SIAM, 2017.
- G. Notarstefano, I. Notarnicola, and A. Camisa. Distributed Optimization for Smart Cyber-Physical Networks. Foundations and Trends in Systems and Control, 2019.
- E. Hazan. Introduction to Online Convex Optimization. Foundations and Trends in Optimization, 2016.
Exam
The exam is a written open-book exam (i.e., all non-electronic resources are permitted) and lasts 120 minutes.
Andrea Iannelli
Prof. Dr., Trustworthy Autonomy for Smart Adaptive Systems

Sebastian Schlor
M.Sc., Research Assistant