390043 UK VGSCO Course (2021S)
Optimization Methods for Data Science
Continuous assessment of course work
Registration/Deregistration
Note: The time of your registration within the registration period has no effect on the allocation of places (no first come, first served).
- Registration is open from Mo 05.04.2021 00:00 to Fr 16.04.2021 23:59
- Deregistration possible until Su 18.04.2021 23:59
Details
Language: English
Lecturers
Classes
Block, April 19-30, 2021
online
22 April 11.15 - 13.30
23 April 9.30 - 11.45
23 April 14.30 - 16.45
26 April 14.30 - 16.45
28 April 9.30 - 11.45
28 April 14.30 - 16.45
29 April 9.30 - 11.45
30 April 9.30 - 11.45
30 April 14.30 - 16.45
Information
Aims, contents and method of the course
This course gives an overview of optimization methods for data science that, thanks to the advent of the "Big Data era", have re-gained popularity in the last few years. We first review a number of classic methods in the context of modern real-world applications. Then, we discuss both theoretical and computational aspects of some variants of those classic methods. Finally, we examine current challenges and future research perspectives. Our presentation, strongly influenced by Nesterov's seminal book, includes the analysis of first-order methods, stochastic optimization methods, randomized and distributed methods, and projection-free methods. The theoretical tools considered in the analysis, together with the broad applicability of those methods, make the course quite interdisciplinary and potentially useful for PhD students in different areas (e.g., Analysis, Numerical Analysis, Operations Research, Probability and Mathematical Statistics). (Brief illustrative sketches of two of the methods listed below follow the outline.)

1. Methods for Unconstrained Optimization:
1.1 Gradient and accelerated gradient methods
1.3 Block-Coordinate approaches
1.4 Stochastic Gradient and its variants
1.5 Real-world Problems

2. Methods for Constrained Optimization, Projection-based and Projection-free Approaches:
2.1 Projected Gradient
2.2 Frank-Wolfe Method and its Variants
2.3 Real-world Problems

3. Challenges and Future Research
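As an orientation aid only (not part of the official course description), the following minimal Python sketch illustrates two of the unconstrained methods named in the outline above, plain gradient descent and stochastic gradient, on a least-squares problem; the synthetic data, step sizes, and iteration counts are arbitrary choices made for this illustration.

import numpy as np

# Least-squares objective f(x) = (1/2n) * ||A x - b||^2 on synthetic data.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def grad(x):
    # Full gradient of the least-squares objective.
    return A.T @ (A @ x - b) / n

def grad_i(x, i):
    # Gradient of the i-th summand, used by the stochastic method.
    return A[i] * (A[i] @ x - b[i])

# Gradient descent with constant step size 1/L (L: Lipschitz constant of the gradient).
L = np.linalg.eigvalsh(A.T @ A / n).max()
x_gd = np.zeros(d)
for _ in range(500):
    x_gd -= (1.0 / L) * grad(x_gd)

# Stochastic gradient: one random sample per iteration, diminishing step size.
x_sgd = np.zeros(d)
for k in range(5000):
    i = rng.integers(n)
    x_sgd -= (1.0 / (L * (1.0 + k / 100.0))) * grad_i(x_sgd, i)

print("GD  objective:", 0.5 * np.mean((A @ x_gd - b) ** 2))
print("SGD objective:", 0.5 * np.mean((A @ x_sgd - b) ** 2))

The stochastic variant touches only one data point per step, which is what makes it attractive for the large-scale problems considered in the course.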
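For the constrained methods in the second part of the outline, here is a similar illustrative sketch (again an assumption of this summary, not course material) comparing projected gradient and the Frank-Wolfe method on a quadratic objective over the probability simplex; the simplex is chosen only because both its Euclidean projection and its linear minimization oracle are easy to write down.

import numpy as np

rng = np.random.default_rng(1)
d = 20
Q = rng.standard_normal((d, d))
Q = Q @ Q.T / d                      # positive semidefinite quadratic term
c = rng.standard_normal(d)

def f(x):
    # f(x) = 1/2 x'Qx + c'x
    return 0.5 * x @ Q @ x + c @ x

def f_grad(x):
    return Q @ x + c

def project_simplex(v):
    # Euclidean projection onto {x >= 0, sum(x) = 1} via the standard sorting scheme.
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

# Projected gradient: gradient step, then projection back onto the feasible set.
L = np.linalg.eigvalsh(Q).max()
x_pg = np.full(d, 1.0 / d)
for _ in range(300):
    x_pg = project_simplex(x_pg - (1.0 / L) * f_grad(x_pg))

# Frank-Wolfe: the linear minimization oracle over the simplex returns a vertex,
# so the iterates stay feasible without any projection ("projection-free").
x_fw = np.full(d, 1.0 / d)
for k in range(300):
    g = f_grad(x_fw)
    s = np.zeros(d)
    s[np.argmin(g)] = 1.0            # vertex minimizing the linearized objective
    gamma = 2.0 / (k + 2.0)          # classic diminishing step size
    x_fw = (1.0 - gamma) * x_fw + gamma * s

print("projected gradient objective:", f(x_pg))
print("Frank-Wolfe objective       :", f(x_fw))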
Assessment and permitted materials
Homework assignments and/or a seminar
Minimum requirements and assessment criteria
A basic knowledge of linear algebra, calculus and probability theory.
Examination topics
Reading list
Beck, Amir. First-order methods in optimization. Society for Industrial and Applied Mathematics, 2017.
Bertsekas, Dimitri P. Convex optimization algorithms. Belmont: Athena Scientific, 2015.
Nesterov, Yurii. Introductory lectures on convex optimization: A basic course. Vol. 87. Springer Science & Business Media, 2003.
Association in the course directory
Last modified: Th 08.04.2021 10:09