Universität Wien

390036 UK VGSCO Derivative-Free Optimization: Basic Principles and State of the Art (2017W)

Continuous assessment course

Registration/Deregistration

Note: The time of your registration within the registration period has no effect on the allocation of places (no "first come, first served").

Details

Max. 50 participants
Language: English

Lecturers

Dates

Block: 27.11.-01.12.2017
Room: 3.307 (3rd floor)

Monday 08.30-13.00
Tuesday 10.30-15.00
Wednesday 15.00-19.00
Thursday 08.30-11.00
Friday 08.30-13.00


Information

Aims, contents and method of the course

This course deals with the numerical solution of optimization problems involving functions whose derivatives are unavailable, available only at prohibitive cost, or contaminated by noise, as may be the case when function evaluations result from computer simulation or laboratory observation.

In the field of Derivative-Free Optimization (DFO), one studies the development, analysis, and implementation of algorithms for such problems. A successful application of these algorithms requires them to be efficient (in the sense of taking a moderate number of function evaluations) and rigorous (meaning not too far from an algorithmic structure for which something meaningful can be proved under reasonable assumptions).

In this course we will describe the building blocks on which DFO algorithms are based and convey the intuition that helps in understanding why they are globally convergent (meaning convergent to stationarity independently of the starting point). In some cases it is possible to derive worst-case complexity bounds on the number of iterations or function evaluations needed to reach a given level of stationarity. Such global rates complement existing global convergence analyses by providing additional insight and enabling comparisons.

We will focus mainly on two well-known classes of methods: direct-search methods and model-based (trust-region type) algorithms. While the former are typically less efficient, their directional nature makes them better tailored to non-smooth and constrained settings.
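As a minimal illustrative sketch (not course material), the simplest direct-search method is coordinate (compass) search: it polls the 2n coordinate directions, which form a positive spanning set of R^n, accepts simple decrease, and halves the step size after an unsuccessful poll. All names below are chosen for this example.

```python
# Hedged sketch of compass search, the simplest direct-search method.
# Polls the 2n coordinate directions; an unsuccessful poll contracts
# the step size, which drives the iterates toward stationarity on
# smooth functions.
def compass_search(f, x, step=1.0, tol=1e-8, max_evals=10000):
    n = len(x)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(n):
            for sign in (1.0, -1.0):
                y = list(x)
                y[i] += sign * step      # poll along +/- e_i
                fy = f(y)
                evals += 1
                if fy < fx:              # simple decrease accepted
                    x, fx = y, fy
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                  # unsuccessful poll: contract
    return x, fx

# Minimize a smooth quadratic without using any derivatives.
xmin, fmin = compass_search(lambda v: (v[0] - 1.0)**2 + (v[1] + 2.0)**2,
                            [0.0, 0.0])
```

On this quadratic the method converges to the minimizer (1, -2); the directional nature of the polling is what carries over to non-smooth and constrained settings.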

We will, however, try to bridge to other methodologies (such as convex and sparse optimization, surrogate modeling, evolutionary optimization, multi-objective optimization, and global optimization), as time permits, but always from the point of view of DFO.

Type of assessment and permitted materials

A set of 400 slides will be provided to the students.

Further reading will come from papers and from the book:
A. R. Conn, K. Scheinberg, and L. N. Vicente, Introduction to Derivative-Free Optimization, MPS-SIAM Book Series on Optimization, SIAM, Philadelphia, 2009.

Minimum requirements and assessment criteria

Good knowledge of Vector Calculus, Linear Algebra, and Numerical Analysis.
Basic knowledge of Nonlinear Optimization and Probability Theory.

Examination topics

Sampling and modeling (positive spanning, polynomial interpolation and regression, surrogate models).
Direct-search methods and non-smooth calculus.
Model-based trust-region methods.
Global convergence, global rates, randomness.
Coverage of other methodologies and software.
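To illustrate the sampling-and-modeling ingredient listed above, the following sketch (an assumption about one simple instance, not the course's own code) fits a linear regression surrogate m(x) = c + gᵀx to sampled function values by least squares; such surrogates supply the gradient estimates used inside model-based trust-region methods.

```python
import numpy as np

# Hedged sketch: least-squares fit of a linear surrogate c + g^T x
# to sampled values of an unknown function (the "regression" item
# under sampling and modeling).
def linear_surrogate(points, values):
    """Return intercept c and gradient estimate g of the fitted model."""
    P = np.asarray(points, dtype=float)
    A = np.hstack([np.ones((len(P), 1)), P])   # columns: 1, x_1, ..., x_n
    coef, *_ = np.linalg.lstsq(A, np.asarray(values, dtype=float),
                               rcond=None)
    return coef[0], coef[1:]

# Sample f(x) = 3 + 2*x1 - x2 at a few points; with exact linear data
# and a full-rank sample set, the fit recovers the coefficients.
pts = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 1]]
vals = [3 + 2*p[0] - p[1] for p in pts]
c, g = linear_surrogate(pts, vals)
```

With noisy evaluations the same least-squares fit averages the noise out, which is one reason regression models appear alongside interpolation in DFO.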

Literature

The following survey paper provides an introduction to the topic:

A. L. Custódio, K. Scheinberg, and L. N. Vicente, Methodologies and software for derivative-free optimization, Chapter 37 of Advances and Trends in Optimization with Engineering Applications, T. Terlaky, M. F. Anjos, and S. Ahmed (editors), MOS-SIAM Book Series on Optimization, SIAM, Philadelphia, 2017

A PDF file is available from my page: http://www.mat.uc.pt/~lnv/

Assignment within the course catalogue

Last modified: Mon 07.09.2020 15:46