License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/DagRep.12.2.87
URN: urn:nbn:de:0030-drops-169325
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2022/16932/
Auger, Anne ;
Fonseca, Carlos M. ;
Friedrich, Tobias ;
Lengler, Johannes ;
Gissler, Armand
Further contributors (editors, etc.): Anne Auger, Carlos M. Fonseca, Tobias Friedrich, Johannes Lengler, and Armand Gissler
Theory of Randomized Optimization Heuristics (Dagstuhl Seminar 22081)
Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 22081 "Theory of Randomized Optimization Heuristics".
This seminar is part of a biennial seminar series. This year, we focused on connections between classical topics of the community, such as Evolutionary Algorithms and Strategies (EA, ES), Estimation-of-Distribution Algorithms (EDA), and Evolutionary Multi-Objective Optimization (EMO), and related fields like Stochastic Gradient Descent (SGD) and Bayesian Optimization (BO). The mixture proved extremely successful: already the first talk turned into a two-hour-long, vivid, and productive plenary discussion. The seminar was smaller than previous editions (due to COVID-19 regulations), but its intensity more than made up for the smaller size.
BibTeX Entry
@Article{auger_et_al:DagRep.12.2.87,
  author    = {Auger, Anne and Fonseca, Carlos M. and Friedrich, Tobias and Lengler, Johannes and Gissler, Armand},
  title     = {{Theory of Randomized Optimization Heuristics (Dagstuhl Seminar 22081)}},
  pages     = {87--102},
  journal   = {Dagstuhl Reports},
  ISSN      = {2192-5283},
  year      = {2022},
  volume    = {12},
  number    = {2},
  editor    = {Auger, Anne and Fonseca, Carlos M. and Friedrich, Tobias and Lengler, Johannes and Gissler, Armand},
  publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address   = {Dagstuhl, Germany},
  URL       = {https://drops.dagstuhl.de/opus/volltexte/2022/16932},
  URN       = {urn:nbn:de:0030-drops-169325},
  doi       = {10.4230/DagRep.12.2.87},
  annote    = {Keywords: black-box optimization, derivative-free optimization, evolutionary and genetic algorithms, randomized search algorithms, stochastic gradient descent, theoretical computer science}
}
Keywords: black-box optimization, derivative-free optimization, evolutionary and genetic algorithms, randomized search algorithms, stochastic gradient descent, theoretical computer science
Collection: DagRep, Volume 12, Issue 2
Issue Date: 2022
Date of publication: 23.08.2022