Publication details

Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization

Authored by
Julia Guerrero-Viu, Sven Hauns, Sergio Izquierdo, Guilherme Miotto, Simon Schrodi, André Biedenkapp, Thomas Elsken, Difan Deng, Marius Lindauer, Frank Hutter
Abstract

Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding a suitable deep neural network architecture and tuning the hyperparameters of the training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there is little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, for example to take resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at github.com/automl/multi-obj-baselines.
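The baselines described in the abstract jointly search over architecture choices and training hyperparameters while tracking several objectives at once. As an illustrative sketch only (not the authors' implementation from the linked repository), the following shows the core idea behind the simplest such baseline: multi-objective random search over a hypothetical joint NAS + HPO space, keeping the Pareto-non-dominated evaluations. The search space, the two objectives (validation error and model size), and the `evaluate` stand-in are all assumptions made for illustration.

```python
import random


def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]


def sample_config(rng):
    # Hypothetical joint search space: architecture choices (depth, width)
    # plus one training hyperparameter (learning rate) -- NAS + HPO jointly.
    return {
        "depth": rng.choice([2, 4, 8]),
        "width": rng.choice([32, 64, 128]),
        "lr": 10 ** rng.uniform(-4, -1),
    }


def evaluate(cfg):
    # Stand-in for an actual training run. A real baseline would train the
    # network and report (validation error, resource cost); here we use a
    # synthetic trade-off so the sketch is runnable.
    size = cfg["depth"] * cfg["width"] ** 2  # crude parameter-count proxy
    error = 1.0 / (1.0 + 0.001 * size) + abs(cfg["lr"] - 0.01)
    return (error, size)


def random_search_mo(n_samples=50, seed=0):
    """Multi-objective random search: sample, evaluate, keep the Pareto front."""
    rng = random.Random(seed)
    points = [evaluate(sample_config(rng)) for _ in range(n_samples)]
    return pareto_front(points)
```

The returned front exposes the error/size trade-off instead of a single "best" configuration; the paper's stronger baselines replace the random sampler with informed search strategies while keeping this multi-objective selection step.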

Organisational unit(s)
Institut für Informationsverarbeitung
Fachgebiet Maschinelles Lernen
External organisation(s)
Albert-Ludwigs-Universität Freiburg
Bosch Center for Artificial Intelligence (BCAI)
Type
Article in conference proceedings
Number of pages
22
Publication date
2021
Publication status
Published electronically (e-pub)
Peer-reviewed
Yes
Electronic version(s)
https://arxiv.org/abs/2105.01015 (Access: open)
https://openreview.net/forum?id=yEGlj93aLFY (Access: open)