Publication Details

Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization

authored by
Julia Guerrero-Viu, Sven Hauns, Sergio Izquierdo, Guilherme Miotto, Simon Schrodi, André Biedenkapp, Thomas Elsken, Difan Deng, Marius Lindauer, Frank Hutter
Abstract

Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the training pipeline used. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there exists little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at github.com/automl/multi-obj-baselines.
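To make the setting concrete, below is a minimal sketch of the kind of baseline the abstract describes: a single joint configuration space mixing architectural choices (NAS) with training hyperparameters (HPO), searched for a Pareto front over two objectives. This is an illustrative assumption, not code from the paper or the linked repository; the search space, the surrogate evaluate function, and all names are hypothetical, and the simplest such baseline shown here is multi-objective random search.

```python
# Illustrative sketch only: multi-objective random search over a *joint*
# architecture + hyperparameter space, keeping the Pareto front of two
# objectives to be minimized. The search space and the surrogate
# `evaluate` function are hypothetical stand-ins for real training runs.
import math
import random

SEARCH_SPACE = {
    # architectural choices (NAS part)
    "num_layers": [2, 4, 8, 16],
    "num_channels": [16, 32, 64, 128],
    # training hyperparameters (HPO part)
    "learning_rate": (1e-4, 1e-1),  # sampled log-uniformly from this range
    "batch_size": [32, 64, 128, 256],
}


def sample_config(rng):
    """Draw one joint architecture + hyperparameter configuration."""
    lo, hi = SEARCH_SPACE["learning_rate"]
    return {
        "num_layers": rng.choice(SEARCH_SPACE["num_layers"]),
        "num_channels": rng.choice(SEARCH_SPACE["num_channels"]),
        "learning_rate": 10 ** rng.uniform(math.log10(lo), math.log10(hi)),
        "batch_size": rng.choice(SEARCH_SPACE["batch_size"]),
    }


def evaluate(config):
    """Hypothetical stand-in for training and measuring a network.

    Returns (validation_error, model_size); a real baseline would train
    the sampled architecture with the sampled hyperparameters and measure
    both objectives.
    """
    size = config["num_layers"] * config["num_channels"] ** 2
    # Fake trade-off: larger models get lower error, and hyperparameters
    # matter (learning rates far from 1e-2 are penalized).
    error = 1.0 / (1.0 + 1e-4 * size)
    error += 0.1 * abs(math.log10(config["learning_rate"]) + 2)
    return (error, float(size))


def pareto_front(results):
    """Keep configurations whose objective vectors are not dominated
    (both objectives are minimized)."""
    front = []
    for config, objs in results:
        dominated = any(
            all(o2 <= o1 for o1, o2 in zip(objs, other)) and other != objs
            for _, other in results
        )
        if not dominated:
            front.append((config, objs))
    return front


if __name__ == "__main__":
    rng = random.Random(0)
    results = []
    for _ in range(50):
        config = sample_config(rng)
        results.append((config, evaluate(config)))
    for config, (error, size) in pareto_front(results):
        print(f"error={error:.3f}  size={size:>8.0f}  {config}")
```

Because architecture and hyperparameters are sampled from one space, a single optimizer can trade them off against each other, which is the point of joint NAS + HPO; the paper's methods replace the random sampler above with multi-objective extensions of existing NAS and HPO approaches.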

Organisation(s)
Institute of Information Processing
Machine Learning Section
External Organisation(s)
University of Freiburg
Bosch Center for Artificial Intelligence (BCAI)
Type
Conference contribution
No. of pages
22
Publication date
2021
Publication status
E-pub ahead of print
Peer reviewed
Yes
Electronic version(s)
https://arxiv.org/abs/2105.01015 (Access: Open)
https://openreview.net/forum?id=yEGlj93aLFY (Access: Open)