Publication Details

AutoML for Multi-Label Classification: Overview and Empirical Evaluation

Authored by
Marcel Wever, Alexander Tornede, Felix Mohr, Eyke Hüllermeier
Abstract

Automated machine learning (AutoML) supports the algorithmic construction and data-specific customization of machine learning pipelines, including the selection, combination, and parametrization of machine learning algorithms as main constituents. Generally speaking, AutoML approaches comprise two major components: a search space model and an optimizer for traversing the space. Recent approaches have shown impressive results in the realm of supervised learning, most notably (single-label) classification (SLC). Moreover, first attempts at extending these approaches towards multi-label classification (MLC) have been made. While the space of candidate pipelines is already huge in SLC, the complexity of the search space is raised to an even higher power in MLC. One may wonder, therefore, whether and to what extent optimizers established for SLC can scale to this increased complexity, and how they compare to each other. This paper makes the following contributions: First, we survey existing approaches to AutoML for MLC. Second, we augment these approaches with optimizers not previously tried for MLC. Third, we propose a benchmarking framework that supports a fair and systematic comparison. Fourth, we conduct an extensive experimental study, evaluating the methods on a suite of MLC problems. We find a grammar-based best-first search to compare favorably to other optimizers.

External Organisation(s)
Paderborn University
Type
Article
Journal
IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume
43
Pages
3037–3054
No. of pages
18
ISSN
0162-8828
Publication date
13.01.2021
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Software, Computer Vision and Pattern Recognition, Computational Theory and Mathematics, Artificial Intelligence, Applied Mathematics
Electronic version(s)
https://doi.org/10.1109/TPAMI.2021.3051276 (Access: Open)