Publication details

Neural Networks for Predicting Algorithm Runtime Distributions

Authored by
Katharina Eggensperger, Marius Lindauer, Frank Hutter
Abstract

Many state-of-the-art algorithms for solving hard combinatorial problems in artificial intelligence (AI) include elements of stochasticity that lead to high variations in runtime, even for a fixed problem instance. Knowledge about the resulting runtime distributions (RTDs) of algorithms on given problem instances can be exploited in various meta-algorithmic procedures, such as algorithm selection, portfolios, and randomized restarts. Previous work has shown that machine learning can be used to individually predict the mean, median, and variance of RTDs. To establish a new state of the art in predicting RTDs, we demonstrate that the parameters of an RTD should be learned jointly and that neural networks can do this well by directly optimizing the likelihood of an RTD given runtime observations. In an empirical study involving five algorithms for SAT solving and AI planning, we show that neural networks predict the true RTDs of unseen instances better than previous methods, and can even do so when only a few runtime observations are available per training instance.
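
As a rough illustration of the approach described in the abstract, the sketch below trains a small network to output the parameters of a parametric RTD and fits both parameters jointly by minimizing the negative log-likelihood of observed runtimes. The distribution family (lognormal), network size, feature dimensionality, and training data are illustrative assumptions, not the paper's exact setup.

import torch
import torch.nn as nn


class RTDNet(nn.Module):
    """Small MLP mapping instance features to the parameters (mu, sigma)
    of a lognormal runtime distribution. All sizes are illustrative."""

    def __init__(self, n_features: int, n_hidden: int = 16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, n_hidden), nn.Tanh(),
        )
        self.mu_head = nn.Linear(n_hidden, 1)
        self.log_sigma_head = nn.Linear(n_hidden, 1)  # log-parameterized so sigma > 0

    def forward(self, x: torch.Tensor) -> torch.distributions.LogNormal:
        h = self.body(x)
        mu = self.mu_head(h).squeeze(-1)
        sigma = self.log_sigma_head(h).squeeze(-1).exp()
        return torch.distributions.LogNormal(mu, sigma)


def nll(model: RTDNet, features: torch.Tensor, runtimes: torch.Tensor) -> torch.Tensor:
    """Negative log-likelihood of observed runtimes under the predicted RTDs;
    both distribution parameters are learned jointly through this single loss."""
    return -model(features).log_prob(runtimes).mean()


# Toy usage with random data (hypothetical: 32 instances, 10 features,
# one positive runtime observation per instance).
model = RTDNet(n_features=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
features = torch.rand(32, 10)
runtimes = torch.rand(32).exp()  # strictly positive runtimes
for _ in range(200):
    opt.zero_grad()
    loss = nll(model, features, runtimes)
    loss.backward()
    opt.step()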

External organization(s)
Albert-Ludwigs-Universität Freiburg
Type
Article in conference proceedings
Pages
1442-1448
Number of pages
7
Publication date
2018
Publication status
Published
Peer-reviewed
Yes
ASJC Scopus subject areas
Artificial intelligence
Electronic version(s)
https://arxiv.org/abs/1709.07615 (Access: open)
https://doi.org/10.24963/ijcai.2018/200 (Access: open)