Publication Details

Neural Networks for Predicting Algorithm Runtime Distributions

authored by
Katharina Eggensperger, Marius Lindauer, Frank Hutter
Abstract

Many state-of-the-art algorithms for solving hard combinatorial problems in artificial intelligence (AI) include elements of stochasticity that lead to high variations in runtime, even for a fixed problem instance. Knowledge about the resulting runtime distributions (RTDs) of algorithms on given problem instances can be exploited in various meta-algorithmic procedures, such as algorithm selection, portfolios, and randomized restarts. Previous work has shown that machine learning can be used to individually predict the mean, median, and variance of RTDs. To establish a new state of the art in predicting RTDs, we demonstrate that the parameters of an RTD should be learned jointly and that neural networks can do this well by directly optimizing the likelihood of an RTD given runtime observations. In an empirical study involving five algorithms for SAT solving and AI planning, we show that neural networks predict the true RTDs of unseen instances better than previous methods, and can even do so when only a few runtime observations are available per training instance.
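To illustrate the general idea described in the abstract, the sketch below shows a small feed-forward network that jointly predicts the parameters of a parametric RTD from instance features and is trained by minimizing the negative log-likelihood of observed runtimes. This is not the authors' implementation; the choice of a lognormal distribution family, the network sizes, PyTorch as the framework, and all names (RTDNet, nll) are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not the paper's code): a network that maps
# instance features to the parameters (mu, sigma) of an assumed lognormal RTD,
# trained by minimizing the negative log-likelihood of observed runtimes.
import torch
import torch.nn as nn


class RTDNet(nn.Module):
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_features, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
        )
        # One head jointly outputs both distribution parameters.
        self.head = nn.Linear(hidden, 2)

    def forward(self, x):
        mu, log_sigma = self.head(self.body(x)).unbind(dim=-1)
        return mu, log_sigma.exp()  # exponentiate to keep sigma > 0


def nll(mu, sigma, runtimes):
    # Negative log-likelihood of runtimes under Lognormal(mu, sigma).
    dist = torch.distributions.LogNormal(mu, sigma)
    return -dist.log_prob(runtimes).mean()


# Toy training loop on synthetic data (for illustration only).
torch.manual_seed(0)
X = torch.randn(256, 10)          # instance features
y = torch.rand(256) * 10 + 0.1    # observed runtimes in seconds (positive)
model = RTDNet(n_features=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    mu, sigma = model(X)
    loss = nll(mu, sigma, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because both parameters come from a shared network and a single likelihood objective, they are fitted jointly rather than by separate regressors for mean, median, or variance, which is the modeling choice the abstract argues for.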

External Organisation(s)
University of Freiburg
Type
Conference contribution
Pages
1442-1448
No. of pages
7
Publication date
2018
Publication status
Published
Peer reviewed
Yes
ASJC Scopus subject areas
Artificial Intelligence
Electronic version(s)
https://arxiv.org/abs/1709.07615 (Access: Open)
https://doi.org/10.24963/ijcai.2018/200 (Access: Open)