Paper Title
Simple and Scalable Parallelized Bayesian Optimization
Paper Authors
Paper Abstract
In recent years, leveraging parallel and distributed computational resources has become essential for solving computationally expensive problems. Bayesian optimization (BO) has shown attractive results on expensive-to-evaluate problems such as hyperparameter optimization of machine learning algorithms. Although many parallel BO methods have been developed to exploit these computational resources efficiently, they either assume a synchronous setting or do not scale. In this paper, we propose a simple and scalable BO method for asynchronous parallel settings. Experiments on a benchmark function and on hyperparameter optimization of multi-layer perceptrons demonstrate the promising performance of the proposed method.
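To make the asynchronous parallel setting concrete, the following is a minimal sketch of one common scheme: workers evaluate the objective concurrently, and whenever any worker finishes, a new point is suggested immediately using the "constant liar" heuristic to account for still-pending evaluations. This is a generic illustration, not the paper's specific method; the Gaussian-process surrogate, the lower-confidence-bound acquisition, the toy `objective`, and parameters such as `kappa` and the kernel length scale are all illustrative assumptions.

```python
# Generic asynchronous parallel Bayesian optimization sketch (not the
# paper's method): GP surrogate + lower confidence bound acquisition,
# with the constant-liar heuristic for pending evaluations.
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait
import numpy as np

np.random.seed(0)

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between 1-D point arrays.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    # Standard zero-mean GP regression posterior mean and variance.
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_query, x_train)
    mu = Ks @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.maximum(var, 1e-12)

def objective(x):
    # Toy 1-D function to minimize, standing in for an expensive evaluation.
    return (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

def suggest(xs, ys, pending, kappa=2.0, n_cand=200):
    # Constant liar: pretend each pending point returned the current best
    # value, so concurrent suggestions spread out without waiting.
    lie = min(ys)
    x_tr = np.array(xs + pending)
    y_tr = np.array(ys + [lie] * len(pending))
    cand = np.random.rand(n_cand)
    mu, var = gp_posterior(x_tr, y_tr, cand)
    lcb = mu - kappa * np.sqrt(var)  # lower confidence bound (minimization)
    return float(cand[np.argmin(lcb)])

xs = list(np.random.rand(3))              # initial design
ys = [float(objective(x)) for x in xs]
pending = []

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {}
    for _ in range(4):                    # fill all workers immediately
        x = suggest(xs, ys, pending)
        pending.append(x)
        futures[pool.submit(objective, x)] = x
    for _ in range(20):                   # react to each completion as it arrives
        done, _ = wait(futures, return_when=FIRST_COMPLETED)
        f = done.pop()
        x = futures.pop(f)
        pending.remove(x)
        xs.append(x)
        ys.append(float(f.result()))
        x_new = suggest(xs, ys, pending)  # resubmit without waiting for others
        pending.append(x_new)
        futures[pool.submit(objective, x_new)] = x_new

best = xs[int(np.argmin(ys))]
print(best)
```

The key contrast with a synchronous batch method is in the loop: a new suggestion is issued the moment any single evaluation finishes, so no worker idles waiting for the slowest member of a batch.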