Optuna: A Next-generation Hyperparameter Optimization Framework

Authors: Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, Masanori Koyama

Published: 2019 (Conference Paper)

Source: ACM SIGKDD International Conference on Knowledge Discovery and Data Mining

Algorithm: Optuna

arXiv: 1907.10902

DOI: 10.1145/3292500.3330701

Summary

Introduces Optuna as a define-by-run hyperparameter optimization framework that lets users construct dynamic search spaces in ordinary Python code. The paper emphasizes practical system design: search and pruning algorithms, simple single-machine use, and scalable distributed execution.
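The define-by-run idea can be illustrated with a toy random sampler. This is a simplified sketch of the principle, not Optuna's actual implementation: the `Trial` class and `optimize` loop below are hypothetical stand-ins whose method names merely mirror Optuna's `suggest_*` style. The key point is that later suggestions (the per-layer widths) depend on an earlier one (the number of layers), so the search space is whatever the Python code path constructs at run time.

```python
import random


class Trial:
    """Minimal stand-in for an Optuna trial: parameters are sampled on
    demand, so the search space is defined by the code actually executed."""

    def __init__(self):
        self.params = {}

    def suggest_int(self, name, low, high):
        self.params[name] = random.randint(low, high)
        return self.params[name]

    def suggest_float(self, name, low, high):
        self.params[name] = random.uniform(low, high)
        return self.params[name]


def objective(trial):
    # Define-by-run: how many width parameters exist depends on n_layers,
    # which is itself sampled inside the objective.
    n_layers = trial.suggest_int("n_layers", 1, 3)
    widths = [trial.suggest_int(f"width_{i}", 4, 64) for i in range(n_layers)]
    lr = trial.suggest_float("lr", 1e-5, 1e-1)
    # Toy score standing in for a validation loss.
    return sum(widths) * lr


def optimize(objective, n_trials):
    """Pure random search: run the objective n_trials times, keep the best."""
    best_trial, best_value = None, float("inf")
    for _ in range(n_trials):
        trial = Trial()
        value = objective(trial)
        if value < best_value:
            best_trial, best_value = trial, value
    return best_trial, best_value


best, value = optimize(objective, 50)
```

In the real library the same shape of objective is passed to `study.optimize`, with a sampler (e.g. TPE) replacing the uniform random draws above.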

Abstract

The purpose of this study is to introduce new design criteria for next-generation hyperparameter optimization software. The criteria we propose include (1) a define-by-run API that allows users to construct the parameter search space dynamically, (2) efficient implementation of both searching and pruning strategies, and (3) an easy-to-setup, versatile architecture that can be deployed for various purposes, ranging from scalable distributed computing to light-weight experiments conducted via an interactive interface. To prove our point, we introduce Optuna, an optimization framework that is the culmination of our effort to develop next-generation optimization software. As an optimization framework designed on the define-by-run principle, Optuna is the first of its kind. We present the design techniques that became necessary to build software meeting the above criteria, and demonstrate the power of our new design through experimental results and real-world applications. Our software is available under the MIT license (https://github.com/pfnet/optuna/).
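Pruning, the second criterion, terminates unpromising trials early based on their intermediate values. The following is a minimal sketch of median-style pruning under simplified assumptions (synthetic decaying "loss" curves, a hand-rolled `should_prune` helper); it illustrates the idea, not Optuna's actual MedianPruner implementation:

```python
import random
import statistics


def should_prune(completed_curves, step, value, min_trials=5):
    """Prune if the current intermediate value is worse (higher) than the
    median intermediate value of completed trials at the same step."""
    past = [curve[step] for curve in completed_curves if step < len(curve)]
    return len(past) >= min_trials and value > statistics.median(past)


completed = []  # intermediate-value curves of trials that ran to completion
pruned = 0
random.seed(0)

for _ in range(30):
    start = random.uniform(0.5, 2.0)  # toy "loss" that decays each step
    curve = []
    for step in range(10):
        value = start * (0.9 ** step)
        if should_prune(completed, step, value):
            pruned += 1  # stop this trial early; don't record its curve
            break
        curve.append(value)
    else:
        completed.append(curve)
```

In Optuna itself, the objective reports intermediate values to the trial and asks whether to stop, so pruning composes with the same define-by-run objective used for search.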

Tags

  • Optuna

  • Hyperparameter optimization

  • Black-box optimization

  • Define-by-run

  • Pruning

  • Distributed optimization

  • Machine learning systems