This page contains a list of code examples written with Optuna.
The simplest code block looks like this:

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", -100, 100)
    return x ** 2


if __name__ == "__main__":
    study = optuna.create_study()
    # The optimization finishes after evaluating 1000 trials or
    # after 3 seconds, whichever comes first.
    study.optimize(objective, n_trials=1000, timeout=3)
    print(f"Best params: {study.best_params} (value: {study.best_value})")
```
The examples below provide similar code blocks for a variety of scenarios and libraries:
- AllenNLP
- AllenNLP (Jsonnet)
- Catalyst
- CatBoost
- Chainer
- ChainerMN
- Dask-ML
- FastAI V1
- FastAI V2
- Haiku
- Gluon
- Keras
- LightGBM
- LightGBM Tuner
- MXNet
- PyTorch
- PyTorch Ignite
- PyTorch Lightning
- PyTorch Lightning (DDP)
- RAPIDS
- Scikit-learn
- Scikit-learn OptunaSearchCV
- Scikit-image
- SKORCH
- TensorFlow
- TensorFlow (eager)
- XGBoost
The following example demonstrates how to use Optuna Dashboard.
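A minimal sketch of the dashboard workflow, assuming the separate `optuna-dashboard` package is installed (the study name and SQLite path below are illustrative):

```python
import optuna

# Persist trials to a storage backend that the dashboard can read.
study = optuna.create_study(
    study_name="dashboard-example",
    storage="sqlite:///example.db",
    load_if_exists=True,
)
study.optimize(lambda trial: (trial.suggest_float("x", -10, 10) - 2) ** 2, n_trials=100)

# Then, in a shell, launch the dashboard against the same storage and
# open http://localhost:8080 in your browser:
#
#   optuna-dashboard sqlite:///example.db
```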
The following example demonstrates how to implement an objective function that uses additional arguments other than `trial`.
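A minimal sketch of this pattern, using a lambda to forward the extra argument (the `offset` parameter is illustrative; `functools.partial` or a callable class works equally well):

```python
import optuna


def objective(trial, offset):
    x = trial.suggest_float("x", -100, 100)
    return (x - offset) ** 2


if __name__ == "__main__":
    study = optuna.create_study()
    # Wrap the objective so the extra argument is bound at call time.
    study.optimize(lambda trial: objective(trial, offset=3.0), n_trials=100)
```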
The following example demonstrates how to implement pruning logic with Optuna.
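A minimal sketch of manual pruning, where the objective reports intermediate values and raises `optuna.TrialPruned` when the pruner judges the trial unpromising (the toy loop stands in for a real training loop):

```python
import optuna


def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    value = 1.0
    for step in range(100):
        value *= 1.0 - lr  # stand-in for one training epoch

        # Report the intermediate value so the pruner can inspect it.
        trial.report(value, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return value


if __name__ == "__main__":
    study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
    study.optimize(objective, n_trials=50)
```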
In addition, integration modules are available for the following libraries, providing simpler interfaces to utilize pruning (a sketch of one such callback follows the list):
- Pruning with Catalyst integration module
- Pruning with CatBoost integration module
- Pruning with Chainer integration module
- Pruning with ChainerMN integration module
- Pruning with FastAI V1 integration module
- Pruning with FastAI V2 integration module
- Pruning with Keras integration module
- Pruning with LightGBM integration module
- Pruning with MXNet integration module
- Pruning with PyTorch integration module
- Pruning with PyTorch Ignite integration module
- Pruning with PyTorch Lightning integration module
- Pruning with PyTorch Lightning integration module (DDP)
- Pruning with TensorFlow integration module
- Pruning with XGBoost integration module
- Pruning with XGBoost integration module (cross validation, XGBoost.cv)
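As a sketch of how these integration modules are typically wired in, here is the XGBoost pruning callback; the dataset and search space are illustrative, and the observation key must match the evaluation set name and metric passed to `xgb.train`:

```python
import optuna
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split


def objective(trial):
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.25)
    dtrain = xgb.DMatrix(X_train, label=y_train)
    dvalid = xgb.DMatrix(X_valid, label=y_valid)

    params = {
        "objective": "binary:logistic",
        "eval_metric": "auc",
        "max_depth": trial.suggest_int("max_depth", 2, 10),
        "eta": trial.suggest_float("eta", 1e-3, 0.3, log=True),
    }

    # The callback prunes the trial when the watched metric looks unpromising;
    # "validation-auc" combines the eval set name and the metric above.
    pruning_callback = optuna.integration.XGBoostPruningCallback(trial, "validation-auc")
    bst = xgb.train(
        params,
        dtrain,
        num_boost_round=100,
        evals=[(dvalid, "validation")],
        callbacks=[pruning_callback],
    )

    preds = bst.predict(dvalid)
    return float(((preds > 0.5) == y_valid).mean())


if __name__ == "__main__":
    study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
    study.optimize(objective, n_trials=50)
```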
Optuna is also used by or integrated into the following external projects:
- Hugging Face Trainer's hyperparameter search
- Allegro Trains
- BBO-Rietveld: Automated crystal structure refinement
- Catalyst
- CuPy
- Hydra's Optuna Sweeper plugin
- Mozilla Voice STT
- neptune.ai
- OptGBM: A scikit-learn compatible LightGBM estimator with Optuna
- Optuna-distributed
- PyKEEN
- RL Baselines Zoo
- Hyperparameter Optimization for Machine Learning, code repository for online course
PRs to add additional projects are welcome!
You can use our Docker images with tags ending in `-dev` to run most of the examples.
For example, you can run the PyTorch Simple example via:

```bash
docker run --rm -v $(pwd):/prj -w /prj optuna/optuna:py3.7-dev python pytorch/pytorch_simple.py
```
Also, you can try our visualization example in Jupyter Notebook by opening `localhost:8888`
in your browser after executing this:

```bash
docker run -p 8888:8888 --rm optuna/optuna:py3.7-dev jupyter notebook --allow-root --no-browser --port 8888 --ip 0.0.0.0 --NotebookApp.token='' --NotebookApp.password=''
```