Bayesian Optimization and Model-Based Optimization of Expensive Black-Box Functions

Flexible and comprehensive R toolbox for model-based optimization ('MBO'), also known as Bayesian optimization. It implements the Efficient Global Optimization algorithm and is designed for both single- and multi-objective optimization with mixed continuous, categorical and conditional parameters. The machine learning toolbox 'mlr' provides dozens of regression learners to model the performance of the target algorithm with respect to its parameter settings. 'mlrMBO' offers many different infill criteria to guide the search process. Additional features include multi-point batch proposals, parallel execution, visualization and sophisticated logging mechanisms, which are especially useful for teaching and for understanding algorithm behavior. 'mlrMBO' is implemented in a modular fashion, so that single components can easily be replaced or adapted by the user for specific use cases.


Model-based optimization with mlr.


We recommend installing the official release version:
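The release installation can be sketched as follows, run from an R session (the standard CRAN install):

```r
# Install the released version of mlrMBO from CRAN
install.packages("mlrMBO")
```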


For experimental use, you can install the latest development version:
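Assuming the development sources live in the mlr-org/mlrMBO repository on GitHub, the development version can be installed via the remotes package:

```r
# Install the development version from GitHub (requires the 'remotes' package)
install.packages("remotes")
remotes::install_github("mlr-org/mlrMBO")
```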



MBO demo

mlrMBO is a highly configurable R toolbox for model-based / Bayesian optimization of black-box functions.


  • EGO-type algorithms (Kriging with expected improvement) on purely numerical search spaces, see Jones et al. (1998)
  • Mixed search spaces with numerical, integer, categorical and subordinate parameters
  • Arbitrary parameter transformations, allowing optimization on, e.g., a log scale
  • Optimization of noisy objective functions
  • Multi-criteria optimization with approximated Pareto fronts
  • Parallelization through multi-point batch proposals
  • Parallelization on many parallel back-ends and clusters through batchtools and parallelMap

For the surrogate, mlrMBO allows any regression learner from mlr, including:

  • Kriging, a.k.a. Gaussian processes (e.g. via DiceKriging)
  • Random forests (e.g. via randomForest)
  • and many more...

Various infill criteria (aka. acquisition functions) are available:

  • Expected improvement (EI)
  • Upper/lower confidence bound (LCB; the statistical lower or upper bound of the surrogate prediction)
  • Augmented expected improvement (AEI)
  • Expected quantile improvement (EQI)
  • API for custom infill criteria

Objective functions are created with package smoof, which also offers many test functions for example runs or benchmarks.

Parameter spaces and initial designs are created with package ParamHelpers.
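As a minimal sketch of how these packages interact (smoof's makeBraninFunction() is used here only as an example objective; all other settings are left at their defaults):

```r
library(mlrMBO)  # also loads mlr, ParamHelpers and smoof

# Example objective: the 2-d Branin test function from smoof
obj.fun = makeBraninFunction()

# Default single-objective control, terminating after 10 sequential iterations
ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10L)

# Run MBO; on a purely numeric search space a Kriging surrogate is used by default
res = mbo(obj.fun, control = ctrl)

res$x  # best parameter setting found
res$y  # corresponding objective value
```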

mlrMBO - How to Cite and Citing Publications

Please cite our arXiv paper (preprint). You can get citation info via citation("mlrMBO") or copy the following BibTeX entry:

@article{mlrMBO,
  title = {{{mlrMBO}}: {{A Modular Framework}} for {{Model}}-{{Based Optimization}} of {{Expensive Black}}-{{Box Functions}}},
  url = {},
  shorttitle = {{{mlrMBO}}},
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1703.03373},
  primaryClass = {stat},
  author = {Bischl, Bernd and Richter, Jakob and Bossek, Jakob and Horn, Daniel and Thomas, Janek and Lang, Michel},
  date = {2017-03-09},
}

Some parts of the package were created as part of other publications. If you use these parts, please cite the relevant work appropriately:


mlrMBO 1.1.2

  • Adaptive infill criteria: infill criteria now have to support a progress argument, and termination criteria can now supply a progress return value.
  • Fix for parEGO + EI (Issue #407)
  • save.on.disk can now take arbitrary numeric vectors specifying the iterations at which to save to disk.
  • Misspelled infill criterion names are now caught. (Issue #417)

mlrMBO 1.1.1

  • makeMBOControl() now has an on.surrogate.error argument, which enables random proposals if the surrogate model fails.
  • With initSMBO(), updateSMBO() and finalizeSMBO() it is now possible to perform human-in-the-loop MBO.
  • The result now contains the final.opt.state.
  • Plot method for OptState objects.
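The human-in-the-loop interface mentioned above can be sketched roughly as follows; the call signatures are assumptions based on the changelog entries and may differ between versions, and the quadratic objective is purely illustrative:

```r
library(mlrMBO)
library(ParamHelpers)

# Sketch of human-in-the-loop MBO: mlrMBO proposes points, the user evaluates them
ps = makeParamSet(makeNumericParam("x", lower = -5, upper = 5))
des = generateDesign(n = 5L, par.set = ps)
des$y = des$x^2  # evaluate the initial design by hand (toy objective: x^2)

ctrl = makeMBOControl()
ctrl = setMBOControlInfill(ctrl, crit = crit.ei)

opt.state = initSMBO(par.set = ps, design = des, control = ctrl,
                     minimize = TRUE, noisy = FALSE)

for (i in 1:5) {
  prop = proposePoints(opt.state)      # ask for the next candidate
  x = prop$prop.points
  y = x$x^2                            # evaluate it outside of mlrMBO
  updateSMBO(opt.state, x = x, y = y)  # feed the result back
}
res = finalizeSMBO(opt.state)
```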

mlrMBO 1.1.0

  • Fixed bug in focus search that affected discrete search spaces.
  • Numerics are now automatically converted to integers where integers are expected.
  • Package can now be cited with citation("mlrMBO").

mlrMBO 1.0.0

  • Initial CRAN release

Reference manual

The reference manual is available as a PDF on CRAN.


Current CRAN version: 1.1.5, by Jakob Richter

Authors: Bernd Bischl [aut] , Jakob Richter [aut, cre] , Jakob Bossek [aut] , Daniel Horn [aut] , Michel Lang [aut] , Janek Thomas [aut]

Documentation:   PDF Manual  

Task views: Optimization and Mathematical Programming

License: BSD_2_clause + file LICENSE

Imports backports, BBmisc, checkmate, data.table, lhs, parallelMap

Depends on mlr, ParamHelpers, smoof

Suggests akima, cmaesr, ggplot2, DiceKriging, earth, emoa, GGally, gridExtra, kernlab, kknn, knitr, mco, nnet, party, randomForest, reshape2, rmarkdown, rgenoud, rpart, testthat, covr

Imported by varycoef.

Depended on by tramnet, tuneRanger.

Suggested by ChemoSpec2D, mlr, mlrintermbo, mosmafs.
