Light Gradient Boosting Machine

Tree-based algorithms can be improved by introducing boosting frameworks. 'LightGBM' is one such framework, based on Ke, Guolin et al. (2017). This package offers an R interface to work with it. It is designed to be distributed and efficient, with the following advantages:

1. Faster training speed and higher efficiency.
2. Lower memory usage.
3. Better accuracy.
4. Support for parallel learning.
5. Capability of handling large-scale data.

In recognition of these advantages, 'LightGBM' has been widely used in many winning solutions of machine learning competitions. Comparison experiments on public datasets suggest that 'LightGBM' can outperform existing boosting frameworks in both efficiency and accuracy, with significantly lower memory consumption. In addition, parallel experiments suggest that, in certain circumstances, 'LightGBM' can achieve a linear speed-up in training time by using multiple machines.
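As a minimal sketch of the R interface described above, the following fits a gradient-boosted binary classifier on the `agaricus` sample dataset that ships with the package. The specific parameter values (`num_leaves`, `learning_rate`, `nrounds`) are illustrative choices, not recommendations:

```r
library(lightgbm)

# Sample mushroom-classification data bundled with the package
data(agaricus.train, package = "lightgbm")
dtrain <- lgb.Dataset(agaricus.train$data, label = agaricus.train$label)

# Illustrative hyperparameters for a binary objective
params <- list(
  objective = "binary",
  metric = "auc",
  num_leaves = 31L,
  learning_rate = 0.1
)

# Train for a small number of boosting rounds
model <- lgb.train(params = params, data = dtrain, nrounds = 10L)

# Predicted probabilities, one per training row
preds <- predict(model, agaricus.train$data)
```

`lgb.Dataset()` wraps the feature matrix in LightGBM's internal format before training, which is what enables the memory- and speed-related advantages listed above.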


Reference manual



3.3.2 by Yu Shi, 14 days ago


Authors: Yu Shi [aut, cre], Guolin Ke [aut], Damien Soukhavong [aut], James Lamb [aut], Qi Meng [aut], Thomas Finley [aut], Taifeng Wang [aut], Wei Chen [aut], Weidong Ma [aut], Qiwei Ye [aut], Tie-Yan Liu [aut], Nikita Titov [aut], Yachen Yan [ctb], Microsoft Corporation [cph], Dropbox, Inc. [cph], Jay Loden [cph], Dave Daeschler [cph], Giampaolo Rodola [cph], Alberto Ferreira [ctb], Daniel Lemire [ctb], Victor Zverovich [cph], IBM Corporation [ctb], David Cortes [ctb]

Documentation: PDF Manual

Task views: Model Deployment with R

MIT + file LICENSE license

Imports data.table, graphics, jsonlite, Matrix, methods, utils

Depends on R6

Suggests testthat

System requirements: C++11

Suggested by EIX, SHAPforxgboost, fastshap.
