A Simple Implementation and Demonstration of Gradient Boosting

A basic, clear implementation of tree-based gradient boosting designed to illustrate the core operation of boosting models. Tuning parameters (such as stochastic subsampling, a shrunken learning rate, or regularization) are not implemented; the only adjustable parameter is the number of training rounds. If you are looking for a high-performance boosting implementation with tuning parameters, consider the 'xgboost' package.
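
To make the core operation concrete, here is a minimal sketch (not the package's own source) of a squared-error boosting loop built on rpart, which this package depends on: each round fits a regression tree to the current residuals and adds its predictions to the running fit. The function and variable names are illustrative.

library(rpart)

# Minimal boosting loop: assumes a numeric response (squared-error loss).
boost_sketch <- function(formula, data, rounds = 5) {
  y <- model.response(model.frame(formula, data))
  resid_formula <- update(formula, .residual ~ .)  # same predictors, residual response
  pred <- rep(mean(y), nrow(data))                 # initial fit: the mean
  trees <- vector("list", rounds)
  for (i in seq_len(rounds)) {
    data$.residual <- y - pred                     # pseudo-residuals for squared error
    trees[[i]] <- rpart(resid_formula, data = data)
    pred <- pred + predict(trees[[i]], data)       # add the new tree's correction
  }
  list(trees = trees, fitted = pred)
}

Note the absence of a learning rate or subsampling: as in the package, the only knob is the number of rounds.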



install.packages("DidacticBoost")
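
After installation, fitting should reduce to a single call. The sketch below is hedged: the exported function name fitBoosted() and its rounds argument are assumptions based on the package's stated design (a single training-rounds parameter); consult the reference manual for the exact signature.

library(DidacticBoost)

# Hypothetical usage: fit a boosted model on the built-in mtcars data.
fit <- fitBoosted(mpg ~ ., data = mtcars, rounds = 10)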

Version 0.1.1, by David Shaub


Homepage: https://github.com/dashaub/DidacticBoost


Report a bug at https://github.com/dashaub/DidacticBoost/issues


Browse source code at https://github.com/cran/DidacticBoost


Authors: David Shaub [aut, cre]


Documentation: PDF manual


GPL-3 license


Depends on rpart

Suggests testthat


See at CRAN: https://CRAN.R-project.org/package=DidacticBoost