Found 2459 packages in 0.06 seconds
Discrete Bayesian Additive Regression Trees Sampler
Fits Bayesian additive regression trees (BART; Chipman, George, and McCulloch (2010)).
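A minimal usage sketch, assuming this entry is the CRAN package 'dbarts' and its bart() interface; the simulated data are purely illustrative:

    library(dbarts)
    set.seed(1)
    X <- matrix(rnorm(100 * 5), 100, 5)            # simulated predictors
    y <- X[, 1]^2 + rnorm(100)                     # simulated response
    fit <- bart(x.train = X, y.train = y, x.test = X,
                ntree = 200, verbose = FALSE)      # posterior draws of f(x)
    head(colMeans(fit$yhat.test))                  # posterior mean predictions at x.test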
Bayesian Additive Regression Trees
An advanced implementation of Bayesian Additive Regression Trees with expanded features for data analysis and visualization.
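A hedged sketch, assuming this entry is the 'bartMachine' package, which runs on Java via rJava and takes a data.frame of predictors:

    options(java.parameters = "-Xmx2g")            # give the JVM memory before loading rJava
    library(bartMachine)
    set.seed(2)
    X <- data.frame(x1 = rnorm(100), x2 = rnorm(100))  # illustrative simulated data
    y <- X$x1 + rnorm(100)
    bm <- bartMachine(X, y, num_trees = 50)        # predictors must be a data.frame
    preds <- predict(bm, new_data = X)             # posterior-mean predictions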
Bayesian Additive Regression Trees
This is an implementation of BART: Bayesian Additive Regression Trees, by Chipman, George, and McCulloch (2010).
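A short sketch, assuming this entry is the original 'BayesTree' reference implementation and its bart() function:

    library(BayesTree)
    set.seed(42)
    x <- matrix(runif(200 * 3), 200, 3)
    y <- 10 * sin(pi * x[, 1] * x[, 2]) + rnorm(200)   # Friedman-style test function
    fit <- bart(x.train = x, y.train = y, x.test = x, ndpost = 500)
    plot(fit)                                      # sigma draws and fitted values with intervals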
Bayesian Additive Regression Trees
Bayesian Additive Regression Trees (BART) provide flexible nonparametric modeling of covariates for continuous, binary, categorical, and time-to-event outcomes. For more information see Sparapani, Spanbauer, and McCulloch (2021).
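A sketch assuming this entry is the 'BART' package of Sparapani, Spanbauer, and McCulloch; wbart() below handles a continuous outcome, while pbart() and surv.bart() cover binary and time-to-event outcomes:

    library(BART)
    set.seed(99)
    x <- matrix(rnorm(150 * 4), 150, 4)
    y <- x[, 1] + 0.5 * x[, 2]^2 + rnorm(150)      # simulated continuous outcome
    post <- wbart(x.train = x, y.train = y, x.test = x,
                  ndpost = 1000, nskip = 250)
    head(post$yhat.test.mean)                      # posterior mean of f at x.test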
Bayesian Additive Regression Kernels
Bayesian Additive Regression Kernels (BARK) provides
an implementation of non-parametric function estimation using Lévy
random field priors for functions that may be represented as a
sum of additive multivariate kernels. Kernels are located at
every data point, as in Support Vector Machines; however, coefficients
may be heavily shrunk toward zero under the Cauchy process prior, or
even set exactly to zero. The number of active features is controlled
by priors on precision parameters within the kernels, permitting
feature selection. For more details see Ouyang, Z. (2008), "Bayesian
Additive Regression Kernels", PhD dissertation, Duke University,
Chapter 3, and Wolpert, R. L., Clyde, M. A., and Tu, C. (2011),
"Stochastic Expansions with Continuous Dictionaries: Lévy Adaptive
Regression Kernels", Annals of Statistics, 39, 1916-1962.
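The kernel-expansion model class can be made concrete with a toy simulation; this is not the package's interface, only an illustration of a regression function built from kernels placed at the data points with sparse coefficients:

    set.seed(7)
    x <- sort(runif(50))
    K <- outer(x, x, function(a, b) exp(-(a - b)^2 / (2 * 0.1^2)))  # a kernel at every data point
    beta <- numeric(50)
    beta[sample(50, 4)] <- rnorm(4, sd = 2)        # most coefficients set exactly to zero
    f <- drop(K %*% beta)                          # f(x) = sum_j beta_j * K(x, x_j)
    y <- f + rnorm(50, sd = 0.2)
    plot(x, y); lines(x, f, col = 2)               # sparse kernel expansion underlying the data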
Mixed GAM Computation Vehicle with Automatic Smoothness Estimation
Generalized additive (mixed) models, some of their extensions and
other generalized ridge regression with multiple smoothing
parameter estimation by (Restricted) Marginal Likelihood,
Generalized Cross Validation and similar, or using iterated
nested Laplace approximation for fully Bayesian inference. See
Wood (2017) for an overview.
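A short example using gam() from this package ('mgcv'), with smooth terms whose smoothing parameters are estimated by REML:

    library(mgcv)
    set.seed(3)
    dat <- gamSim(eg = 1, n = 200)                 # Gu & Wahba 4-term simulated example
    m <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3),
             data = dat, method = "REML")          # smoothing parameters by REML
    summary(m)                                     # EDF and tests for each smooth term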
Bayesian Additive Regression Trees for Confounder Selection
Fit Bayesian Additive Regression Trees (BART) models to
select true confounders from a large set of potential confounders and
to estimate the average treatment effect. For more information, see
Kim et al. (2023).
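A hypothetical sketch only: the function name and arguments below are guesses based on the description (confounder selection plus ATE estimation), not a documented interface, so consult the package help before use:

    library(bartcs)
    ## Hypothetical call; the real exported functions and arguments may differ:
    # fit <- single_bart(Y = y, trt = trt, X = confounders)
    # summary(fit)   # assumed to report the posterior ATE and confounder inclusion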
Variable Selection Using Bayesian Additive Regression Trees
Bayesian additive regression trees (BART) provide flexible non-parametric modeling of mixed-type predictors for continuous and binary responses. This package is built upon the CRAN R package 'BART', version 2.7 (<https://github.com/cran/BART>). It implements the three variable selection approaches proposed in the paper: Luo, C. and Daniels, M. J. (2021), "Variable Selection Using Bayesian Additive Regression Trees."
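Not this package's API: a generic illustration of BART-based variable selection using posterior variable-inclusion proportions from the underlying 'BART' package it builds on:

    library(BART)
    set.seed(11)
    x <- matrix(rnorm(200 * 10), 200, 10)          # 10 candidate predictors, only 2 relevant
    colnames(x) <- paste0("x", 1:10)
    y <- 2 * x[, 1] - 3 * x[, 2] + rnorm(200)
    post <- wbart(x.train = x, y.train = y, ndpost = 1000, nskip = 250)
    vip <- colMeans(post$varcount / rowSums(post$varcount))  # variable-inclusion proportions
    names(vip) <- colnames(x)
    round(sort(vip, decreasing = TRUE), 3)         # truly relevant predictors should rank highest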
Causal Inference using Bayesian Additive Regression Trees
Contains a variety of methods to generate typical causal inference estimates using Bayesian Additive Regression Trees (BART) as the underlying regression model (Hill (2012)).
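A sketch assuming this entry is the 'bartCause' package and its bartc() interface (response, treatment, confounders), fitted to simulated data with a known treatment effect:

    library(bartCause)
    set.seed(5)
    n <- 300
    X <- matrix(rnorm(n * 3), n, 3)
    z <- rbinom(n, 1, plogis(X[, 1]))              # treatment assignment depends on confounders
    y <- 0.5 * z + X[, 1] + rnorm(n)               # outcome with true ATE = 0.5
    fit <- bartc(response = y, treatment = z, confounders = X, estimand = "ate")
    summary(fit)                                   # posterior summary of the ATE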
Nonparametric Failure Time Bayesian Additive Regression Trees
Nonparametric Failure Time (NFT) Bayesian Additive Regression Trees (BART): Time-to-event Machine Learning with Heteroskedastic Bayesian Additive Regression Trees (HBART) and Low Information Omnibus (LIO) Dirichlet Process Mixtures (DPM). An NFT BART model is of the form Y = mu + f(x) + sd(x)*E, where the functions f and sd have BART and HBART priors, respectively, while E follows a nonparametric error distribution induced by a DPM LIO prior hierarchy. See the package documentation for a complete description of the model.
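A hedged sketch assuming this entry is the 'nftbart' package; the data preparation below (event times plus a censoring indicator) is standard for time-to-event modeling, but the nft() call is an assumed interface, so verify it against ?nft:

    library(nftbart)
    set.seed(21)
    x <- matrix(rnorm(100 * 2), 100, 2)
    t.event <- rexp(100, rate = exp(0.5 * x[, 1])) # simulated event times
    t.cens  <- rexp(100, rate = 0.2)               # simulated censoring times
    times <- pmin(t.event, t.cens)                 # observed follow-up time
    delta <- as.integer(t.event <= t.cens)         # 1 = event observed, 0 = censored
    ## Assumed interface, not verified against the installed documentation:
    # post <- nft(x.train = x, times = times, delta = delta)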