Bayesian Additive Regression Trees
Bayesian Additive Regression Trees (BART) provide flexible nonparametric modeling of covariates for continuous, binary, categorical, and time-to-event outcomes. For more information, see Sparapani, Spanbauer, and McCulloch.
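As a quick illustration of the interface this package exposes, here is a minimal sketch using the 'BART' package's wbart() function for a continuous outcome. This assumes the package is installed from CRAN; treat it as a sketch of typical usage, not canonical documentation.

```r
# Minimal BART sketch (assumes install.packages("BART") has been run).
library(BART)

set.seed(42)
n <- 200
x <- matrix(runif(n * 2), ncol = 2)                    # two continuous covariates
y <- sin(2 * pi * x[, 1]) + x[, 2] + rnorm(n, sd = 0.3)

# wbart() fits BART for a continuous outcome; the posterior draws of f(x)
# at the training points are summarized in fit$yhat.train.mean.
fit <- wbart(x.train = x, y.train = y)
head(fit$yhat.train.mean)
```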
Discrete Bayesian Additive Regression Trees Sampler
Fits Bayesian additive regression trees (BART; Chipman, George, and McCulloch (2010)).
Bayesian Additive Regression Trees
An advanced implementation of Bayesian Additive Regression Trees with expanded features for data analysis and visualization.
Bayesian Additive Regression Trees
This is an implementation of BART: Bayesian Additive Regression Trees, by Chipman, George, and McCulloch (2010).
Bayesian Additive Models for Location, Scale, and Shape (and Beyond)
Infrastructure for estimating probabilistic distributional regression models in a Bayesian framework.
The distribution parameters may capture location, scale, shape, etc. and every parameter may depend
on complex additive terms (fixed, random, smooth, spatial, etc.) similar to a generalized additive model.
The conceptual and computational framework is introduced in Umlauf, Klein, and Zeileis (2019).
Causal Inference using Bayesian Additive Regression Trees
Contains a variety of methods to generate typical causal inference estimates using Bayesian Additive Regression Trees (BART) as the underlying regression model (Hill (2012)).
Bayesian Additive Regression Kernels
Bayesian Additive Regression Kernels (BARK) provides
an implementation of non-parametric function estimation using Levy
Random Field priors for functions that may be represented as a
sum of additive multivariate kernels. Kernels are located at
every data point, as in Support Vector Machines; however, coefficients
may be heavily shrunk toward zero under the Cauchy process prior, or even
set to zero. The number of active features is controlled by priors on
precision parameters within the kernels, permitting feature selection. For
more details see Ouyang, Z. (2008), "Bayesian Additive Regression Kernels",
PhD dissertation, Duke University, Chapter 3, and Wolpert, R. L., Clyde, M. A.,
and Tu, C. (2011), "Stochastic Expansions with Continuous Dictionaries:
Levy Adaptive Regression Kernels", Annals of Statistics, 39, 1916-1962.
Mixed GAM Computation Vehicle with Automatic Smoothness Estimation
Generalized additive (mixed) models, some of their extensions and
other generalized ridge regression with multiple smoothing
parameter estimation by (Restricted) Marginal Likelihood,
Generalized Cross Validation and similar, or using iterated
nested Laplace approximation for fully Bayesian inference. See
Wood (2017).
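A minimal example of the smooth-model interface this description refers to, using gam() with REML smoothness selection (mgcv ships with standard R installations, so this should run as-is):

```r
library(mgcv)  # recommended package, bundled with R

set.seed(1)
x <- runif(200)
y <- sin(2 * pi * x) + rnorm(200, sd = 0.2)

# s(x) requests a smooth term; method = "REML" selects the smoothing
# parameter by restricted marginal likelihood, as described above.
fit <- gam(y ~ s(x), method = "REML")
summary(fit)$r.sq  # adjusted R-squared of the smooth fit
```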
Bayesian Additive Regression Trees for Confounder Selection
Fit Bayesian Additive Regression Trees (BART) models to
select true confounders from a large set of potential confounders and
to estimate the average treatment effect. For more information, see Kim et
al. (2023).
Variable Selection Using Bayesian Additive Regression Trees
Bayesian additive regression trees (BART) provide flexible non-parametric modeling of mixed-type predictors for continuous and binary responses. This package is built upon the CRAN R package 'BART', version 2.7 (<https://github.com/cran/BART>). It implements the three variable selection approaches proposed in Luo, C. and Daniels, M. J. (2021), "Variable Selection Using Bayesian Additive Regression Trees."