

BART — by Rodney Sparapani, 5 months ago

Bayesian Additive Regression Trees

Bayesian Additive Regression Trees (BART) provide flexible nonparametric modeling of covariates for continuous, binary, categorical and time-to-event outcomes. For more information see Sparapani, Spanbauer and McCulloch.
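
A minimal usage sketch with simulated data (illustrative only, not from the package's documentation); wbart() handles continuous outcomes and pbart() binary ones:

    ## Continuous-outcome BART on simulated data.
    library(BART)

    set.seed(1)
    n <- 200
    x <- matrix(runif(n * 5), n, 5)
    y <- sin(pi * x[, 1]) + x[, 2]^2 + rnorm(n, sd = 0.3)

    fit <- wbart(x.train = x, y.train = y)   # posterior draws of f(x) at the training points
    head(fit$yhat.train.mean)                # posterior mean of f(x_i)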

dbarts — by Vincent Dorie, 7 months ago

Discrete Bayesian Additive Regression Trees Sampler

Fits Bayesian additive regression trees (BART; Chipman, George, and McCulloch (2010)) while allowing the updating of predictors or response so that BART can be incorporated as a conditional model in a Gibbs/Metropolis-Hastings sampler. Also serves as a drop-in replacement for package 'BayesTree'.
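
As an illustrative sketch (simulated data, not the package's own example), keeptrees = TRUE retains the fitted trees so the model can predict at new covariate values afterwards:

    ## Fit with dbarts, then predict at new points.
    library(dbarts)

    set.seed(1)
    x <- matrix(runif(200 * 3), 200, 3)
    y <- 2 * x[, 1] - x[, 2] + rnorm(200, sd = 0.2)

    fit  <- bart(x.train = x, y.train = y, keeptrees = TRUE, verbose = FALSE)
    pred <- predict(fit, newdata = matrix(runif(30), 10, 3))  # rows = posterior draws, columns = new points
    dim(pred)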

bartMachine — by Adam Kapelner, a year ago

Bayesian Additive Regression Trees

An advanced implementation of Bayesian Additive Regression Trees with expanded features for data analysis and visualization.
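
Because bartMachine runs on the JVM through 'rJava', the usual pattern is to enlarge the Java heap before loading the package; a minimal, illustrative sketch:

    ## bartMachine on simulated data (requires a working Java installation).
    options(java.parameters = "-Xmx2g")  # set JVM memory before loading the package
    library(bartMachine)

    set.seed(1)
    X <- data.frame(x1 = runif(200), x2 = runif(200))
    y <- 3 * X$x1 + rnorm(200, sd = 0.3)

    bm <- bartMachine(X, y)   # X must be a data.frame; y numeric (regression) or a factor (classification)
    summary(bm)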

BayesTree — by Robert McCulloch, 10 months ago

Bayesian Additive Regression Trees

This is an implementation of BART: Bayesian Additive Regression Trees, by Chipman, George, and McCulloch (2010).

bamlss — by Nikolaus Umlauf, a month ago

Bayesian Additive Models for Location, Scale, and Shape (and Beyond)

Infrastructure for estimating probabilistic distributional regression models in a Bayesian framework. The distribution parameters may capture location, scale, shape, etc. and every parameter may depend on complex additive terms (fixed, random, smooth, spatial, etc.) similar to a generalized additive model. The conceptual and computational framework is introduced in Umlauf, Klein, Zeileis (2019) and the R package in Umlauf, Klein, Simon, Zeileis (2021).
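
To illustrate the distributional idea (a sketch with simulated data, not from the package manual), both the mean and the standard deviation of a Gaussian response can be given their own additive predictor:

    ## bamlss: location and scale each depend smoothly on x.
    library(bamlss)

    set.seed(1)
    d <- data.frame(x = runif(500))
    d$y <- sin(2 * pi * d$x) + rnorm(500, sd = exp(-1 + d$x))

    f <- list(y ~ s(x),       # predictor for the mean (mu)
              sigma ~ s(x))   # predictor for the standard deviation (sigma)
    b <- bamlss(f, family = "gaussian", data = d)
    summary(b)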

bartCause — by Vincent Dorie, 2 months ago

Causal Inference using Bayesian Additive Regression Trees

Contains a variety of methods to generate typical causal inference estimates using Bayesian Additive Regression Trees (BART) as the underlying regression model (Hill (2012)).
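
A hedged sketch of the main entry point, bartc(), on simulated data with one confounder and a known treatment effect:

    ## bartCause: BART-based estimate of the average treatment effect.
    library(bartCause)

    set.seed(1)
    n <- 500
    x <- rnorm(n)
    z <- rbinom(n, 1, plogis(0.5 * x))   # treatment assignment depends on the confounder
    y <- 1 + x + 0.8 * z + rnorm(n)      # true ATE is 0.8

    fit <- bartc(response = y, treatment = z, confounders = x)
    summary(fit)                         # posterior summary of the treatment effect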

bark — by Merlise Clyde, 2 months ago

Bayesian Additive Regression Kernels

Bayesian Additive Regression Kernels (BARK) provides an implementation of non-parametric function estimation using Levy Random Field priors for functions that may be represented as a sum of additive multivariate kernels. Kernels are located at every data point, as in Support Vector Machines; however, coefficients may be heavily shrunk to zero under the Cauchy process prior, or even set to zero. The number of active features is controlled by priors on precision parameters within the kernels, permitting feature selection. For more details see Ouyang, Z. (2008), "Bayesian Additive Regression Kernels", PhD dissertation, Duke University, Chapter 3, and Wolpert, R. L., Clyde, M. A., and Tu, C. (2011), "Stochastic Expansions with Continuous Dictionaries: Levy Adaptive Regression Kernels", Annals of Statistics, 39, 1916-1962.

mgcv — by Simon Wood, a year ago

Mixed GAM Computation Vehicle with Automatic Smoothness Estimation

Generalized additive (mixed) models, some of their extensions and other generalized ridge regression with multiple smoothing parameter estimation by (Restricted) Marginal Likelihood, Generalized Cross Validation and similar, or using iterated nested Laplace approximation for fully Bayesian inference. See Wood (2017) for an overview. Includes a gam() function, a wide variety of smoothers, 'JAGS' support and distributions beyond the exponential family.
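
For comparison with the BART-style packages above, a minimal gam() call with REML smoothness selection, using mgcv's built-in simulator (illustrative only):

    ## mgcv: additive model with several smooth terms.
    library(mgcv)

    set.seed(1)
    dat <- gamSim(1, n = 400, verbose = FALSE)   # standard mgcv test data
    fit <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat, method = "REML")
    summary(fit)
    plot(fit, pages = 1)                         # estimated smooth functions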

bartcs — by Yeonghoon Yoo, 7 months ago

Bayesian Additive Regression Trees for Confounder Selection

Fit Bayesian Additive Regression Trees (BART) models to select true confounders from a large set of potential confounders and to estimate the average treatment effect. For more information, see Kim et al. (2023).
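
A rough, hypothetical sketch only: the fitter below and its argument names (Y, trt, X) are assumptions about the package's interface, so check the package help (e.g. ?single_bart) before relying on them:

    ## bartcs (assumed interface): confounder selection plus treatment-effect estimation.
    library(bartcs)

    set.seed(1)
    n <- 300
    X <- matrix(rnorm(n * 10), n, 10)            # 10 potential confounders
    trt <- rbinom(n, 1, plogis(X[, 1]))          # treatment depends on the first covariate
    Y <- X[, 1] + 0.5 * trt + rnorm(n)           # outcome; true effect 0.5

    fit <- single_bart(Y = Y, trt = trt, X = X)  # argument names assumed, not verified
    summary(fit)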

BartMixVs — by Chuji Luo, 3 years ago

Variable Selection Using Bayesian Additive Regression Trees

Bayesian additive regression trees (BART) provide flexible non-parametric modeling of mixed-type predictors for continuous and binary responses. This package is built upon the CRAN R package 'BART', version 2.7 (<https://github.com/cran/BART>). It implements the three variable selection approaches proposed in Luo, C. and Daniels, M. J. (2021), "Variable Selection Using Bayesian Additive Regression Trees", as well as three other existing BART-based variable selection approaches.