Nonparametric Failure Time Bayesian Additive Regression Trees
Nonparametric Failure Time (NFT) Bayesian Additive Regression Trees (BART): Time-to-event Machine Learning with Heteroskedastic Bayesian Additive Regression Trees (HBART) and Low Information Omnibus (LIO) Dirichlet Process Mixtures (DPM). An NFT BART model is of the form Y = mu + f(x) + sd(x) E, where the functions f and sd have BART and HBART priors, respectively, while E is a nonparametric error distribution arising from a DPM LIO prior hierarchy. See the cited reference for a description of the model.
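As a minimal illustration of this model form (not the package's API), the following base-R sketch simulates data of the form Y = mu + f(x) + sd(x) E; the particular f, sd, and two-component error mixture are invented for illustration, with the mixture standing in for a DPM draw.

```r
# Simulate from Y = mu + f(x) + sd(x) * E (illustrative choices throughout)
set.seed(1)
n  <- 500
x  <- runif(n, 0, 2)
mu <- 1.5                               # grand intercept
f  <- function(x) sin(2 * pi * x)       # nonlinear mean component
sd_fun <- function(x) 0.2 + 0.3 * x     # heteroskedastic scale
# E drawn from a mean-zero two-component normal mixture, standing in for the DPM error
E  <- ifelse(runif(n) < 0.7, rnorm(n, -0.3, 0.5), rnorm(n, 0.7, 1.2))
y  <- mu + f(x) + sd_fun(x) * E
plot(x, y, pch = 20, main = "Draws from Y = mu + f(x) + sd(x) E")
```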
Co-Data Learning for Bayesian Additive Regression Trees
Estimate prior variable weights for Bayesian Additive Regression
Trees (BART). These weights correspond to the probabilities of the variables
being selected in the splitting rules of the sum-of-trees.
Weights are estimated using empirical Bayes and external information on
the explanatory variables (co-data).
BART models are fitted using the 'dbarts' 'R' package.
See Goedhart and others (2023).
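A baseline 'dbarts' fit, for orientation, is sketched below; the co-data weights this package estimates would adjust the prior variable-selection probabilities on top of a fit like this, and that step is not shown here.

```r
# Baseline BART fit with dbarts on simulated data
library(dbarts)

set.seed(2)
n <- 200; p <- 5
x <- matrix(rnorm(n * p), n, p)
y <- x[, 1] - 2 * x[, 2] + rnorm(n)

fit <- bart(x.train = x, y.train = y, keeptrees = TRUE, verbose = FALSE)
# Posterior-mean predictions at the training points
head(colMeans(fit$yhat.train))
```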
Bayesian Additive Regression Trees using Bayesian Model Averaging
"BART-BMA Bayesian Additive Regression Trees using Bayesian Model Averaging" (Hernandez B, Raftery A.E., Parnell A.C. (2018)
Bayesian Additive Regression Trees with Stan-Sampled Parametric Extensions
Fits semiparametric linear and multilevel models with non-parametric additive Bayesian additive regression tree (BART; Chipman, George, and McCulloch, 2010) components.
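A hedged sketch of such a semiparametric fit follows, assuming a formula interface in which the BART component is wrapped in bart() alongside parametric and multilevel terms; the exact interface is an assumption, so consult the package documentation.

```r
# Hypothetical sketch: nonparametric BART component plus a parametric slope
# and a group-level random intercept. The bart() formula wrapper is an
# assumption about the package's interface.
library(stan4bart)

set.seed(3)
d <- data.frame(
  x = rnorm(300),
  z = rnorm(300),
  g = factor(sample(1:10, 300, replace = TRUE))
)
d$y <- sin(d$x) + 0.5 * d$z + rnorm(10)[d$g] + rnorm(300, sd = 0.3)

fit <- stan4bart(y ~ bart(x) + z + (1 | g), data = d)
summary(fitted(fit))
```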
Iterative Bayesian Additive Regression Trees Descriptor Selection Method
A statistical method based on Bayesian Additive Regression Trees with Global
Standard Error Permutation Test (BART-G.SE) for descriptor selection
and symbolic regression. It finds the symbolic formula of the regression function
y = f(x), as described in Ye, Senftle, and Li (2023).
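The permutation-null idea behind this kind of selection can be sketched with the 'BART' package's wbart() as follows; this simplified stand-in compares posterior variable-inclusion proportions to their distribution under permuted responses and is not the BART-G.SE implementation itself.

```r
# Simplified permutation-based descriptor selection with BART
library(BART)

set.seed(4)
n <- 200; p <- 8
x <- matrix(rnorm(n * p), n, p)
y <- 2 * x[, 1] + x[, 3]^2 + rnorm(n)

incl_prop <- function(x, y) {
  fit <- wbart(x.train = x, y.train = y, ndpost = 200, nskip = 100)
  counts <- colMeans(fit$varcount)   # average split count per variable
  counts / sum(counts)               # normalize to inclusion proportions
}

obs  <- incl_prop(x, y)
null <- replicate(5, incl_prop(x, sample(y)))  # few permutations, for speed
thr  <- max(null)                              # a crude global threshold
which(obs > thr)                               # selected descriptors
```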
Fit Varying Coefficient Models with Bayesian Additive Regression Trees
Fits linear varying coefficient (VC) models, which assert a linear relationship between an outcome and several covariates but allow that relationship (i.e., the coefficients or slopes in the linear regression) to change as functions of additional variables known as effect modifiers. The coefficient functions are approximated with Bayesian Additive Regression Trees, and a Metropolis-within-Gibbs sampler simulates draws from the posterior over coefficient function evaluations. VC models with independent or repeated observations can be fit. For more details, see Deshpande et al. (2024).
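The following base-R sketch simulates from such a VC model with invented intercept and slope functions, to make the structure concrete; VCBART would estimate these functions with BART priors.

```r
# Illustrative varying coefficient model: the slope of x changes with an
# effect modifier z (beta0 and beta1 are invented for illustration)
set.seed(5)
n <- 400
z <- runif(n)                         # effect modifier
x <- rnorm(n)                         # covariate
beta0 <- function(z) 1 + z            # varying intercept
beta1 <- function(z) sin(2 * pi * z)  # varying slope
y <- beta0(z) + beta1(z) * x + rnorm(n, sd = 0.5)
plot(z, beta1(z), pch = 20, main = "True varying slope beta1(z)")
```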
Mixed GAM Computation Vehicle with Automatic Smoothness Estimation
Generalized additive (mixed) models, some of their extensions and
other generalized ridge regression with multiple smoothing
parameter estimation by (Restricted) Marginal Likelihood,
Cross Validation and similar, or using iterated nested Laplace
approximation for fully Bayesian inference. See Wood (2025).
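A standard example of the package's core interface: a GAM with smooth terms and REML smoothness selection, using the package's built-in gamSim() test-data generator.

```r
# A GAM with REML smoothness estimation in mgcv
library(mgcv)

set.seed(6)
dat <- gamSim(1, n = 300, verbose = FALSE)   # built-in test-data generator
fit <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat, method = "REML")
summary(fit)
```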
Bayesian Applied Regression Modeling via Stan
Estimates previously compiled regression models using the 'rstan' package, which provides the R interface to the Stan C++ library for Bayesian estimation. Users specify models via the customary R syntax with a formula and data.frame plus some additional arguments for priors.
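A typical call, shown below, fits a Gaussian regression on a built-in dataset with explicit priors; the short chains are for illustration only.

```r
# Formula + data.frame interface, with priors as extra arguments
library(rstanarm)

fit <- stan_glm(
  mpg ~ wt + am, data = mtcars,
  family = gaussian(),
  prior = normal(0, 2.5),          # prior on regression coefficients
  prior_intercept = normal(0, 10),
  chains = 2, iter = 1000, refresh = 0
)
print(fit, digits = 2)
```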
Bayesian Variable Selection and Model Choice for Generalized Additive Mixed Models
Bayesian variable selection, model choice, and regularized estimation for (spatial) generalized additive mixed regression models via stochastic search variable selection with spike-and-slab priors.
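A hedged sketch of such a fit, assuming spikeSlabGAM()'s formula interface expands each term into linear and smooth components automatically:

```r
# Stochastic search variable selection for a GAM with spike-and-slab priors
library(spikeSlabGAM)

set.seed(8)
d <- data.frame(x1 = runif(300), x2 = runif(300), x3 = runif(300))
d$y <- sin(2 * pi * d$x1) + d$x2 + rnorm(300, sd = 0.3)

m <- spikeSlabGAM(y ~ x1 + x2 + x3, data = d)
summary(m)   # posterior inclusion probabilities for each model term
```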
Bayesian Generalized Additive Model Selection
Generalized additive model selection via approximate Bayesian inference is provided. Bayesian mixed model-based penalized splines with spike-and-slab-type coefficient prior distributions are used to facilitate fitting and selection. The approximate Bayesian inference engine options are: (1) Markov chain Monte Carlo and (2) mean field variational Bayes. Markov chain Monte Carlo has better Bayesian inferential accuracy, but requires a longer run-time. Mean field variational Bayes is faster, but less accurate. The methodology is described in He and Wand (2024).
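A hedged sketch of a call follows; the argument names y and Xgeneral and the method switch between "MCMC" and "MFVB" are assumptions about the interface, so check the package documentation.

```r
# Hypothetical sketch: select which predictors enter linearly, nonlinearly,
# or not at all; argument names are assumed, not confirmed
library(gamselBayes)

set.seed(9)
n <- 300
X <- matrix(runif(n * 4), n, 4)
y <- X[, 1] + sin(2 * pi * X[, 2]) + rnorm(n, sd = 0.3)

fit <- gamselBayes(y = y, Xgeneral = X, method = "MCMC")
summary(fit)
```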