Computer Model Calibration for Deterministic and Stochastic Simulators
Implements the Bayesian calibration model described
in Pratola and Chkrebtii (2018).
High-Dimensional Model Selection
Model selection and averaging for regression, generalized linear models, generalized additive models, graphical models and mixtures, focusing on Bayesian model selection and information criteria (e.g., the Bayesian information criterion). See Rossell (2025).
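Since the entry above carries no code, here is a minimal, package-agnostic base-R sketch of information-criterion-based model selection, comparing nested linear models by BIC; the data and candidate models are invented for the example:

# Compare nested candidate models by BIC; smaller BIC is preferred.
set.seed(1)
n  <- 200
x1 <- rnorm(n); x2 <- rnorm(n); x3 <- rnorm(n)
y  <- 1 + 2 * x1 + rnorm(n)        # only x1 is truly relevant

fits <- list(
  m1 = lm(y ~ x1),
  m2 = lm(y ~ x1 + x2),
  m3 = lm(y ~ x1 + x2 + x3)
)
sapply(fits, BIC)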
Choice Item Response Theory
Jointly models the accuracy of cognitive responses and item choices
within a Bayesian hierarchical framework, as described by Culpepper and
Balamuta (2015).
Model Selection in Multivariate Longitudinal Data Analysis
An efficient Gibbs sampling algorithm for Bayesian multivariate longitudinal data analysis, with a focus on selecting important elements of the generalized autoregressive matrix. The package provides posterior samples and parameter estimates. In addition, it reports several information criteria, such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), and the deviance information criterion (DIC), as well as prediction-accuracy measures such as the marginal predictive likelihood (MPL) and the mean squared prediction error (MSPE), to support model selection.
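To illustrate how one of the reported criteria is formed, the following minimal base-R sketch computes the DIC from posterior draws of a toy normal-mean model; conjugate draws stand in for the package's own Gibbs output, and all names and numbers are invented for the example:

# DIC = Dbar + pD, where Dbar is the posterior mean deviance and
# pD = Dbar - D(posterior mean) is the effective number of parameters.
set.seed(1)
y <- rnorm(50, mean = 2, sd = 1)              # toy data, known sd = 1
n <- length(y)

post_var  <- 1 / (n + 1 / 100)                # conjugate posterior under a N(0, 10^2) prior
post_mean <- post_var * sum(y)
mu_draws  <- rnorm(4000, post_mean, sqrt(post_var))

deviance_fun <- function(mu) -2 * sum(dnorm(y, mu, 1, log = TRUE))
D_bar <- mean(sapply(mu_draws, deviance_fun)) # posterior mean deviance
D_hat <- deviance_fun(mean(mu_draws))         # deviance at the posterior mean
p_D   <- D_bar - D_hat
c(pD = p_D, DIC = D_bar + p_D)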
Utilising Normalisation Constant Optimisation via Edge Removal (UNCOVER)
Model data with a suspected clustering structure (either in
covariate space, regression space, or both) using a Bayesian product model
with a logistic regression likelihood. Observations are represented
graphically and clusters are formed through various edge removals or
additions. Cluster quality is assessed through the log Bayesian evidence of
the overall model, which is estimated using either a Sequential Monte Carlo
sampler or a suitable transformation of the Bayesian Information Criterion
as a fast approximation of the former. The internal Sequential Monte Carlo
sampler follows the Iterated Batch Importance Sampling scheme of Chopin (2002).
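The BIC shortcut mentioned above rests on the standard Schwarz approximation log p(y | M) ≈ -BIC/2; the minimal base-R sketch below applies it to a logistic regression fit (data and covariates are invented, and this is not the UNCOVER implementation itself):

set.seed(1)
n <- 300
x <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + 1.5 * x))

fit <- glm(y ~ x, family = binomial)   # logistic regression likelihood
-BIC(fit) / 2                          # fast approximation to the log Bayesian evidence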
Contrast-Based Bayesian Network Meta Analysis
A function that facilitates fitting three types of models
for contrast-based Bayesian network meta-analysis. The first model is
the one described in Lu and Ades (2006).
Plotting for Bayesian Models
Plotting functions for posterior analysis, MCMC diagnostics,
prior and posterior predictive checks, and other visualizations
to support the applied Bayesian workflow advocated in
Gabry, Simpson, Vehtari, Betancourt, and Gelman (2019).
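A minimal sketch using the bayesplot package itself (assuming it is installed); example_mcmc_draws() ships with the package and stands in for real MCMC output:

library(bayesplot)

draws <- example_mcmc_draws(chains = 4, params = 4)
mcmc_trace(draws)                 # trace plots for MCMC diagnostics
mcmc_areas(draws, prob = 0.8)     # posterior densities with 80% intervals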
Bayesian Penalized Quantile Regression
Bayesian regularized quantile regression with two major classes of shrinkage priors
(spike-and-slab priors and the horseshoe family of priors) provides efficient Bayesian
shrinkage estimation, variable selection, and valid statistical inference. The package
implements robust Bayesian variable selection with spike-and-slab priors under
high-dimensional linear regression models (Fan et al. (2024)).
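As background for the quantile part of the model, here is a minimal base-R sketch of the check (pinball) loss that quantile regression minimizes; the Bayesian formulation typically works with the equivalent asymmetric Laplace likelihood, and the data below are invented for the example:

check_loss <- function(u, tau) u * (tau - (u < 0))   # rho_tau(u)

set.seed(1)
y   <- rnorm(100)
tau <- 0.9
obj <- function(q) mean(check_loss(y - q, tau))
optimize(obj, range(y))$minimum   # minimizer is close to...
quantile(y, tau)                  # ...the empirical 0.9 quantile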
Bayesian Variable Selection and Model Averaging using Bayesian Adaptive Sampling
Package for Bayesian Variable Selection and Model Averaging
in linear models and generalized linear models using stochastic or
deterministic sampling without replacement from posterior
distributions. Prior distributions on coefficients are taken
from Zellner's g-prior or mixtures of g-priors
corresponding to the Zellner-Siow Cauchy priors or the
mixture of g-priors from Liang et al. (2008).
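A minimal sketch with the BAS package (assuming it is installed and that these argument names match the installed version), using the built-in mtcars data; prior = "ZS-null" corresponds to the Zellner-Siow Cauchy prior mentioned above:

library(BAS)

fit <- bas.lm(mpg ~ ., data = mtcars,
              prior = "ZS-null",        # Zellner-Siow Cauchy prior on coefficients
              modelprior = uniform(),   # uniform prior over the model space
              method = "BAS")           # deterministic sampling without replacement
summary(fit)                            # top models, marginal inclusion probabilities
coef(fit)                               # model-averaged coefficient estimates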
Additive Model for Ordinal Data using Laplace P-Splines
Additive proportional odds model for ordinal data using Laplace P-splines. The combination of Laplace approximations and P-splines enables fast and flexible inference in a Bayesian framework. Specific approximations are proposed to account for the asymmetry in the marginal posterior distributions of non-penalized parameters. For more details, see Lambert and Gressani (2023).
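For orientation, here is a minimal sketch of the underlying proportional odds model class, fitted with MASS::polr() on the housing data that ships with MASS; this is a frequentist baseline, not the Laplace P-spline approach described above:

library(MASS)

fit <- polr(Sat ~ Infl + Type + Cont, weights = Freq,
            data = housing, Hess = TRUE)   # cumulative-logit (proportional odds) fit
summary(fit)                               # coefficients and intercepts (cutpoints)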