
Found 2763 packages.

cmce — by Matthew T. Pratola, 8 years ago

Computer Model Calibration for Deterministic and Stochastic Simulators

Implements the Bayesian calibration model described in Pratola and Chkrebtii (2018) for stochastic and deterministic simulators. Additive and multiplicative discrepancy models are currently supported. See <http://www.matthewpratola.com/software> for more information and examples.

modelSelection — by David Rossell, 2 months ago

High-Dimensional Model Selection

Model selection and averaging for regression, generalized linear models, generalized additive models, graphical models and mixtures, with a focus on Bayesian model selection and information criteria (Bayesian information criterion, etc.). See Rossell (2025), linked in the URL field below, for a hands-on book describing the methods, with examples and suggested citations if you use the package.
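
A hedged sketch of basic usage, assuming the standalone package keeps the modelSelection()/postProb()/coef() interface of its 'mombf' predecessor; the function and argument names below are assumptions, not verified against this release.

    # Hedged sketch: Bayesian variable selection on simulated data, assuming a
    # formula interface as in 'mombf' (unverified for this standalone package).
    library(modelSelection)

    set.seed(1)
    n <- 100
    x <- matrix(rnorm(n * 5), n, 5)
    colnames(x) <- paste0("x", 1:5)
    y <- 1 + 0.5 * x[, 1] - x[, 3] + rnorm(n)
    df <- data.frame(y = y, x)

    fit <- modelSelection(y ~ ., data = df)  # MCMC over models (assumed entry point)
    postProb(fit)                            # posterior model probabilities (assumed)
    coef(fit)                                # model-averaged estimates (assumed)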

cIRT — by James Joseph Balamuta, 3 months ago

Choice Item Response Theory

Jointly models the accuracy of cognitive responses and item choices within a Bayesian hierarchical framework, as described by Culpepper and Balamuta (2015). In addition, the package contains the datasets used in the paper's analysis.

MLModelSelection — by Kuo-Jung Lee, 6 years ago

Model Selection in Multivariate Longitudinal Data Analysis

An efficient Gibbs sampling algorithm for Bayesian multivariate longitudinal data analysis, with a focus on selecting important elements of the generalized autoregressive matrix. The package provides posterior samples and parameter estimates. In addition, estimates of several information criteria, such as the Akaike information criterion (AIC), Bayesian information criterion (BIC) and deviance information criterion (DIC), and of prediction accuracy measures, such as the marginal predictive likelihood (MPL) and the mean squared prediction error (MSPE), are provided for model selection.

UNCOVER — by Samuel Emerson, 2 years ago

Utilising Normalisation Constant Optimisation via Edge Removal (UNCOVER)

Models data with a suspected clustering structure (in covariate space, regression space or both) using a Bayesian product model with a logistic regression likelihood. Observations are represented graphically and clusters are formed through various edge removals or additions. Cluster quality is assessed through the log Bayesian evidence of the overall model, which is estimated using either a Sequential Monte Carlo sampler or a suitable transformation of the Bayesian information criterion as a fast approximation of the former. The internal Iterated Batch Importance Sampling scheme (Chopin (2002)) is made available as a free-standing function.

CBnetworkMA — by Garritt L. Page, 2 years ago

Contrast-Based Bayesian Network Meta Analysis

A function that facilitates fitting three types of models for contrast-based Bayesian network meta-analysis. The first model is the one described in Lu and Ades (2006). The other two are based on Bayesian nonparametric methods that permit ties when comparing treatments, or allow a treatment effect to be exactly zero. In addition to the model fits, the package provides a summary of the interplay between treatment effects based on the procedure described in Barrientos, Page, and Lin (2023).

bayesplot — by Jonah Gabry, 18 days ago

Plotting for Bayesian Models

Plotting functions for posterior analysis, MCMC diagnostics, prior and posterior predictive checks, and other visualizations to support the applied Bayesian workflow advocated in Gabry, Simpson, Vehtari, Betancourt, and Gelman (2019). The package is designed not only to provide convenient functionality for users, but also a common set of functions that can be easily used by developers working on a variety of R packages for Bayesian modeling, particularly (but not exclusively) packages interfacing with 'Stan'.
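
A minimal sketch of the intended workflow, using the example draws bundled with the package so that no model fit is required; example_mcmc_draws(), example_y_data(), example_yrep_draws(), mcmc_trace(), mcmc_areas() and ppc_dens_overlay() are exported by bayesplot.

    # Minimal sketch using bayesplot's bundled example draws (no model fit needed).
    library(bayesplot)

    color_scheme_set("brightblue")

    draws <- example_mcmc_draws()     # iterations x chains x parameters array
    mcmc_trace(draws)                 # trace plots for MCMC diagnostics
    mcmc_areas(draws)                 # posterior densities with credible intervals

    # Posterior predictive check: observed outcome vs. replicated datasets.
    y    <- example_y_data()
    yrep <- example_yrep_draws()
    ppc_dens_overlay(y, yrep[1:50, ])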

pqrBayes — by Cen Wu, 22 days ago

Bayesian Penalized Quantile Regression

Bayesian regularized quantile regression using two major classes of shrinkage priors (spike-and-slab priors and the horseshoe family of priors) for efficient Bayesian shrinkage estimation, variable selection and valid statistical inference. The package implements robust Bayesian variable selection with spike-and-slab priors under high-dimensional linear regression models (Fan et al. (2024) and Ren et al. (2023)) and regularized quantile varying coefficient models (Zhou et al. (2023)). In particular, robust Bayesian inference under both models remains valid on finite samples in the presence of heavy-tailed errors. Additional models with spike-and-slab priors include robust Bayesian group LASSO and robust binary Bayesian LASSO (Fan and Wu (2025)). Robust sparse Bayesian regression with the horseshoe family of priors (horseshoe, horseshoe+ and regularized horseshoe) is also implemented and yields valid inference under heavy-tailed model errors (Fan et al. (2025)). The Markov chain Monte Carlo (MCMC) algorithms for the proposed and alternative models are implemented in C++.

BAS — by Merlise Clyde, 7 days ago

Bayesian Variable Selection and Model Averaging using Bayesian Adaptive Sampling

Package for Bayesian Variable Selection and Model Averaging in linear models and generalized linear models using stochastic or deterministic sampling without replacement from posterior distributions. Prior distributions on coefficients are from Zellner's g-prior or mixtures of g-priors corresponding to the Zellner-Siow Cauchy priors or the mixture of g-priors from Liang et al (2008) for linear models, or mixtures of g-priors from Li and Clyde (2019) for generalized linear models. Other model selection criteria include AIC, BIC and Empirical Bayes estimates of g. Sampling probabilities may be updated based on the sampled models using sampling without replacement or an efficient MCMC algorithm that samples models using a tree structure of the model space as an efficient hash table. See Clyde, Ghosh and Littman (2010) for details on the sampling algorithms. Uniform priors over all models or beta-binomial prior distributions on model size are allowed, and for large p, truncated priors on the model space may be used to enforce sampling models that are full rank. The user may force variables to always be included, in addition to imposing constraints that higher-order interactions are included only if their parents are included in the model. This material is based upon work supported by the National Science Foundation under Division of Mathematical Sciences grant 1106891. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
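
As a rough illustration (mtcars is used here purely for convenience), a model-averaging run with the Zellner-Siow prior and a uniform prior over models might look like the sketch below; bas.lm(), uniform() and the summary/coef/image methods are documented in the package.

    # Sketch: Bayesian model averaging for a linear model with BAS.
    library(BAS)

    fit <- bas.lm(mpg ~ ., data = mtcars,
                  prior = "ZS-null",       # Zellner-Siow Cauchy prior on coefficients
                  modelprior = uniform(),  # uniform prior over the model space
                  method = "BAS")          # adaptive sampling without replacement

    summary(fit)   # top models and their posterior probabilities
    coef(fit)      # model-averaged posterior means, SDs and inclusion probabilities
    image(fit)     # which variables enter the highest-probability models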

ordgam — by Philippe Lambert, 2 years ago

Additive Model for Ordinal Data using Laplace P-Splines

Additive proportional odds model for ordinal data using Laplace P-splines. The combination of Laplace approximations and P-splines enables fast and flexible inference in a Bayesian framework. Specific approximations are proposed to account for the asymmetry in the marginal posterior distributions of non-penalized parameters. For more details, see Lambert and Gressani (2023).