Hypothesis testing, model selection and model averaging are important statistical problems that have in common the explicit consideration of the uncertainty about which model is the true one. The formal Bayesian tool for such problems is the Bayes factor (Kass and Raftery, 1995), which reports the evidence in the data in favor of each of the entertained hypotheses/models and can easily be translated into posterior probabilities.
This package has been specifically conceived to calculate Bayes factors in linear models and thereby to provide a formal Bayesian answer to testing and variable selection problems. On the theoretical side, the emphasis in the package is placed on the prior distributions (a very delicate issue in this context), and BayesVarSel allows a wide range of them: Jeffreys-Zellner-Siow (Jeffreys, 1961; Zellner and Siow, 1980, 1984); Zellner (1986); Fernandez et al. (2001); Liang et al. (2008); and Bayarri et al. (2012).
The stable version can be installed from CRAN using:

install.packages("BayesVarSel")

You can track or download the latest version, or contribute to the development of BayesVarSel, at https://github.com/comodin19/BayesVarSel. To install the most recent version of the package (1.8.0):

1. Install devtools from CRAN with install.packages("devtools").
2. Install the development version of BayesVarSel from GitHub:

devtools::install_github("comodin19/BayesVarSel")
The interaction with the package is through a friendly interface that syntactically mimics the well-known lm command of R. The resulting objects can be easily explored, providing the user with very valuable information about the structure of the true (data-generating) model, such as marginal, joint and conditional inclusion probabilities of the potential variables, the highest posterior probability model (HPM) and the median probability model (MPM). Additionally, BayesVarSel can handle problems with a large number of potential explanatory variables through parallel and heuristic versions (Garcia-Donato and Martinez-Beneito, 2013) of the main commands.
library(BayesVarSel)
#> Loading required package: MASS
#> Loading required package: mvtnorm
#> Loading required package: parallel
data(Hald)
hald_Bvs <- Bvs(formula = y ~ x1 + x2 + x3 + x4, data = Hald)
#> Info. . . .
#> Most complex model has 5 covariates
#> From those 1 is fixed and we should select from the remaining 4
#> x1, x2, x3, x4
#> The problem has a total of 16 competing models
#> Of these, the 10 most probable (a posteriori) are kept
#> Working on the problem...please wait.
summary(hald_Bvs)
#>
#> Call:
#> Bvs(formula = y ~ x1 + x2 + x3 + x4, data = Hald)
#>
#> Inclusion Probabilities:
#> Incl.prob. HPM MPM
#> x1 0.9762 * *
#> x2 0.7563 * *
#> x3 0.2624
#> x4 0.4153
#> ---
#> Code: HPM stands for Highest posterior Probability Model and
#> MPM for Median Probability Model.
#>
# Model-averaged predictions for the first two observations in Hald
colMeans(predict(hald_Bvs, Hald[1:2,]))
#>
#> Simulations obtained using the best 10 models
#> that accumulate 1 of the total posterior probability
#> [1] 78.86902 73.09265
# Simulate coefficients
set.seed(171) # For reproducibility of simulations.
sim_coef <- BMAcoeff(hald_Bvs)
#>
#> Simulations obtained using the best 10 models
#> that accumulate 1 of the total posterior probability
colMeans(sim_coef)
#> Intercept x1 x2 x3 x4
#> 70.9736117 1.4166974 0.4331986 -0.0409743 -0.2170662
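The parallel and heuristic versions of the main commands mentioned above can be used when the number of potential covariates is large. The following is only a sketch: the parallel and n.nodes arguments are the ones described in the changelog at the end of this page, GibbsBvs is the Gibbs-sampling counterpart of Bvs, and both n.iter and the data frame bigdf (a hypothetical dataset with response y and many candidate regressors) are assumptions that should be checked against ?GibbsBvs.

# Exhaustive enumeration of the 16 models, distributed over two cores
# (parallel and n.nodes as described in the changelog below):
hald_par <- Bvs(formula = y ~ x1 + x2 + x3 + x4, data = Hald,
                parallel = TRUE, n.nodes = 2)

# For problems with many covariates, GibbsBvs explores the model space by
# Gibbs sampling instead of enumerating all 2^p models ('bigdf' and 'n.iter'
# are illustrative assumptions, not part of the example above):
big_gibbs <- GibbsBvs(formula = y ~ ., data = bigdf, n.iter = 10000)
summary(big_gibbs)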
library(BayesVarSel)
data(Hald)
fullmodel <- y ~ x1 + x2 + x3 + x4
reducedmodel <- y ~ x1 + x2
nullmodel <- y ~ 1
Btest(models = c(H0 = nullmodel, H1 = fullmodel, H2 = reducedmodel), data = Hald)
#> ---------
#> Models:
#> $H0
#> y ~ 1
#>
#> $H1
#> y ~ x1 + x2 + x3 + x4
#>
#> $H2
#> y ~ x1 + x2
#>
#> ---------
#> Bayes factors (expressed in relation to H0)
#> H0.to.H0 H1.to.H0 H2.to.H0
#> 1.0 44300.8 3175456.4
#> ---------
#> Posterior probabilities:
#> H0 H1 H2
#> 0.000 0.014 0.986
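By default Btest assigns equal prior probabilities to the competing models. As noted in the changelog below, these can be changed through the priorprobs argument; the call below is only a sketch with illustrative weights, given in the same order as the models:

# Same comparison, but putting half of the prior mass on the null model
# (weights are illustrative and follow the order of the models argument):
Btest(models = c(H0 = nullmodel, H1 = fullmodel, H2 = reducedmodel),
      data = Hald, priorprobs = c(0.5, 0.25, 0.25))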
Merged Bvs and PBvs into a single function (called Bvs; the old PBvs disappears). Bvs now has two extra arguments to control parallelization: parallel and n.nodes.
Several changes in Btest: 1) it now allows unnamed lists of models; if an unnamed list is provided, default names are assigned by the function. 2) The prior probabilities argument priorprobs no longer has to be named (by default, the order of the models in the argument models is used). 3) The argument relax.nest is deprecated, replaced by the explicit definition of the null model by the user via the argument null.model.
In Bvs and GibbsBvs the order of arguments has changed slightly: data is now the second argument, immediately following formula.
plotBvs is now an S3 method, defined as plot.Bvs.
predictBvs is now an S3 method, defined as predict.Bvs.
In Bvs the argument fixed.cov is deprecated; the null model is now specified by its formula through the argument null.model (see the sketch after this list).
Option "trace" added to plot
print has been updated to show the 10 most probable models (among those visited) when method Gibbs is used.
Removed the comment printed at initialization.
BMAcoeff no longer shows the graphics for all the variables; instead, use histBMA to plot the posterior distribution of the coefficients (see the sketch below).
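A minimal sketch of the last two items, assuming that null.model takes the formula of the smallest entertained model and that histBMA receives the output of BMAcoeff together with the name of a coefficient (check ?Bvs and ?histBMA for the exact argument names):

# Variable selection where x1 is kept in all models: null.model gives the
# formula of the null model (this replaces the deprecated fixed.cov):
hald_fix <- Bvs(formula = y ~ x1 + x2 + x3 + x4, data = Hald,
                null.model = y ~ x1)

# Posterior (model-averaged) distribution of the coefficient of x1, using the
# simulations produced earlier by BMAcoeff:
histBMA(sim_coef, covariate = "x1")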