Estimates the precision of transdimensional Markov chain Monte Carlo
(MCMC) output, which is often used for Bayesian analysis of models with different
dimensionality (e.g., model selection). Transdimensional MCMC (e.g., reversible
jump MCMC) relies on sampling a discrete model-indicator variable to estimate
the posterior model probabilities. If only a few switches occur between the models,
precision may be low, and an assessment based on the assumption of independent
samples may be misleading. Based on the observed transition matrix of the indicator
variable, the method of Heck, Overstall, Gronau, & Wagenmakers (2019,
Statistics & Computing, 29, 631-643) estimates the precision of the posterior
model probabilities.
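The observed transition matrix of the indicator variable can be tallied directly from the sampled model-indicator sequence. A minimal Python sketch (the function name `transition_counts` is illustrative only; the package's own R function for this is transitions()):

```python
def transition_counts(z, n_models):
    """Tally observed transitions of a discrete model-indicator sequence."""
    counts = [[0] * n_models for _ in range(n_models)]
    for a, b in zip(z, z[1:]):  # consecutive pairs (z_t, z_{t+1})
        counts[a][b] += 1
    return counts

# A chain that rarely switches yields an almost diagonal count matrix,
# signaling low precision of the estimated model probabilities.
z = [0, 0, 0, 1, 1, 1, 1, 0, 0]
print(transition_counts(z, 2))  # → [[3, 1], [1, 3]]
```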

- Updated citation and vignette: paper now published in Statistics & Computing (doi:10.1007/s11222-018-9828-0)

- Code refactoring
- Renamed functions: table.mc -> transitions; sim.mc -> rmarkov; dirichlet.mle -> fit_dirichlet; stationary.mle -> stationary_mle; best.k -> best_models
- Added unit tests
- Fixed bugs in transitions() for multiple-chain sequences and in stationary() when using multiple CPUs

- Fixed WARNING: Found '__assert_fail', possibly from 'assert' (C)

- Registered C++ routines
- Improved Description file

- Added an alternative method to compute eigenvectors via the RcppEigen package
- Improved starting values for the Dirichlet estimation algorithm
- Maximum likelihood estimation of the stationary distribution: stationary.mle()
- Changed default prior to epsilon = 1/M (M = number of sampled models)
- Changed default method for the eigenvalue decomposition to RcppArmadillo (method = "arma")
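The maximum likelihood estimate of the stationary distribution can be sketched as follows: row-normalize the observed transition counts to get the MLE of the transition matrix, then find its left eigenvector for eigenvalue 1. A dependency-free Python illustration (names are hypothetical, not the package's API) uses power iteration in place of the full eigenvalue decomposition that the package delegates to RcppArmadillo/RcppEigen:

```python
def stationary_mle(counts, iters=1000, tol=1e-12):
    """MLE of the stationary distribution from transition counts.

    Row-normalizing the counts gives the MLE of the transition matrix P;
    the stationary distribution is the left eigenvector of P for
    eigenvalue 1, found here by power iteration (assumes an ergodic
    chain with at least one observed transition per row).
    """
    m = len(counts)
    p = [[c / sum(row) for c in row] for row in counts]
    pi = [1.0 / m] * m
    for _ in range(iters):
        new = [sum(pi[i] * p[i][j] for i in range(m)) for j in range(m)]
        if max(abs(a - b) for a, b in zip(new, pi)) < tol:
            return new
        pi = new
    return pi

# Counts for a 2-state chain with P = [[0.9, 0.1], [0.5, 0.5]];
# the exact stationary distribution is (5/6, 1/6).
print(stationary_mle([[90, 10], [50, 50]]))
```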

- Improved estimation of Dirichlet parameters to obtain the effective sample size (C++ version of the fixed-point algorithm by Minka, 2000)
- New function best.k() to get a summary of the k models with the highest posterior model probabilities
- Exports function rdirichlet()
- Updated licence: GPL-3 (instead of GPL-2)

- New function best.k() to assess estimation uncertainty for the k models with the highest posterior model probabilities

- Implementations of the eigenvalue decomposition with RcppArmadillo::eig_gen and base::eigen