Cognitive Models

ggdmc is a generic tool for hierarchical Bayesian computation on cognitive response time (RT) models. It provides tools to fit two families of RT models, the linear ballistic accumulator (LBA) model and the diffusion decision model (DDM), using population-based Markov chain Monte Carlo (pMCMC).

Getting Started

This example demonstrates the Wiener diffusion model. For other models, see my tutorials site. The R function names in ggdmc are meant to indicate what each function does; for example, BuildModel builds a model object.

As users of Bayesian tools are often reminded, it is good practice to check the result of a model fit. Note also that the order of the parameters in a parameter vector (i.e., p.vector) must follow the order reported by BuildModel. Some built-in checks try to safeguard this, but they are far from bulletproof.
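
For instance, once a model object has been built (as in the example below), you can print the parameter names in the order BuildModel expects and construct p.vector accordingly; a minimal sketch:

GetPNames(model)                                 ## e.g., "a" "v" "z" "t0"
p.vector <- c(a = 1, v = 1.5, z = 0.5, t0 = .15) ## same order as above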

Fit a fixed-effect model to a participant

## Set up model ----
## Fixing sv and sz to 0 sets up a Wiener diffusion model
require(ggdmc)
model <- BuildModel(
  p.map     = list(a = "1", v="1", z="1", d="1", sz="1", sv="1", t0="1", 
                   st0="1"),
  match.map = list(M = list(s1 = "r1", s2 = "r2")),
  factors   = list(S = c("s1", "s2")),
  responses = c("r1","r2"),
  constants = c(st0 = 0, d = 0, sv = 0, sz = 0),  
  type      = "rd")   

npar <- length(GetPNames(model))
p.vector <- c(a=1, v=1.5, z=0.5, t0=.15)
dat <- simulate(model, nsim = 50, ps = p.vector)
dmi <- BuildDMI(dat, model)

p.prior <- BuildPrior(
  dists = rep("tnorm", npar),
  p1    = c(a = 1, v = 0, z = 1, t0 = 1),
  p2    = c(a = 1, v = 2, z = 1, t0 = 1),
  lower = c(0, -5, rep(0, 2)),
  upper = rep(NA, npar))

## Fit model -------------
fit0 <- StartNewsamples(dmi, p.prior)
fit  <- run(fit0)

## Check model -----------
plot(fit)
plot(fit, den = TRUE)
plot(fit, pll = FALSE)
plot(fit, pll = FALSE, den = TRUE)

(isconv <- gelman(fit))
est    <- summary(fit, recovery = TRUE, ps = p.vector, verbose = TRUE)

How to fit fixed-effect and hierarchical models with multiple participants

require(ggdmc)
model <- BuildModel(
  p.map     = list(a = "1", v ="1", z ="1", d ="1", sz ="1", sv ="1", t0 ="1", 
                   st0 ="1"),
  match.map = list(M = list(s1 = "r1", s2 = "r2")),
  factors   = list(S = c("s1", "s2")),
  responses = c("r1","r2"),
  constants = c(st0 = 0, d = 0, sv = 0, sz = 0),
  type      = "rd")

npar <- length(GetPNames(model))
pop.mean  <- c(a = 2,   v = 4,  z = 0.5, t0 = 0.3)
pop.scale <- c(a = 0.5, v = .5, z = 0.1, t0 = 0.05)
pop.prior <- BuildPrior(
    dists = rep("tnorm", npar),
    p1    = pop.mean,
    p2    = pop.scale,
    lower = c(0,-5,  0, 0),
    upper = c(5, 7,  1, 1))

## Simulate some data
dat <- simulate(model, nsub = 50, nsim = 30, prior = pop.prior)
dmi <- BuildDMI(dat, model)
ps <- attr(dat, "parameters")

p.prior <- BuildPrior(
    dists = rep("tnorm", npar),
    p1    = pop.mean,
    p2    = pop.scale*5,
    lower = c(0,-5, 0, 0),
    upper = c(5, 7, 1, 1))

plot(p.prior, ps = ps)  ## Check whether all true parameter vectors fall within the range of the prior

## Sampling separately
fit0 <- StartNewsamples(dmi, p.prior, ncore = 4)
fit  <- run(fit0, 5e2, ncore = 4)
fit  <- run(fit, 1e2, add = TRUE, ncore = 4)  ## add additional 100 samples

## Check model -----
isconv <- gelman(fit, verbose = TRUE)
plot(fit)
est0 <- summary(fit, recovery = TRUE, ps = ps, verbose = TRUE)

## Sampling hierarchically
mu.prior <- BuildPrior(
    dists = rep("tnorm", npar),
    p1    = pop.mean,
    p2    = pop.scale*5,
    lower = c(0,-5,  0, 0),
    upper = c(5, 7,  1, 1))

sigma.prior <- BuildPrior(
    dists = rep("beta", npar),
    p1    = c(a=1, v=1, z=1, t0=1),
    p2    = rep(1, npar),
    upper = rep(1, npar))

## !!!The names are important!!!
priors <- list(pprior = p.prior, location = mu.prior, scale = sigma.prior)
names(priors)
## [1] "pprior"   "location" "scale"

## Fit hierarchical model ----
fit0 <- StartNewsamples(dmi, priors)
fit  <- run(fit0, 5e2)

p0 <- plot(fit, hyper = TRUE)
p0 <- plot(fit, hyper = TRUE, den = TRUE, pll=FALSE)

## Check model -----------
res  <- hgelman(fit, verbose = TRUE)
est0 <- summary(fit, recovery = TRUE, ps = ps, verbose = TRUE)
est1 <- summary(fit, hyper = TRUE, recovery = TRUE, ps = pop.mean,  type = 1, verbose = TRUE)
est2 <- summary(fit, hyper = TRUE, recovery = TRUE, ps = pop.scale, type = 2, verbose = TRUE)

## Check the recovery of each participant separately
for (i in 1:length(fit))
{
  est <- summary(fit[[i]], recovery = TRUE, ps = ps[i, ], verbose = TRUE)
}

List of models currently hard-wired in ggdmc

  1. The LBA model, type = "norm" (see the sketch after this list),
  2. The DDM, type = "rd",
  3. The Wiener diffusion model, type = "rd", with sv and sz fixed to 0.
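
As an illustration of the type argument, below is a minimal LBA sketch. It assumes the DMC-style LBA parameterisation (A, B, mean_v, sd_v, t0, st0) and the same stimulus/response design as the examples above; check the package documentation for the exact parameter names before relying on it.

## A minimal LBA sketch (type = "norm"); the parameter names here are
## assumed from the DMC-style parameterisation, not verified.
model_lba <- BuildModel(
  p.map     = list(A = "1", B = "1", t0 = "1", mean_v = "M", sd_v = "1",
                   st0 = "1"),
  match.map = list(M = list(s1 = "r1", s2 = "r2")),
  factors   = list(S = c("s1", "s2")),
  responses = c("r1", "r2"),
  constants = c(st0 = 0, sd_v = 1),
  type      = "norm")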

PDA-based models

  1. The Piecewise LBA model 0; CPU-based PDA likelihoods; type = "plba0",
  2. The Piecewise LBA model 1; CPU-based PDA likelihoods; type = "plba1",
  3. The Piecewise LBA model 0; GPU-based PDA likelihoods; type = "plba0_gpu",
  4. The Piecewise LBA model 1; GPU-based PDA likelihoods; type = "plba1_gpu",
  5. The LBA model; GPU-based PDA likelihoods; type = "norm_pda_gpu",
  6. The correlated accumulator model; type = "cnorm".

The PDA-based models have been separated from the latest version of the package. For these models, see my BRM paper and the associated packages described there.

For details regarding the PLBA types, please see Holmes, Trueblood, and Heathcote (2016).

Further information

One aim in designing ggdmc was that it should be able to read objects from DMC, so the two packages share some similarities. They do, however, differ in some respects. For example, in the latest version of ggdmc, the dimensions of the theta and phi arrays are 'npar x nchain x nmc', whereas DMC uses 'nchain x npar x nmc'. Similarly, the dimensions of the 'log_likelihoods' and 'summed_log_prior' matrices are 'nchain x nmc', whereas DMC uses 'nmc x nchain'. Remember to transpose them if you want to move objects back and forth between the two packages. Convenience functions based on 'aperm' and 't' for doing this will be added in a future version.
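
For example, converting the posterior arrays from the ggdmc layout to the DMC layout could look like the following minimal sketch, assuming theta is an 'npar x nchain x nmc' array and log_likelihoods is an 'nchain x nmc' matrix extracted from a fit:

theta_dmc  <- aperm(theta, c(2, 1, 3))  ## 'npar x nchain x nmc' -> 'nchain x npar x nmc'
loglik_dmc <- t(log_likelihoods)        ## 'nchain x nmc' -> 'nmc x nchain'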

Please see my tutorials site, Cognitive Model, for more examples.

Prerequisites

  • R (>= 3.3.0)
  • R packages: Rcpp (>= 0.12.10), RcppArmadillo (>= 0.7.700.3.0), ggplot2 (>= 2.1.0), coda (>= 0.16-1), matrixStats, data.table
  • Windows users need Rtools (>= 3.3.0.1959)
  • Mac OS users need to make clang understand the OpenMP flag.
  • Linux/Unix users may need to install the Open MPI library, if it is not already installed.
  • Armadillo may need a recent g++ compiler (> 4.6).

Installation

From CRAN (0.2.6.0):

install.packages("ggdmc")

From source:

install.packages("ggdmc_0.2.6.0.tar.gz", repos = NULL, type="source")

From GitHub (you need devtools) (0.2.6.0):

devtools::install_github("yxlin/ggdmc")

For Mac Users:

1. Install gfortran. As of 27 August 2018, the gfortran version has to be 6.1, even if you are using macOS High Sierra 10.13.4; gfortran 6.3 may not work.

2. Install clang4-r. James Balamuta has created a convenient tool, clang4-r. Once you install it, your clang will understand the OpenMP flag in ggdmc. The aim is simply to let macOS handle the OpenMP flag, so you may use other methods to achieve this if you do not want to install clang4-r; clang4-r is just the most straightforward option we have found so far. However, we have not looked into its source code, so use it at your own risk.

A configure script now disables OpenMP, so macOS users can install without encountering the OpenMP problem.

Citation

If you use this package, please cite the software, for example:

Lin, Y.-S., & Strickland, L. (in preparation). Evidence accumulation models with R: A practical guide to hierarchical Bayesian methods.
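
If the package is installed, R can also print a citation entry for it (a minimal sketch; the exact entry depends on the installed version):

citation("ggdmc")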

Contributors

The R documentation, tutorials, C++ code, parallel computation, the new genetic algorithm, the R helper functions, and the R packaging were developed by Yi-Shin Lin. DMC was developed by Andrew Heathcote (Heathcote et al., 2018), where you may find many different and interesting models.

Please report bugs at https://github.com/yxlin/ggdmc/issues.

License

GPL-2

Acknowledgments

  • The PDF, CDF, and random-number generation of the DDM are derived from Voss & Voss's fast-dm 30.2 and rtdists 0.9-0.
  • The truncated normal functions are originally based on Jonathan Olmsted's RcppTN 0.1-8 at https://github.com/olmjo/RcppTN, Christopher Jackson's R code in the msm package, and Robert (1995, Statistics & Computing).
  • Thanks to Matthew Gretton for his consultation regarding rtdists.
  • Thanks to Andrew Heathcote for lending me his MacBook Air. ggdmc works on OS X (macOS High Sierra Version 10.13.4).

News

0.2.5.2 Yi-Shin Lin

  • Add random shuffling across all crossover operators
  • Allow the posdrift switch to be passed from BuildModel

0.2.3.6 Yi-Shin Lin

  • Remove OpenMP feature in PLBA model

0.2.3.6 Yi-Shin Lin

  • Rewrite the documentation for all functions
  • Change the function naming convention
  • Add the -lgomp flag
  • Use a simple S3 method to resolve the ".dmc" usage issue in DMC

0.1.9.7 Yi-Shin Lin

- Fix hierarchical Bayesian fitting for the LBA model and the DDM

0.1.5 Yi-Shin Lin

- Add LBA PDF and prior C++ functions
- Revert to the R crossover, migrate, run, samples, etc. functions to stay highly
  compatible with DMC

0.1.3.9 Yi-Shin Lin

- Fix more 'overloading ambiguity' errors in tnorm_wrapper.cpp

0.1.3.8 Yi-Shin Lin

- Add init.c to register native routines.

0.1.3.7 Yi-Shin Lin

- Update to Rcpp 0.12.10 and RcppArmadillo >= 0.7.700.0.0 to resolve
  'Rcpp::Nullable' problem.

0.1.3.6 Yi-Shin Lin

- Correct 'overloading ambiguity' installation failures on Solaris
- Update the link in README.md to canonical url,
'https://CRAN.R-project.org/package=RcppArmadillo'

0.1.3.5 Yi-Shin Lin

- First release to the public

0.1.3.4 Yi-Shin Lin

- Improve ddmc.Rd, dmc.Rd, and dprior.Rd.
- Add dmc.R to generate ggdmc-package.Rd.
- Merge DMC's and ggdmc's plot.prior and rename drawPrior as plot.priors.
- ggdmc now has its own classes: dmc, dmc.list, and hyper.
- ggdmc now uses S3 classes to dispatch different summary and plot functions.
- Finally, the LBA model is now in ggdmc. likelihood.dmc and transform.dmc branch
  out to, e.g., logLik.ddm and logLik.lba_B, because two new classes, ddm and
  lba_B, are added.
- New modules ggdmc_class.R and ggdmc_eam.R are added. The former defines
  DMC classes and the latter collects evidence accumulation models.
- Add more help pages.
- profile.dmc is merged with stats' profile.
- roxygen2 is now also in charge of NAMESPACE.

0.1.3.0 Yi-Shin Lin

- plot.dmc: add a function to superimpose the prior distribution on top of the
  posterior distribution (ggplot2).
- DMC needs to call a modified coda to use this function (base).
