Derivatives of the First-Passage Time Density and Cumulative Distribution Function, and Random Sampling from the (Truncated) First-Passage Time Distribution
First, we provide functions to calculate the partial derivatives of the first-passage time diffusion probability density function (PDF) and cumulative
distribution function (CDF) with respect to the first-passage time t (PDF only), the upper barrier a, the drift rate v, the relative starting point w, the
non-decision time t0, the inter-trial variability of the drift rate sv, the inter-trial variability of the relative starting point sw, and the inter-trial
variability of the non-decision time st0. In addition, the PDF and CDF themselves are provided. Most calculations are done on the logarithmic scale for
numerical stability. Since the PDF, CDF, and their derivatives are represented as infinite series, the user can control the approximation error with the
argument 'precision'. For numerical integration we use the C library cubature by Johnson, S. G. (2005-2013) <https://github.com/stevengj/cubature>.
Numerical integration is required whenever sv, sw, and/or st0 is not zero; it slows down the computation and the requested precision can no longer be
guaranteed, so whenever numerical integration is used an estimate of the approximation error is included in the output list.
Note: The large number of contributors (ctb) is due to the many C/C++ code chunks copied from the GNU Scientific Library (GSL).
Second, we provide methods to sample from the first-passage time distribution, with or without user-defined truncation from above. The first method is a new
adaptive rejection sampler building on the work of Gilks and Wild (1992).
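A minimal usage sketch, assuming this is the 'WienR' package: the calls WienerPDF(), WienerCDF(), and sampWiener() and their arguments are written from memory and should be checked against the package manual.

    # Hedged sketch: assumes the 'WienR' package; verify names and arguments there.
    library(WienR)

    # PDF and CDF of the first-passage time at t = 1.2 for the upper boundary,
    # with barrier a, drift rate v, relative starting point w, and a requested
    # precision for the infinite-series approximation.
    pdf_out <- WienerPDF(t = 1.2, response = "upper", a = 1, v = 0.5, w = 0.5,
                         precision = 12)
    cdf_out <- WienerCDF(t = 1.2, response = "upper", a = 1, v = 0.5, w = 0.5,
                         precision = 12)
    str(pdf_out)  # value, log value and, with sv/sw/st0 > 0, an error estimate

    # Random draws from the first-passage time distribution.
    samp <- sampWiener(N = 1000, a = 1, v = 0.5, w = 0.5)
    str(samp)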
Training DA Models Utilizing 'gips'
Extends classical linear and quadratic discriminant analysis
by incorporating permutation group symmetries into covariance matrix
estimation. The package leverages methodology from the 'gips'
framework to identify and impose permutation structures that act as a
form of regularization, improving stability and interpretability in
settings with symmetric or exchangeable features. Several discriminant
analysis variants are provided, including pooled and class-specific
covariance models, as well as multi-class extensions with shared or
independent symmetry structures. For more details about the 'gips' methodology, see Graczyk et al. (2022).
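The package's own interface is not shown here; as a conceptual sketch of the underlying idea, the snippet below uses the 'gips' package directly to symmetrize a pooled covariance estimate before plugging it into a hand-rolled linear discriminant rule. The 'gips' calls gips(), find_MAP(), and project_matrix() are written from memory and should be verified.

    # Conceptual sketch only -- not this package's API. A covariance estimate is
    # projected onto its MAP permutation symmetry (via 'gips') and then used in
    # a simple linear discriminant rule with equal class priors.
    library(gips)

    X <- as.matrix(iris[, 1:4])
    y <- iris$Species

    S <- cov(X)                                        # pooled sample covariance
    g <- gips(S, number_of_observations = nrow(X))
    g_map <- find_MAP(g, optimizer = "brute_force")    # MAP permutation symmetry
    S_sym <- project_matrix(S, g_map)                  # symmetrized covariance

    S_inv <- solve(S_sym)
    centroids <- rowsum(X, y) / as.vector(table(y))
    scores <- sapply(seq_len(nrow(centroids)), function(k) {
      mu <- centroids[k, ]
      X %*% S_inv %*% mu - 0.5 * drop(t(mu) %*% S_inv %*% mu)
    })
    pred <- levels(y)[max.col(scores)]
    mean(pred == y)                                    # training accuracy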
R Bindings to the 'Fstlib' Library
The 'fstlib' library provides multithreaded serialization of compressed data frames using the 'fst' format. The 'fst' format allows for random access of stored data and compression with the 'LZ4' and 'ZSTD' compressors.
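'fstcore' ships the bindings themselves; the user-facing functions live in the companion 'fst' package. A short, hedged sketch of that interface (write_fst() and read_fst() with their compress/columns/from/to arguments, as I recall them):

    # Sketch using the companion 'fst' package, which builds on these bindings.
    library(fst)

    df <- data.frame(id = 1:1e6, x = rnorm(1e6))

    # Multithreaded, compressed serialization (compress = 0..100; LZ4/ZSTD underneath).
    write_fst(df, "df.fst", compress = 70)

    # Random access: read a single column and row range without loading the whole file.
    slice <- read_fst("df.fst", columns = "x", from = 1001, to = 2000)
    head(slice)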
Supervised Feature Selection
Interfaces for choosing important predictors in supervised
regression, classification, and censored regression models. Permuted
importance scores (Biecek and Burzykowski, 2021) are among the methods provided.
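The package's own interface is not reproduced here; as a conceptual illustration of the permuted importance scores it builds on, a base-R sketch (not this package's API):

    # Conceptual illustration of permutation importance: the increase in error
    # when a single predictor is shuffled measures that predictor's importance.
    set.seed(42)
    fit <- lm(mpg ~ ., data = mtcars)
    rmse <- function(obs, pred) sqrt(mean((obs - pred)^2))
    baseline <- rmse(mtcars$mpg, predict(fit, mtcars))

    perm_importance <- sapply(setdiff(names(mtcars), "mpg"), function(v) {
      shuffled <- mtcars
      shuffled[[v]] <- sample(shuffled[[v]])
      rmse(mtcars$mpg, predict(fit, shuffled)) - baseline
    })
    sort(perm_importance, decreasing = TRUE)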
Shed Light on Black Box Machine Learning Models
Shed light on black box machine learning models with the help of model performance, variable importance, global surrogate models,
ICE profiles, and partial dependence plots (Friedman J. H., 2001).
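A hedged sketch of a typical workflow, assuming this is the 'flashlight' package; the constructor flashlight() and the light_*() helpers are named from memory and should be checked against the package reference.

    # Hedged sketch: assumes the 'flashlight' package; verify function names there.
    library(flashlight)

    fit <- lm(Sepal.Length ~ ., data = iris)
    fl <- flashlight(model = fit, data = iris, y = "Sepal.Length", label = "lm")

    plot(light_importance(fl))                   # permutation variable importance
    plot(light_ice(fl, v = "Petal.Length"))      # ICE profiles
    plot(light_profile(fl, v = "Petal.Length"))  # partial dependence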
Simulation and Resampling Methods for Epistemic Fuzzy Data
Random simulation of fuzzy numbers is still a challenging problem. The aim of this package is to provide procedures to simulate fuzzy random variables,
especially in the case of piecewise linear fuzzy numbers (PLFNs; see Coroianu et al., 2013).
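The package's exact interface is not shown here; as a conceptual illustration of what simulating a piecewise linear fuzzy number involves, a base-R sketch (not the package's API):

    # Conceptual sketch: a trapezoidal PLFN is determined by four ordered knots
    # (support at alpha = 0, core at alpha = 1); a random PLFN is simulated by
    # sorting four random draws.
    set.seed(7)
    simulate_plfn <- function(center = 0, spread = 1) {
      knots <- sort(center + spread * rnorm(4))
      list(support = knots[c(1, 4)], core = knots[c(2, 3)])
    }
    membership <- function(x, fn) {
      with(fn, pmin(pmax((x - support[1]) / (core[1] - support[1]), 0), 1) *
               pmin(pmax((support[2] - x) / (support[2] - core[2]), 0), 1))
    }
    fn <- simulate_plfn()
    curve(membership(x, fn), from = fn$support[1] - 1, to = fn$support[2] + 1,
          ylab = "membership")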
PLINK 2 Binary (.pgen) Reader
A thin wrapper over PLINK 2's core libraries which provides an R interface for reading .pgen files. A minimal .pvar loader is also included. Chang et al. (2015) <doi:10.1186/s13742-015-0047-8>.
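A hedged reading sketch, assuming the 'pgenlibr' interface as I recall it (NewPvar(), NewPgen(), Buf(), Read(), ReadList(), and the Close*() cleanups); the exact names should be verified in the package manual, and the file paths are placeholders.

    # Hedged sketch: function names reflect my recollection of 'pgenlibr'.
    library(pgenlibr)

    pvar <- NewPvar("example.pvar")           # minimal .pvar loader
    pgen <- NewPgen("example.pgen", pvar = pvar)

    buf <- Buf(pgen)                          # preallocated per-sample buffer
    Read(pgen, buf, 1)                        # genotypes/dosages of variant 1
    head(buf)

    geno <- ReadList(pgen, 1:10)              # matrix for a set of variants

    ClosePgen(pgen)
    ClosePvar(pvar)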
Efficient Serialization of R Objects
Streamlines and accelerates the process of saving and loading R objects, improving speed and compression compared to other methods. The package provides two compression formats: the 'qs2' format, which uses R serialization via the C API while optimizing compression and disk I/O, and the 'qdata' format, featuring custom serialization for slightly faster performance and better compression. Additionally, the 'qs2' format can be directly converted to the standard 'RDS' format, ensuring long-term compatibility with future versions of R.
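A hedged usage sketch: qs_save()/qs_read() for the 'qs2' format and qd_save()/qd_read() for the 'qdata' format are the calls I recall from the package and should be confirmed in its documentation.

    # Hedged sketch: call names reflect my recollection of the 'qs2' package.
    library(qs2)

    obj <- list(x = rnorm(1e6), df = mtcars)

    # 'qs2' format: R serialization via the C API with tuned compression and I/O.
    qs_save(obj, "obj.qs2")
    obj2 <- qs_read("obj.qs2")

    # 'qdata' format: custom serialization, typically a bit faster and smaller.
    qd_save(obj, "obj.qdata")
    obj3 <- qd_read("obj.qdata")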
Bayesian Hierarchical Analysis of Cognitive Models of Choice
Fit Bayesian (hierarchical) cognitive models through a linear modeling language interface, using particle Metropolis Markov
chain Monte Carlo sampling with Gibbs steps. The diffusion decision model (DDM),
linear ballistic accumulator model (LBA), racing diffusion model (RDM), and the lognormal
race model (LNR) are supported. Additionally, users can specify their own likelihood
function and/or opt for non-hierarchical
estimation, as well as choose a diagonal, blocked, or full multivariate normal
group-level distribution to test individual differences. Prior specification
is facilitated through methods that visualize the (implied) prior.
A wide range of plotting functions assist in assessing model convergence and
posterior inference. Models can be easily evaluated using functions
that plot posterior predictions or using relative model comparison metrics
such as information criteria or Bayes factors.
References: Stevenson et al. (2024).
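A hedged outline of the intended workflow, assuming this is the 'EMC2' package; design(), make_emc(), and fit(), their arguments, and the expected data layout are assumptions to verify against the package vignettes.

    # Hedged outline: assumes the 'EMC2' package; 'my_data' is a placeholder data
    # frame in the package's expected long format (subject, factors, response, rt).
    library(EMC2)

    # A design mapping DDM parameters onto experimental factors, e.g. drift rate v
    # varying with stimulus factor S and a common boundary separation a.
    dsgn <- design(data = my_data, model = DDM,
                   formula = list(v ~ S, a ~ 1, t0 ~ 1))

    # Hierarchical sampler; 'type' selects the group-level covariance structure.
    emc <- make_emc(my_data, dsgn, type = "standard")
    emc <- fit(emc)

    # Convergence checks and posterior summaries.
    plot(emc)
    summary(emc)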
Procedures Related to the Zadeh's Extension Principle for Fuzzy Data
Procedures for calculation, plotting, animation, and approximation of the outputs for fuzzy numbers (see A.I. Ban, L. Coroianu, P. Grzegorzewski, "Fuzzy Numbers: Approximations, Ranking and Applications", 2015), based on Zadeh's extension principle (see de Barros, L.C., Bassanezi, R.C., Lodwick, W.A., 2017).
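Not this package's interface; as a conceptual illustration of Zadeh's extension principle, a base-R sketch that propagates a triangular fuzzy number through a function via its alpha-cuts:

    # Conceptual sketch: for a fuzzy number A with alpha-cuts [L(a), U(a)], the
    # image f(A) has alpha-cuts [min f, max f] over each interval (approximated
    # here on a grid).
    A <- list(L = function(a) 1 + a, U = function(a) 4 - 2 * a)  # triangular (1, 2, 4)
    f <- function(x) x^2 + 1

    alpha <- seq(0, 1, by = 0.05)
    cuts <- t(sapply(alpha, function(a) {
      grid <- seq(A$L(a), A$U(a), length.out = 200)
      range(f(grid))
    }))
    plot(cuts[, 1], alpha, type = "l", xlim = range(cuts),
         xlab = "x", ylab = "membership")
    lines(cuts[, 2], alpha)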