Generalized Linear Mixed Models using Adaptive Gaussian Quadrature
Fits generalized linear mixed models for a single grouping factor under
maximum likelihood, approximating the integrals over the random effects with an
adaptive Gaussian quadrature rule; Jose C. Pinheiro and Douglas M. Bates (1995).
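This description matches the 'GLMMadaptive' package; assuming that, a minimal sketch of a fit with adaptive Gaussian quadrature might look like the following. The data frame d, binary outcome y, covariate x, and grouping factor id are hypothetical.

```r
library(GLMMadaptive)
# Hypothetical data: binary outcome y, covariate x, grouping factor id.
fit <- mixed_model(fixed = y ~ x, random = ~ 1 | id, data = d,
                   family = binomial(), nAGQ = 11)  # 11 quadrature points
summary(fit)
```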
Fast Functional Mixed Models using Fast Univariate Inference
Implementation of the fast univariate inference (FUI) approach of Cui et al. (2022) for fitting functional mixed models.
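This appears to be the 'fastFMM' package; assuming its exported fitting function is fui() and that it accepts an lme4-style formula, a usage sketch with a hypothetical data frame dat holding a functional outcome Y and a subject identifier:

```r
library(fastFMM)
# Hypothetical data: matrix-valued functional outcome Y, a treatment covariate,
# and repeated measures nested within subject.
fit <- fui(Y ~ treatment + (1 | subject), data = dat)
plot(fit)  # assumes a plot method for the fitted object
```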
High Dimensional Penalized Generalized Linear Mixed Models (pGLMM)
Fits high-dimensional penalized generalized linear mixed models using
the Monte Carlo Expectation Conditional Minimization (MCECM) algorithm.
The package performs variable selection on the fixed and random effects
simultaneously. It supports Binomial, Gaussian, and Poisson data with canonical
links, and penalization with the MCP, SCAD, or LASSO penalties. The MCECM
algorithm is described in Rashid et al. (2020).
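If this is the 'glmmPen' package, a fit with the MCP penalty might be sketched as below; the data frame dat, predictors x1-x3, and grouping factor grp are hypothetical, and the argument names follow what the description implies rather than a verified signature.

```r
library(glmmPen)
# Hypothetical data: binary outcome y, candidate predictors x1-x3, group grp.
# Variable selection is performed on both the fixed and random effects.
fit <- glmmPen(y ~ x1 + x2 + x3 + (x1 + x2 + x3 | grp),
               data = dat, family = "binomial", penalty = "MCP")
```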
Build Network Based on Linear Mixed Models from EHRs
Analyzes longitudinal clinical data from Electronic Health Records (EHRs) using linear mixed models (LMMs) and visualizes the results as networks. It includes functions for fitting LMMs, normalizing adjacency matrices, and comparing networks. The package is designed for researchers in clinical and biomedical fields who need to model longitudinal data and explore relationships between variables. For more details see Bates et al. (2015).
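The package's own interface is not shown here; as an illustration of the underlying idea only, pairwise LMMs whose fixed-effect slopes fill an adjacency matrix can be sketched with 'lme4' (Bates et al. 2015). The data frame ehr, its variables, and the patient identifier are hypothetical.

```r
library(lme4)
# Hypothetical longitudinal EHR variables; not this package's API.
vars <- c("bmi", "sbp", "glucose")
adj  <- matrix(0, length(vars), length(vars), dimnames = list(vars, vars))
for (i in vars) for (j in setdiff(vars, i)) {
  f         <- reformulate(c(j, "(1 | patient_id)"), response = i)
  fit       <- lmer(f, data = ehr)
  adj[i, j] <- fixef(fit)[[j]]  # fixed-effect slope of j on i as edge weight
}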
Spaghetti-Plot Fixed and Random Effects of Linear Mixed Models
Plots both fixed and random effects of linear mixed (multilevel) models in a single spaghetti plot. The package visualizes the effect of a predictor on a criterion across the levels of a grouping variable, and confidence intervals can be displayed for fixed effects. Because of how predicted values of random effects are calculated, only models with one random intercept and/or one random slope can be plotted. Confidence intervals and predicted values of fixed effects are computed with the 'ggpredict' function from the 'ggeffects' package. Lüdecke, D. (2018).
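The fixed-effects piece the description mentions can be sketched with 'lme4' and ggpredict() from 'ggeffects'; this is not the package's own spaghetti-plot function, and the data frame dat, outcome, predictor, and group are hypothetical.

```r
library(lme4)
library(ggeffects)
# Hypothetical multilevel data: one random intercept and one random slope.
fit  <- lmer(outcome ~ predictor + (predictor | group), data = dat)
pred <- ggpredict(fit, terms = "predictor")  # fixed-effect predictions + CI
plot(pred)
```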
Multivariate Generalized Linear Mixed Models for Ranking Sports Teams
Maximum likelihood estimates are obtained via an EM algorithm with either a first-order or a fully exponential Laplace approximation, as documented by Broatch and Karl (2018).
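This description matches the 'mvglmmRank' package; assuming its main function is mvglmmRank() and that it accepts a data frame of game results plus a method code, a heavily hedged sketch:

```r
library(mvglmmRank)
# 'games' is a hypothetical data frame of game results in the package's
# expected format; method and first.order are assumed argument names.
fit <- mvglmmRank(game.data = games, method = "PB0", first.order = TRUE)
```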
Flexible Tree Taper Curves Based on Semiparametric Mixed Models
Implementation of functions for fitting taper curves (a semiparametric
linear mixed effects taper model) to diameter measurements along stems. Further
functions are provided to estimate the uncertainty around the predicted curves,
to calculate timber volume (also by sections) and marginal (e.g., upper) diameters.
For cases where tree heights are not measured, methods for estimating
additional variance in volume predictions resulting from uncertainties in
tree height models (tariffs) are provided. The example data include the taper
curve parameters for Norway spruce used in the 3rd German NFI fitted to 380 trees
and a subset of section-wise diameter measurements of these trees. The functions
implemented here are detailed in Kublin, E., Breidenbach, J., Kaendler, G. (2013).
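The package's exported functions are not listed here; as an illustration of the general modelling idea only (a semiparametric mixed model of stem diameter on relative height), a sketch with 'nlme' and a spline basis, using a hypothetical data frame stems:

```r
library(nlme)
library(splines)
# Hypothetical section-wise measurements: diameter, relative height along the
# stem, and a tree identifier; not this package's API.
fit <- lme(diameter ~ ns(rel_height, df = 4),
           random = ~ 1 | tree_id, data = stems)
summary(fit)
```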
Confidence Intervals for Robust and Classical Linear Mixed Model Estimators
The main function calculates confidence intervals (CIs) for mixed models, using classical estimators from the lmer() function in the 'lme4' package as well as robust estimators from the rlmer() function in the 'robustlmm' package and the varComprob() function in the 'robustvarComp' package. Three methods are available: the classical Wald method, the wild bootstrap, and the parametric bootstrap. The bootstrap methods offer flexibility in obtaining lower and upper bounds through the percentile or BCa method. More details are given in Mason, F., Cantoni, E., & Ghisletta, P. (2021).
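If this is the 'confintROB' package, the workflow presumably starts from a fitted lmer() or rlmer() model; the sketch below uses lme4's bundled sleepstudy data, and the name and arguments of the CI function are assumptions.

```r
library(lme4)
library(robustlmm)
library(confintROB)
# Classical and robust fits of the same model (sleepstudy ships with lme4).
fit_ml  <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)
fit_rob <- rlmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)
# Assumed interface: a main function named after the package with a method choice.
ci <- confintROB(fit_rob, method = "boot")
```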
Lasso, Group Lasso, and Sparse-Group Lasso for Mixed Models
Proximal gradient descent solver for the lasso, (fitted) group lasso, and (fitted) sparse-group lasso. The implementation uses backtracking line search and warm starts. Input data need to be clustered/grouped for each group lasso variant before calling these algorithms.
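This matches the 'seagull' package; assuming its solver is seagull() with design matrices X (fixed effects) and Z (random effects), a grouping vector for the columns of Z, and a mixing parameter alpha (1 = lasso, 0 = group lasso, in between = sparse-group lasso), a sketch:

```r
library(seagull)
# y: response vector; X: fixed-effects design; Z: random-effects design;
# groups: integer vector assigning each column of Z to a group (all assumed).
fit <- seagull(y = y, X = X, Z = Z, groups = groups, alpha = 0.5)
```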
Automating the Fitting of Double Linear Mixed Models in 'JAGS' and 'nimble'
Automates the fitting of double GLMs in 'JAGS'. Includes automatic
generation of 'JAGS' scripts, running 'JAGS' or 'nimble' via the 'rjags'
and 'nimble' packages, and summarizing the resulting output. For further
information see Bonner, Kim, Westneat, Mutzel, Wright, and Schofield.
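Rather than guessing this package's wrapper interface, the sketch below shows the 'rjags' steps it automates (script generation aside): compile a JAGS model file and draw posterior samples. The model file name, data list, and monitored parameters are hypothetical.

```r
library(rjags)
# Hypothetical model file and data list for a double (mean + dispersion) model.
m <- jags.model("double_lmm.jags", data = jags_data, n.chains = 3)
samples <- coda.samples(m, variable.names = c("beta", "psi"), n.iter = 5000)
summary(samples)
```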