rfUtilities — by Jeffrey S. Evans, 5 years ago

Random Forests Model Selection and Performance Evaluation

Utilities for Random Forest model selection, class balance correction, significance testing, cross-validation and partial dependence plots.
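
A minimal usage sketch follows. rf.modelSel(), rf.significance() and rf.crossValidation() are functions of this package, but the exact argument names and the 'selvars' component used below are assumptions and may differ from the released version.

    # Sketch: variable selection, significance test and cross-validation
    # with rfUtilities (argument names and 'selvars' are assumptions).
    library(randomForest)
    library(rfUtilities)

    data(iris)
    x <- iris[, 1:4]
    y <- iris$Species

    sel <- rf.modelSel(xdata = x, ydata = y)           # stepwise model selection
    fit <- randomForest(x = x[, sel$selvars], y = y)   # refit on selected predictors
    rf.significance(fit, x[, sel$selvars], nperm = 99)             # permutation significance test
    rf.crossValidation(fit, x[, sel$selvars], p = 0.10, n = 99)    # cross-validation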

m2b — by Laurent Dubroca, 8 years ago

Movement to Behaviour Inference using Random Forest

Prediction of behaviour from movement characteristics using observations and random forests, for the analysis of movement data in ecology. From movement information (speed, bearing, ...) the model predicts the observed behaviour (movement, foraging, ...) using a random forest. The model can then extrapolate behavioural information to movement data without direct observation of behaviour. The specificity of this method lies in the derivation of multiple predictor variables from the movement data over a range of temporal windows. This procedure captures as much information as possible on the changes and variations of movement and ensures the random forest algorithm is used to its full capacity. The method is very generic and applicable to any dataset that provides movement data together with observations of behaviour.
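
The sketch below does not use the m2b interface itself; it is only a conceptual illustration of the approach described above (windowed movement features, a forest trained on observed behaviour, extrapolation to unobserved records) using the generic randomForest and zoo packages, with simulated data.

    # Conceptual sketch only (not the m2b API): predict behaviour from
    # movement features derived over temporal windows.
    library(randomForest)
    library(zoo)

    set.seed(1)
    n <- 500
    track <- data.frame(speed   = abs(rnorm(n, 2)),       # simulated movement data
                        bearing = runif(n, 0, 360),
                        behaviour = NA)
    track$behaviour[1:200] <- sample(c("moving", "foraging"), 200, replace = TRUE)

    # predictors over two temporal windows (right-aligned rolling means)
    track$speed_w5  <- zoo::rollmeanr(track$speed, 5,  fill = NA)
    track$speed_w30 <- zoo::rollmeanr(track$speed, 30, fill = NA)
    track <- track[complete.cases(track[, c("speed", "bearing",
                                            "speed_w5", "speed_w30")]), ]

    obs   <- track[!is.na(track$behaviour), ]   # records with observed behaviour
    unobs <- track[ is.na(track$behaviour), ]   # records to extrapolate to

    fit <- randomForest(factor(behaviour) ~ speed + bearing + speed_w5 + speed_w30,
                        data = obs)
    unobs$behaviour_pred <- predict(fit, newdata = unobs)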

CALIBERrfimpute — by Anoop Shah, 2 years ago

Multiple Imputation Using MICE and Random Forest

Functions to impute missing values using random forests under fully conditional specification (multivariate imputation by chained equations). The methods are described in Shah and others (2014).
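
A minimal sketch of how these methods plug into a mice workflow: the imputation method names 'rfcont' and 'rfcat' come from this package; the example data and per-column method assignment are illustrative.

    # Sketch: random forest imputation within mice via CALIBERrfimpute.
    library(mice)
    library(CALIBERrfimpute)

    data(nhanes)                        # small example data set shipped with mice
    meth <- make.method(nhanes)
    meth[c("bmi", "chl")] <- "rfcont"   # continuous variables -> mice.impute.rfcont
    imp <- mice(nhanes, method = meth, m = 5, printFlag = FALSE)
    fit <- with(imp, lm(chl ~ bmi + age))
    pool(fit)                           # pooled estimates across imputations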

miceRanger — by Sam Wilson, 3 years ago

Multiple Imputation by Chained Equations with Random Forests

Multiple imputation has been shown by Van Buuren (2007) to be a flexible method for imputing missing values. Expanding on this, Stekhoven and Buhlmann showed random forests to be an accurate model for imputing missing values in datasets. Random forests have the added benefits of returning out-of-bag error and variable importance estimates, and they are simple to run in parallel.
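
A minimal sketch, assuming miceRanger() and completeData() as the main entry points and amputeData() as a helper for generating missingness (the helper name is an assumption):

    # Sketch: multiple imputation by chained random forests with miceRanger.
    library(miceRanger)

    data(iris)
    set.seed(1)
    amp <- amputeData(iris, perc = 0.1)        # introduce 10% missingness (assumed helper)
    imp <- miceRanger(amp, m = 5, verbose = FALSE)
    completed <- completeData(imp)             # list of m completed data sets
    head(completed[[1]])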

literanger — by Stephen Wade, 2 months ago

Random Forests for Multiple Imputation Based on 'ranger'

An updated implementation of the R package 'ranger' by Wright et al. (2017) for training and predicting from random forests, particularly suited to high-dimensional data and to embedding in 'Multiple Imputation by Chained Equations' (MICE) by van Buuren (2007). Ensembles of classification and regression trees are currently supported. Sparse data of class 'dgCMatrix' (R package 'Matrix') can be directly analyzed. Conventional bagged predictions are available alongside an efficient prediction for MICE via the algorithm proposed by Doove et al. (2014). Survival and probability forests are not supported in the update, nor is data of class 'gwaa.data' (R package 'GenABEL'); use the original 'ranger' package for these analyses.
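
A minimal sketch of the assumed literanger interface (a train() function with a response_name argument and a predict() method with a prediction_type argument); names are assumptions and may differ from the released version.

    # Sketch: fit a forest and produce conventional bagged predictions.
    library(literanger)

    data(iris)
    set.seed(1)
    train_idx <- sample(nrow(iris), 100)
    fit  <- train(data = iris[train_idx, ], response_name = "Species")
    pred <- predict(fit, newdata = iris[-train_idx, ], prediction_type = "bagged")
    str(pred)   # inspect the returned prediction object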

abcrf — by Jean-Michel Marin, 2 years ago

Approximate Bayesian Computation via Random Forests

Performs Approximate Bayesian Computation (ABC) model choice and parameter inference via random forests. Pudlo P., Marin J.-M., Estoup A., Cornuet J.-M., Gautier M. and Robert C. P. (2016). Estoup A., Raynal L., Verdu P. and Marin J.-M. <http://journal-sfds.fr/article/view/709>. Raynal L., Marin J.-M., Pudlo P., Ribatet M., Robert C. P. and Estoup A. (2019).
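
A minimal sketch of ABC model choice with abcrf(), using a simulated toy reference table (two candidate models, three summary statistics); regAbcrf() is the analogous function for parameter inference. The toy data are purely illustrative.

    # Sketch: random-forest ABC model choice on a simulated reference table.
    library(abcrf)

    set.seed(1)
    n <- 1000
    modindex <- factor(sample(1:2, n, replace = TRUE))          # simulated model index
    sumstat  <- matrix(rnorm(3 * n, mean = as.integer(modindex)), ncol = 3)
    colnames(sumstat) <- paste0("s", 1:3)
    reftable <- data.frame(modindex = modindex, sumstat)

    mc  <- abcrf(modindex ~ ., data = reftable, ntree = 100)
    obs <- data.frame(s1 = 1.2, s2 = 0.8, s3 = 1.1)             # "observed" statistics
    predict(mc, obs, training = reftable)                       # posterior model choice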

steprf — by Jin Li, 2 years ago

Stepwise Predictive Variable Selection for Random Forest

An introduction to several novel predictive variable selection methods for random forest. They are based on various variable importance methods (i.e., averaged variable importance (AVI) and knowledge-informed AVI (KIAVI and KIAVI2)) combined with predictive accuracy in stepwise algorithms. For details of the variable selection methods, please see: Li, J., Siwabessy, J., Huang, Z. and Nichol, S. (2019). Li, J., Alvarez, B., Siwabessy, J., Tran, M., Huang, Z., Przeslawski, R., Radke, L., Howard, F., Nichol, S. (2017).
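
A minimal sketch of the assumed interface: a steprf() function taking predictor and response data (trainx/trainy) and a method argument choosing between AVI, KIAVI and KIAVI2. All argument names here are assumptions; consult the package manual.

    # Sketch: stepwise variable selection for a random forest (assumed interface).
    library(steprf)

    data(iris)
    set.seed(1)
    sel <- steprf(trainx = iris[, 2:4], trainy = iris[, 1],
                  method = "KIAVI", cv.fold = 10)
    sel   # selected predictors and predictive accuracy at each step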

moreparty — by Nicolas Robette, a year ago

A Toolbox for Conditional Inference Trees and Random Forests

Additions to the 'party' and 'partykit' packages: tools for the interpretation of forests (surrogate trees, prototypes, etc.), feature selection (see Gregorutti et al. (2017), Hapfelmeier and Ulm (2013), Altmann et al. (2010)) and parallelized versions of the conditional forest and variable importance functions. Also modules and a shiny app for conditional inference trees.
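
A minimal sketch, assuming fastcforest() and fastvarImp() as the parallelised counterparts of party::cforest() and party::varimp(); these function names are assumptions based on the description above.

    # Sketch: parallelised conditional forest and variable importance (assumed names).
    library(party)
    library(moreparty)

    data(iris)
    set.seed(1)
    fit <- fastcforest(Species ~ ., data = iris)   # assumed parallel cforest wrapper
    imp <- fastvarImp(fit)                         # assumed parallel varimp wrapper
    sort(imp, decreasing = TRUE)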

MulvariateRandomForestVarImp — by Dogonadze Nika, 3 years ago

Variable Importance Measures for Multivariate Random Forests

Calculates two sets of post-hoc variable importance measures for multivariate random forests. The first set of variable importance measures is given by the sum of mean split improvements for splits defined by feature j, measured on user-defined examples (i.e., training or testing samples). The second set of importance measures is calculated on a per-outcome-variable basis as the sum of mean absolute differences of node values for each split defined by feature j, measured on user-defined examples (i.e., training or testing samples). The user can optionally threshold both sets of importance measures to include only splits that are statistically significant as measured by an F-test.

forestError — by Benjamin Lu, 3 years ago

A Unified Framework for Random Forest Prediction Error Estimation

Estimates the conditional error distributions of random forest predictions and common parameters of those distributions, including conditional misclassification rates, conditional mean squared prediction errors, conditional biases, and conditional quantiles, by out-of-bag weighting of out-of-bag prediction errors as proposed by Lu and Hardin (2021). This package is compatible with several existing packages that implement random forests in R.
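
A minimal sketch, assuming quantForestError() as the main function with X.train/X.test/Y.train arguments and a randomForest fit trained with keep.inbag = TRUE; argument names are taken from the package documentation but should be checked against the installed version.

    # Sketch: conditional prediction error estimation with forestError.
    library(randomForest)
    library(forestError)

    data(airquality)
    aq  <- na.omit(airquality)
    set.seed(1)
    idx <- sample(nrow(aq), 80)
    train <- aq[idx, ]
    test  <- aq[-idx, ]

    rf  <- randomForest(Ozone ~ ., data = train, keep.inbag = TRUE)
    err <- quantForestError(rf,
                            X.train = train[, -1], X.test = test[, -1],
                            Y.train = train$Ozone, alpha = 0.05)
    str(err, max.level = 1)   # per-observation MSPE, bias and interval estimates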