Prediction Intervals for Random Forests
Implements multiple state-of-the-art prediction interval methodologies for random forests.
These include: quantile regression intervals, out-of-bag intervals, bag-of-observations intervals,
one-step boosted random forest intervals, bias-corrected intervals, high-density intervals, and
split-conformal intervals. The implementations include a combination of novel adjustments to the
original random forest methodology and novel prediction interval methodologies. All of these
methods are available from this single package rather than from a collection of separate
packages. Currently, only regression trees are supported. The package is also capable of handling high-dimensional data.
Roy, Marie-Helene and Larocque, Denis (2019)
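As a hedged illustration of one of the listed interval types (quantile regression intervals), the sketch below uses the 'ranger' package directly; this package's own interface is not named in the description above and is not reproduced here.
    # Sketch of quantile-regression prediction intervals via 'ranger'
    # (one of the interval types listed above, not this package's API).
    library(ranger)
    set.seed(1)
    train <- mtcars[1:25, ]
    test  <- mtcars[26:32, ]
    fit <- ranger(mpg ~ ., data = train, quantreg = TRUE, num.trees = 500)
    # 90% prediction intervals from the 5th and 95th conditional quantiles
    pi90 <- predict(fit, data = test, type = "quantiles",
                    quantiles = c(0.05, 0.95))$predictions
    cbind(lower = pi90[, 1], upper = pi90[, 2], observed = test$mpg)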
Handwriting Analysis with Random Forests
Perform forensic handwriting analysis of two scanned handwritten documents. This package implements the statistical method described by Madeline Johnson and Danica Ommen (2021)
Random Forest Cluster Analysis
Tools to perform random forest consensus clustering of different data types. The package is designed to accept a list of matrices from different assays, typically from high-throughput molecular profiling, so that class discovery can be performed jointly across data types. For references, please see Tao Shi & Steve Horvath (2006)
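A minimal sketch of the underlying Shi & Horvath (2006) idea using only the 'randomForest' package; the consensus step across a list of assay matrices is what this package adds and is not reproduced here.
    # Unsupervised random forest clustering in the spirit of Shi & Horvath (2006):
    # grow a forest against a synthetic permuted copy of the data, turn the
    # proximity matrix into a dissimilarity, and cluster it.
    library(randomForest)
    set.seed(1)
    x  <- iris[, 1:4]
    rf <- randomForest(x, ntree = 1000, proximity = TRUE)   # no y: unsupervised mode
    d  <- as.dist(1 - rf$proximity)                         # RF dissimilarity
    cl <- cutree(hclust(d, method = "average"), k = 3)
    table(cl, iris$Species)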
Interactive Visualization Tool for Random Forests
An interactive data visualization and exploration toolkit that implements Breiman and Cutler's original Java-based random forest visualization tools in R, for supervised and unsupervised classification and regression within the random forest algorithm.
Random Forest with Multivariate Longitudinal Predictors
Based on the random forest principle, 'DynForest' can include
multiple longitudinal predictors to provide individual predictions.
Longitudinal predictors are modeled within the random forest. The
methodology is fully described for a survival outcome in:
Devaux, Helmer, Genuer & Proust-Lima (2023)
Weighted Subspace Random Forest for Classification
A parallel implementation of Weighted Subspace Random Forest. The
Weighted Subspace Random Forest algorithm was proposed in the
International Journal of Data Warehousing and Mining by Baoxun Xu,
Joshua Zhexue Huang, Graham Williams, Qiang Wang, and Yunming Ye
(2012)
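A hedged usage sketch: the main entry point is assumed to be a wsrf() function with a formula/data interface; any further argument names are assumptions and are therefore omitted here.
    # Hedged sketch, assuming a wsrf() function with a formula/data
    # interface; tuning arguments are not taken from the description
    # above and may differ.
    library(wsrf)
    set.seed(1)
    idx   <- sample(nrow(iris), 100)
    model <- wsrf(Species ~ ., data = iris[idx, ])
    model
    preds <- predict(model, newdata = iris[-idx, ])
    str(preds)   # inspect the prediction object rather than assume its layout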
Tune Random Forest of the 'ranger' Package
Tunes a random forest in one line. The package is mainly based on the packages 'ranger' and 'mlrMBO'.
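A hedged sketch of the advertised one-line tuning, assuming the 'mlr'-task workflow: build an mlr task, then pass it to tuneRanger(), which typically searches parameters such as mtry, min.node.size and sample.fraction via model-based optimisation with 'mlrMBO'.
    # Hedged sketch of the one-line tuning call.
    library(tuneRanger)
    library(mlr)
    task <- makeRegrTask(data = mtcars, target = "mpg")
    res  <- tuneRanger(task, num.trees = 500)   # the "one line"
    res                                         # prints the recommended parameter settings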
An Implementation of the Hedged Random Forest Algorithm
This algorithm is described in detail in the paper "Hedging Forecast Combinations With an Application to the Random Forest" by Beck et al. (2023)
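The sketch below is only a conceptual illustration of hedging a forecast combination, applied here to the per-tree forecasts of a 'ranger' forest with simple minimum-variance weights (which may be negative); it is not the package's implementation of the Beck et al. (2023) algorithm.
    # Conceptual illustration only, not this package's API: estimate
    # minimum-variance ("hedging") weights for the individual trees'
    # forecasts on a validation split, allowing negative weights.
    library(ranger)
    set.seed(1)
    idx   <- sample(nrow(mtcars), 22)
    train <- mtcars[idx, ]
    valid <- mtcars[-idx, ]
    fit <- ranger(mpg ~ ., data = train, num.trees = 50)
    # One row per validation case, one column per tree
    P <- predict(fit, data = valid, predict.all = TRUE)$predictions
    E <- P - valid$mpg                               # per-tree forecast errors
    S <- cov(E)
    S <- S + diag(0.1 * mean(diag(S)), ncol(S))      # ridge-regularised error covariance
    w <- solve(S, rep(1, ncol(S)))
    w <- w / sum(w)                                  # weights sum to one, may be negative
    hedged <- as.numeric(P %*% w)                    # weighted ("hedged") combination
    equal  <- rowMeans(P)                            # ordinary equally weighted forest
    c(mse_equal  = mean((equal  - valid$mpg)^2),
      mse_hedged = mean((hedged - valid$mpg)^2))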
Random Forest Two-Sample Tests
An implementation of Random Forest-based two-sample tests as introduced in Hediger, Michel & Naef (2022).
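A conceptual sketch of the classifier two-sample test idea using 'ranger' and a simple out-of-bag binomial comparison; the package's own interface and test statistics are not reproduced here.
    # Conceptual sketch (not this package's interface): if a forest
    # separates the two samples better than chance, the null of equal
    # distributions is rejected.
    library(ranger)
    set.seed(1)
    n   <- 200
    s1  <- data.frame(v1 = rnorm(n),      v2 = rnorm(n))   # sample 1
    s2  <- data.frame(v1 = rnorm(n, 0.5), v2 = rnorm(n))   # sample 2, mean-shifted
    dat <- rbind(s1, s2)
    dat$label <- factor(rep(c("a", "b"), each = n))
    fit <- ranger(label ~ ., data = dat, num.trees = 500)
    oob_acc <- 1 - fit$prediction.error                    # out-of-bag accuracy
    # Compare OOB accuracy with the 50% chance level
    binom.test(round(oob_acc * 2 * n), 2 * n, p = 0.5, alternative = "greater")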
Find the Outlier by Quantile Random Forests
Provides a method for finding outliers in custom data with the quantile random forests method introduced by Nicolai Meinshausen (2006) <https://dl.acm.org/doi/10.5555/1248547.1248582>. It directly calls the ranger() function of the 'ranger' package to perform model fitting and prediction, and also implements evaluation of the outlier-prediction results. Compared with random-forest-based outlier detection, this method has higher accuracy and stability on large datasets.
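Because the description points to the ranger() function directly, a minimal sketch of the quantile-forest outlier idea follows, assuming a simple flag-outside-the-interval rule; this package's own wrapper and evaluation functions are not shown here.
    # Minimal sketch: flag observations whose response falls outside its
    # own predicted 95% quantile interval (the package's wrapper and
    # evaluation helpers are not used here).
    library(ranger)
    set.seed(1)
    dat <- mtcars
    fit <- ranger(mpg ~ ., data = dat, quantreg = TRUE, num.trees = 500)
    q   <- predict(fit, data = dat, type = "quantiles",
                   quantiles = c(0.025, 0.975))$predictions
    outlier <- dat$mpg < q[, 1] | dat$mpg > q[, 2]
    rownames(dat)[outlier]                                 # flagged rows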