
orf — by Gabriel Okasa, 2 years ago

Ordered Random Forests

An implementation of the Ordered Forest estimator as developed in Lechner & Okasa (2019). The Ordered Forest flexibly estimates the conditional probabilities of models with ordered categorical outcomes (so-called ordered choice models). In addition to common machine learning algorithms, the 'orf' package provides functions for estimating marginal effects and conducting statistical inference about them, and thus provides output similar to that of standard econometric models for ordered choice. The core forest algorithm relies on the fast C++ forest implementation from the 'ranger' package (Wright & Ziegler, 2017).
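
A minimal sketch of the typical workflow, assuming the package's orf() estimator, its bundled odata example data, and the margins() helper for marginal effects; treat the data-set name and argument values as assumptions rather than verified signatures:

library(orf)

data(odata)                          # example ordered-choice data shipped with 'orf' (assumed name)
Y <- as.numeric(odata[, 1])          # ordered outcome coded 1, 2, 3, ...
X <- as.matrix(odata[, -1])          # covariates

fit <- orf(X, Y)                     # estimate the Ordered Forest
pred <- predict(fit, type = "probs") # conditional class probabilities
margins(fit)                         # marginal effects of the covariates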

grf — by Erik Sverdrup, 8 days ago

Generalized Random Forests

Forest-based statistical estimation and inference. GRF provides non-parametric methods for heterogeneous treatment effects estimation (optionally using right-censored outcomes, multiple treatment arms or outcomes, or instrumental variables), as well as least-squares regression, quantile regression, and survival regression, all with support for missing covariates.
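
A minimal sketch of heterogeneous treatment effect estimation with 'grf' on simulated data, using the package's causal_forest() and average_treatment_effect() interface:

library(grf)

n <- 2000; p <- 10
X <- matrix(rnorm(n * p), n, p)               # covariates
W <- rbinom(n, 1, 0.5)                        # binary treatment assignment
Y <- pmax(X[, 1], 0) * W + X[, 2] + rnorm(n)  # outcome with a heterogeneous effect

cf <- causal_forest(X, Y, W)                  # forest for conditional treatment effects
tau_hat <- predict(cf)$predictions            # estimated CATEs (out-of-bag on training data)
average_treatment_effect(cf)                  # doubly robust ATE estimate with standard error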

morf — by Riccardo Di Francesco, 2 years ago

Modified Ordered Random Forest

Nonparametric estimator of the ordered choice model using random forests. The estimator modifies a standard random forest splitting criterion to build a collection of forests, each estimating the conditional probability of a single class. The package also implements a nonparametric estimator of the covariates’ marginal effects.
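
A hedged sketch, assuming a morf() entry point that mirrors the ordered-forest interface above and a marginal_effects() helper; both names are assumptions based on the package description rather than verified signatures:

library(morf)

y <- sample(1:3, 500, replace = TRUE)   # ordered outcome with three classes
X <- matrix(rnorm(500 * 5), 500, 5)     # covariates

fit <- morf(y, X)                       # one forest per class probability (assumed call)
predict(fit)                            # conditional class probabilities
marginal_effects(fit)                   # covariates' marginal effects (assumed helper)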

LongituRF — by Louis Capitaine, 4 years ago

Random Forests for Longitudinal Data

Random forests are a statistical learning method widely used in many areas of scientific research, essentially for their ability to learn complex relationships between input and output variables and their capacity to handle high-dimensional data. However, current random forest approaches are not flexible enough to handle longitudinal data. This package proposes a general random forest approach for high-dimensional longitudinal data. It includes a flexible stochastic model which allows the covariance structure to vary over time. Furthermore, it introduces a new method which takes intra-individual covariance into consideration to build random forests. The method is fully detailed in Capitaine et al. (2020), Random forests for high-dimensional longitudinal data.
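
A hedged sketch, assuming the MERF() mixed-effects random forest and the DataLongGenerator() simulator exposed by the package; the argument names (X, Y, Z, id, time, sto) are assumptions based on the package documentation:

library(LongituRF)

sim <- DataLongGenerator(n = 30)             # simulated longitudinal data (assumed helper)
fit <- MERF(X = sim$X, Y = sim$Y, Z = sim$Z,
            id = sim$id, time = sim$time,
            ntree = 200, sto = "BM")         # "BM": Brownian-motion stochastic term (assumed)
fit$forest                                   # underlying random forest component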

ggRandomForests — by John Ehrlinger, 2 years ago

Visually Exploring Random Forests

Graphic elements for exploring random forests fit with the 'randomForest' or 'randomForestSRC' packages for survival, regression, and classification forests, with plotting via the 'ggplot2' package.
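
A small sketch, assuming a 'randomForestSRC' regression fit explored with the package's gg_error() and gg_vimp() helpers (both return objects with 'ggplot2'-based plot methods):

library(randomForestSRC)
library(ggRandomForests)

rf <- rfsrc(Ozone ~ ., data = airquality,
            na.action = "na.impute", importance = TRUE)  # regression forest
plot(gg_error(rf))   # out-of-bag error as trees are added
plot(gg_vimp(rf))    # variable importance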

pRF — by Ankur Chakravarthy, 9 years ago

Permutation Significance for Random Forests

Estimate False Discovery Rates (FDRs) for importance metrics from random forest runs.

CovRegRF — by Cansu Alakus, 4 months ago

Covariance Regression with Random Forests

Covariance Regression with Random Forests (CovRegRF) is a random forest method for estimating the covariance matrix of a multivariate response given a set of covariates. Random forest trees are built with a new splitting rule which is designed to maximize the distance between the sample covariance matrix estimates of the child nodes. The method is described in Alakus et al. (2023). 'CovRegRF' uses the 'randomForestSRC' package (Ishwaran and Kogalur, 2022) <https://cran.r-project.org/package=randomForestSRC>, frozen at version 3.1.0. The custom splitting rule feature is used to apply the proposed splitting rule. The 'randomForestSRC' package implements 'OpenMP' by default, contingent upon the support provided by the target architecture and operating system. In this package, the 'LAPACK' and 'BLAS' libraries are used for matrix decompositions.
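
A hedged sketch, assuming the package's covregrf() entry point accepts a multivariate formula with the responses on the left-hand side; the output slot name is an assumption:

library(CovRegRF)

n <- 200
dat <- data.frame(y1 = rnorm(n), y2 = rnorm(n),
                  x1 = rnorm(n), x2 = rnorm(n))

fit <- covregrf(cbind(y1, y2) ~ x1 + x2, data = dat)  # covariance regression forest
fit$predicted.oob[[1]]   # OOB covariance matrix estimate for the first observation (assumed slot)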

rfinterval — by Haozhe Zhang, 5 years ago

Predictive Inference for Random Forests

An integrated package for constructing random forest prediction intervals using the fast implementation in the 'ranger' package. This package can apply the three methods described in Haozhe Zhang, Joshua Zimmerman, Dan Nettleton, and Daniel J. Nordman (2019): the out-of-bag prediction interval, the split conformal method, and the quantile regression forest.
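
A hedged sketch, assuming the package's rfinterval() function takes a method argument selecting among the three interval constructions; the method strings and output slot name are assumptions:

library(rfinterval)

train <- data.frame(y = rnorm(500), x1 = rnorm(500), x2 = rnorm(500))
test  <- data.frame(y = rnorm(100), x1 = rnorm(100), x2 = rnorm(100))

out <- rfinterval(y ~ x1 + x2,
                  train_data = train, test_data = test,
                  method = c("oob", "split-conformal", "quantreg"),
                  alpha = 0.1)                 # 90% prediction intervals
head(out$oob_interval)                         # out-of-bag intervals (assumed slot name)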

RandomForestsGLS — by Arkajyoti Saha, 2 months ago

Random Forests for Dependent Data

Fits non-linear regression models on dependent data with Generalised Least Squares (GLS) based Random Forest (RF-GLS), detailed in Saha, Basu and Datta (2021).
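
A hedged sketch for spatially dependent data, assuming the RFGLS_estimate_spatial() and RFGLS_predict() functions; argument order and names are assumptions:

library(RandomForestsGLS)

n <- 200
coords <- cbind(runif(n), runif(n))       # spatial locations
X <- matrix(runif(n * 2), n, 2)           # covariates
y <- 5 * sin(pi * X[, 1]) + rnorm(n)      # response (spatial dependence would enter via coords in practice)

est <- RFGLS_estimate_spatial(coords, y, X, ntree = 50)  # RF-GLS fit (assumed call)
pred <- RFGLS_predict(est, X)                            # predictions at covariate values (assumed call)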

piRF — by Chancellor Johnstone, 5 years ago

Prediction Intervals for Random Forests

Implements multiple state-of-the-art prediction interval methodologies for random forests. These include: quantile regression intervals, out-of-bag intervals, bag-of-observations intervals, one-step boosted random forest intervals, bias-corrected intervals, high-density intervals, and split-conformal intervals. The implementations include a combination of novel adjustments to the original random forest methodology and novel prediction interval methodologies. All of these methodologies can be used from this single package rather than a collection of separate packages. Currently, only regression trees are supported. The package is also capable of handling high-dimensional data. Roy, Marie-Helene and Larocque, Denis (2019). Ghosal, Indrayudh and Hooker, Giles (2018). Zhu, Lin and Lu, Jiaxin and Chen, Yihong (2019). Zhang, Haozhe and Zimmerman, Joshua and Nettleton, Dan and Nordman, Daniel J. (2019). Meinshausen, Nicolai (2006) <http://www.jmlr.org/papers/volume7/meinshausen06a/meinshausen06a.pdf>. Romano, Yaniv and Patterson, Evan and Candes, Emmanuel (2019). Tung, Nguyen Thanh and Huang, Joshua Zhexue and Nguyen, Thuy Thi and Khan, Imran (2014).
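
A hedged sketch, assuming the package's rfint() function builds the listed interval types from a single call with its default set of methods; the function name and argument names are assumptions, and the exact method strings are not reproduced here:

library(piRF)

train <- data.frame(y = rnorm(500), x1 = rnorm(500), x2 = rnorm(500))
test  <- data.frame(y = rnorm(100), x1 = rnorm(100), x2 = rnorm(100))

ints <- rfint(y ~ x1 + x2,
              train_data = train, test_data = test,
              alpha = 0.1)      # 90% prediction intervals with the default methods (assumed call)
str(ints)                       # one set of intervals per requested methodology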