Prediction Explanation with Dependence-Aware Shapley Values

Complex machine learning models are often hard to interpret. However, in many situations it is crucial to understand and explain why a model made a specific prediction. Shapley values constitute the only prediction explanation framework with a solid theoretical foundation. Previously known methods for estimating Shapley values, however, assume feature independence. This package implements the method described in Aas, Jullum and Løland (2019), which accounts for any feature dependence and thereby produces more accurate estimates of the true Shapley values.
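For illustration, a minimal usage sketch is shown below. It is adapted from the package's examples under the assumption that an xgboost model is explained on the Boston housing data from MASS (both listed under Suggests); treat it as an illustrative outline rather than the definitive workflow.

library(xgboost)
library(shapr)

# Boston housing data from the MASS package (illustrative choice)
data("Boston", package = "MASS")
x_var <- c("lstat", "rm", "dis", "indus")
y_var <- "medv"

x_train <- as.matrix(Boston[-(1:6), x_var])
y_train <- Boston[-(1:6), y_var]
x_test  <- as.matrix(Boston[1:6, x_var])

# Fit a simple xgboost model to the training data
model <- xgboost(data = x_train, label = y_train, nround = 20, verbose = FALSE)

# Prepare the explainer on the training data
explainer <- shapr(x_train, model)

# phi_0: the expected prediction when no features are known
p0 <- mean(y_train)

# Estimate dependence-aware Shapley values using the empirical (conditional) approach
explanation <- explain(x_test, approach = "empirical",
                       explainer = explainer, prediction_zero = p0)

print(explanation$dt)
plot(explanation)

Other values of the approach argument (for example "gaussian" or "copula") model the feature dependence differently; see the reference manual for details.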


News

Reference manual


install.packages("shapr")

Version 0.2.0 by Martin Jullum


https://norskregnesentral.github.io/shapr/, https://github.com/NorskRegnesentral/shapr


Report a bug at https://github.com/NorskRegnesentral/shapr/issues


Browse source code at https://github.com/cran/shapr


Authors: Nikolai Sellereite [aut], Martin Jullum [cre, aut], Annabelle Redelmeier [aut], Anders Løland [ctb], Jens Christian Wahl [ctb], Camilla Lingjærde [ctb], Norsk Regnesentral [cph, fnd]


Documentation: PDF Manual


License: MIT + file LICENSE


Imports stats, data.table, Rcpp, condMVNorm, mvnfast, Matrix

Suggests ranger, xgboost, mgcv, testthat, knitr, rmarkdown, roxygen2, MASS, ggplot2, caret, gbm, party, partykit

Linking to RcppArmadillo, Rcpp

