# Statistical Learning on Sparse Matrices

Implements many algorithms for statistical learning on
sparse matrices: matrix factorizations, matrix completion,
elastic net regressions, and factorization machines.
'rsparse' also enhances the 'Matrix' package by providing methods for
multithreaded matrix products and native slicing of
sparse matrices in Compressed Sparse Row (CSR) format.
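To make the CSR layout concrete, here is a minimal illustrative sketch in Python (the package itself is written in R/C++; this is not rsparse's implementation). A CSR matrix is three flat arrays, and slicing out a row is a cheap contiguous copy:

```python
# Compressed Sparse Row (CSR) layout, illustrated on the dense matrix
# [[1, 0, 2],
#  [0, 0, 3],
#  [4, 5, 0]]
data    = [1, 2, 3, 4, 5]   # nonzero values, row by row
indices = [0, 2, 2, 0, 1]   # column index of each value
indptr  = [0, 2, 3, 5]      # row i occupies data[indptr[i]:indptr[i+1]]

def row(i):
    """Return row i as a {column: value} dict -- an O(nnz(row)) slice."""
    start, end = indptr[i], indptr[i + 1]
    return dict(zip(indices[start:end], data[start:end]))

print(row(0))  # {0: 1, 2: 2}
print(row(2))  # {0: 4, 1: 5}
```

Because each row's nonzeros are stored contiguously, row slicing needs no scan over the whole matrix, which is why CSR suits row-oriented workloads such as iterating over users in a recommender dataset.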
List of algorithms for regression problems:
1) Elastic Net regression via Follow The Proximally-Regularized Leader (FTRL)
Stochastic Gradient Descent (SGD), as per McMahan et al. (, )
2) Factorization Machines via SGD, as per Rendle (2010, )
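As background for item 1), the following is a hedged Python sketch of the FTRL-Proximal update from the McMahan et al. paper applied to logistic regression on sparse inputs. Parameter names (alpha, beta, l1, l2) follow the paper's notation; this is a toy re-derivation, not rsparse's FTRL class or its API.

```python
import math

class FTRL:
    """FTRL-Proximal for logistic loss on sparse {feature: value} inputs."""

    def __init__(self, alpha=0.1, beta=1.0, l1=0.1, l2=1.0):
        self.alpha, self.beta, self.l1, self.l2 = alpha, beta, l1, l2
        self.z, self.n = {}, {}  # per-coordinate accumulators

    def _weight(self, i):
        z = self.z.get(i, 0.0)
        if abs(z) <= self.l1:
            return 0.0  # L1 penalty keeps small coordinates exactly zero
        sign = -1.0 if z < 0 else 1.0
        return -(z - sign * self.l1) / (
            (self.beta + math.sqrt(self.n.get(i, 0.0))) / self.alpha + self.l2)

    def predict(self, x):
        s = sum(v * self._weight(i) for i, v in x.items())
        return 1.0 / (1.0 + math.exp(-s))

    def update(self, x, y):
        p = self.predict(x)
        for i, v in x.items():
            g = (p - y) * v                      # gradient of log loss
            n = self.n.get(i, 0.0)
            sigma = (math.sqrt(n + g * g) - math.sqrt(n)) / self.alpha
            self.z[i] = self.z.get(i, 0.0) + g - sigma * self._weight(i)
            self.n[i] = n + g * g

# toy separable data: feature 0 implies label 1, feature 1 implies label 0
model = FTRL()
for _ in range(100):
    model.update({0: 1.0}, 1)
    model.update({1: 1.0}, 0)
print(model.predict({0: 1.0}), model.predict({1: 1.0}))
```

Because all state lives in per-coordinate dictionaries, memory scales with the number of features actually seen, which is what makes FTRL attractive for very wide, sparse design matrices.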
List of algorithms for matrix factorization and matrix completion:
1) Weighted Regularized Matrix Factorization (WRMF) via Alternating Least
Squares (ALS) - paper by Hu, Koren, Volinsky (2008, )
2) Maximum-Margin Matrix Factorization via ALS, paper by Rennie, Srebro
(2005, )
3) Fast Truncated Singular Value Decomposition (SVD), Soft-Thresholded SVD,
Soft-Impute matrix completion via ALS - paper by Hastie, Mazumder
et al. (2014, )
4) Linear-Flow matrix factorization, from 'Practical linear models for
large-scale one-class collaborative filtering' by Sedhain, Bui, Kawale et al.
(2016, ISBN:978-1-57735-770-4)
5) GlobalVectors (GloVe) matrix factorization via SGD, paper by Pennington,
Socher, Manning (2014, <https://www.aclweb.org/anthology/D14-1162>)
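To illustrate item 1) above, here is a small NumPy sketch of the alternating-least-squares sweep for WRMF on implicit feedback, following Hu, Koren, Volinsky: confidence c = 1 + alpha*r, binary preference p = 1[r > 0], and each factor row solved in closed form. All variable names and the toy data are illustrative assumptions, not rsparse's WRMF implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[5., 0., 3.],
              [0., 4., 0.],
              [1., 0., 2.]])           # implicit interaction counts
alpha, lam, rank = 10.0, 0.1, 2
P = (R > 0).astype(float)             # binary preferences
C = 1.0 + alpha * R                   # confidence weights

X = rng.normal(scale=0.1, size=(R.shape[0], rank))  # user factors
Y = rng.normal(scale=0.1, size=(R.shape[1], rank))  # item factors

def solve_side(F, C, P, lam):
    # Per row u, the normal equations: f_u = (F^T C_u F + lam*I)^-1 F^T C_u p_u
    out = np.empty((C.shape[0], F.shape[1]))
    for u in range(C.shape[0]):
        Cu = np.diag(C[u])
        A = F.T @ Cu @ F + lam * np.eye(F.shape[1])
        b = F.T @ Cu @ P[u]
        out[u] = np.linalg.solve(A, b)
    return out

for _ in range(10):                   # alternate user-solve / item-solve
    X = solve_side(Y, C, P, lam)
    Y = solve_side(X, C.T, P.T, lam)

loss = float(np.sum(C * (P - X @ Y.T) ** 2))
print(f"weighted reconstruction loss: {loss:.4f}")
```

Each ALS half-step is an independent ridge regression per user (or item), which is why the method parallelizes well across rows and why rsparse's multithreaded matrix products matter here.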
The package is reasonably fast and memory-efficient: it can handle large
datasets with millions of rows and millions of columns. This is particularly
useful for practitioners working on recommender systems.