Family of Lasso Regression

Provides implementations of a family of Lasso variants, including the Dantzig selector, LAD Lasso, SQRT Lasso, and Lq Lasso, for estimating high-dimensional sparse linear models. We adopt the alternating direction method of multipliers (ADMM) and convert the original optimization problem into a sequence of L1-penalized least squares subproblems, each of which can be solved efficiently by a linearization algorithm. A multi-stage screening approach is adopted for further acceleration. Beyond sparse linear model estimation, we also extend these Lasso variants to sparse Gaussian graphical model estimation, including TIGER and CLIME, using either an L1 or an adaptive penalty. Missing values are tolerated by the Dantzig selector and CLIME. Computation is memory-optimized through sparse matrix output.
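To make the ADMM splitting concrete, here is a minimal NumPy sketch for the plain L1 Lasso case. This is not the package's own code: the function names `lasso_admm` and `soft_threshold` are illustrative, and the update rules are the standard scaled-form ADMM steps (a ridge-type solve, a soft-thresholding step, and a dual update), not the linearization or screening refinements the package describes.

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: the proximal operator of k * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam, rho=1.0, n_iter=200):
    """Solve min_x 0.5 * ||A x - b||^2 + lam * ||x||_1 by ADMM.

    The problem is split as x = z: the x-update is a regularized
    least squares solve, the z-update is soft-thresholding, and u
    is the scaled dual variable.
    """
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)
    AtA = A.T @ A
    Atb = A.T @ b
    L = AtA + rho * np.eye(n)  # fixed across iterations
    for _ in range(n_iter):
        x = np.linalg.solve(L, Atb + rho * (z - u))  # ridge-type solve
        z = soft_threshold(x + u, lam / rho)         # L1 proximal step
        u = u + x - z                                # dual ascent
    return z  # z is exactly sparse after thresholding

# Small synthetic example: recover a 3-sparse coefficient vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
beta = np.zeros(20)
beta[:3] = [3.0, -2.0, 1.5]
b = A @ beta + 0.01 * rng.standard_normal(100)
est = lasso_admm(A, b, lam=1.0)
```

The same scheme underlies the other variants: only the loss in the x-update changes (e.g. an L1 loss for LAD Lasso), which is why each outer problem reduces to a sequence of L1-penalized least squares subproblems.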




Version 1.6.0 by Xingguo Li


Authors: Xingguo Li, Tuo Zhao, Lie Wang, Xiaoming Yuan, and Han Liu

Documentation: PDF Manual

GPL-2 license

Imports methods

Depends on lattice, MASS, Matrix, igraph

Imported by SparseTSCGM, sparsevar.

Depended on by DNetFinder, qut.

Suggested by CompareCausalNetworks, MFKnockoffs, hdme, knockoff, mlr.

Available on CRAN