# Stepwise Regression Analysis

Stepwise regression analysis selects variables to arrive at a best candidate final regression model using the forward selection, backward elimination, or bidirectional elimination approach. Best subset selection instead fits a separate least squares regression for each possible combination of the predictors. Both procedures in this package accept weighted data when building the best regression model in univariate and multivariate regression analysis (Alsubaihi, A. A. (2002)), and both support continuous variables nested within a class effect. Stepwise logistic regression with a binary dependent variable can also be performed (Agresti, A. (1984); Agresti, A. (2014)). A range of widely used selection criteria is available:

- Akaike information criterion (Darlington, R. B. (1968); Judge, G. G. (1985))
- corrected Akaike information criterion (Hurvich, C. M. and Tsai, C. (1989))
- Bayesian information criterion (Sawa, T. (1978); Judge, G. G. (1985))
- Mallows' Cp statistic (Mallows, C. L. (1973); Hocking, R. R. (1976))
- Hannan and Quinn information criterion (Hannan, E. J. and Quinn, B. G. (1979); Mcquarrie, A. D. R. and Tsai, C. L. (1998))
- corrected Hannan and Quinn information criterion (Mcquarrie, A. D. R. and Tsai, C. L. (1998))
- Schwarz criterion (Schwarz, G. (1978); Judge, G. G. (1985))
- adjusted R-square statistic (Darlington, R. B. (1968); Judge, G. G. (1985))
- significance levels (Mckeon, J. J. (1974); Harold Hotelling (1992); Pillai, K. C. S. (2006))

Multicollinearity can be detected by checking the tolerance value of each predictor.
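The core ideas above can be sketched in a few lines. The following is a minimal, generic Python illustration of forward selection driven by the Akaike information criterion, plus the tolerance check for multicollinearity; the helper names `aic`, `forward_select`, and `tolerance` are this sketch's own, not part of the package's API.

```python
import numpy as np

def aic(y, X):
    """Gaussian AIC for an OLS fit of y on X (X already includes the intercept)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + 2 * (k + 1)  # +1 counts the error variance

def forward_select(y, X, names):
    """Greedy forward selection: repeatedly add the predictor that lowers AIC most."""
    n = len(y)
    intercept = np.ones((n, 1))
    selected, remaining = [], list(range(X.shape[1]))
    best = aic(y, intercept)
    while remaining:
        score, j = min((aic(y, np.column_stack([intercept, X[:, selected + [jj]]])), jj)
                       for jj in remaining)
        if score >= best:   # no candidate improves the criterion -> stop
            break
        best = score
        remaining.remove(j)
        selected.append(j)
    return [names[j] for j in selected], best

def tolerance(X, j):
    """Tolerance of predictor j: 1 - R^2 from regressing X[:, j] on the others.
    Values near 0 signal multicollinearity."""
    others = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(X)), others])
    beta, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
    rss = float(np.sum((X[:, j] - Z @ beta) ** 2))
    tss = float(np.sum((X[:, j] - X[:, j].mean()) ** 2))
    return rss / tss

# Demo on synthetic data: only x1 and x3 truly drive y.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=200)
chosen, final_aic = forward_select(y, X, ["x1", "x2", "x3", "x4"])
```

Backward elimination is the mirror image (start from the full model and drop the predictor whose removal lowers the criterion most), and swapping `aic` for BIC or another criterion only changes the penalty term.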