Optimization algorithms implemented in R, including the conjugate gradient (CG), Broyden-Fletcher-Goldfarb-Shanno (BFGS) and limited-memory BFGS (L-BFGS) methods. Most internal parameters can be set through the call interface. The solvers hold up well for higher-dimensional problems.
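Base R's stats::optim implements the same family of methods (CG, BFGS, and L-BFGS in its box-constrained "L-BFGS-B" form), which makes a convenient point of comparison; a quick check on the classic Rosenbrock function:

```r
# Rosenbrock function and its analytic gradient.
rosen <- function(p) (1 - p[1])^2 + 100 * (p[2] - p[1]^2)^2
rosen_gr <- function(p) {
  c(-2 * (1 - p[1]) - 400 * p[1] * (p[2] - p[1]^2),
    200 * (p[2] - p[1]^2))
}

p0 <- c(-1.2, 1)
res_bfgs <- optim(p0, rosen, rosen_gr, method = "BFGS")
res_cg   <- optim(p0, rosen, rosen_gr, method = "CG",
                  control = list(maxit = 1000))
res_lb   <- optim(p0, rosen, rosen_gr, method = "L-BFGS-B")
```

The quasi-Newton methods converge to the minimum near (1, 1); plain CG is noticeably slower on this badly conditioned problem, which is part of what preconditioning (below) is meant to address.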
Truncated Newton (method = "TN"). Can be controlled using the preconditioner option described below.
SR1 quasi-Newton (method = "SR1"), falling back to the BFGS direction if a descent direction is not found.
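As a sketch of that mechanism, here is the standard SR1 inverse-Hessian update together with the descent-test fallback described above (illustrative code of my own, not the package's internals; the update-skipping tolerance is an assumption):

```r
# One SR1 inverse-Hessian update; skipped when the denominator is tiny,
# since the SR1 formula is undefined or unstable in that case.
sr1_update <- function(H, s, y) {
  v <- s - H %*% y
  denom <- drop(crossprod(v, y))
  if (abs(denom) < 1e-8 * sqrt(sum(v^2)) * sqrt(sum(y^2))) return(H)
  H + tcrossprod(v) / denom
}

# Standard BFGS inverse-Hessian update, kept alongside as the fallback.
bfgs_update <- function(H, s, y) {
  rho <- 1 / drop(crossprod(y, s))
  I <- diag(length(s))
  (I - rho * tcrossprod(s, y)) %*% H %*% (I - rho * tcrossprod(y, s)) +
    rho * tcrossprod(s)
}

# Use the SR1 direction unless it fails the descent test d'g < 0,
# in which case fall back to the BFGS direction.
search_direction <- function(H_sr1, H_bfgs, g) {
  d <- drop(-H_sr1 %*% g)
  if (drop(crossprod(d, g)) >= 0) d <- drop(-H_bfgs %*% g)
  d
}
```

On a quadratic, two SR1 updates along the coordinate axes recover the exact inverse Hessian, which is why SR1 can outperform BFGS when its direction is usable.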
preconditioner, which applies to the conjugate gradient and truncated Newton methods. The only value currently available is preconditioner = "L-BFGS", which uses L-BFGS to estimate the inverse Hessian for preconditioning. The number of updates to store for this preconditioner is controlled by the memory parameter, just as if you were using method = "L-BFGS".
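The L-BFGS inverse-Hessian estimate mentioned here is conventionally applied through the textbook two-loop recursion over the stored (s, y) update pairs; a self-contained sketch (my own code, not the package's internals):

```r
# Two-loop recursion: apply an L-BFGS inverse-Hessian estimate, built
# from the stored (s, y) pairs, to a vector v. Pairs are ordered oldest
# first, newest last; `memory` corresponds to length(S).
lbfgs_apply <- function(v, S, Y) {
  m <- length(S)
  alpha <- numeric(m)
  rho <- vapply(seq_len(m),
                function(i) 1 / drop(crossprod(Y[[i]], S[[i]])), 0)
  q <- v
  for (i in rev(seq_len(m))) {
    alpha[i] <- rho[i] * drop(crossprod(S[[i]], q))
    q <- q - alpha[i] * Y[[i]]
  }
  # Initial scaling from the most recent pair (the standard choice).
  gamma <- drop(crossprod(S[[m]], Y[[m]])) / drop(crossprod(Y[[m]], Y[[m]]))
  r <- gamma * q
  for (i in seq_len(m)) {
    beta <- rho[i] * drop(crossprod(Y[[i]], r))
    r <- r + (alpha[i] - beta) * S[[i]]
  }
  r
}
```

For a quadratic with Hessian diag(2, 4) and one exact pair per axis, the recursion reproduces the exact Newton step, which is the ideal behavior a preconditioner approximates.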
Instead of providing the Hessian through the fg list, supply a function hi that takes the par vector as input. The function can return a matrix (obviously not a great idea for memory use) or a vector; the latter is assumed to be the diagonal of the matrix.
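A sketch of what such a list might look like for a simple quadratic, with hi returning only the Hessian diagonal (the fn and gr element names are my assumption; check the package documentation for the exact structure):

```r
# Quadratic f(x) = x1^2 + 10 * x2^2, supplied as a list of functions.
fg <- list(
  fn = function(par) par[1]^2 + 10 * par[2]^2,
  gr = function(par) c(2 * par[1], 20 * par[2]),
  # hi returns a vector, interpreted as the diagonal of the Hessian;
  # returning the full matrix(c(2, 0, 0, 20), 2, 2) would also be allowed,
  # at the cost of O(n^2) memory.
  hi = function(par) c(2, 20)
)
```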
ls_max_alpha (for line_search = "More-Thuente" only): sets the maximum value of alpha that can be attained during the line search.
ls_max_alpha_mult (for Wolfe-type line search only): sets the maximum value of the ratio between the initial alpha guess for the current line search and the final alpha of the previous line search. Used to stop line searches diverging due to very large initial guesses.
ls_safe_cubic (for line_search = "More-Thuente" only): if TRUE, use the safe-guarded cubic modification suggested by Xie and Schlick.
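The ls_max_alpha_mult safeguard amounts to capping the first trial step of each new line search; a toy illustration of that logic (not the package's actual code):

```r
# Cap the initial alpha guess at ls_max_alpha_mult times the final
# alpha accepted by the previous line search, so one over-optimistic
# guess cannot make successive line searches diverge.
init_alpha <- function(guess, prev_alpha, ls_max_alpha_mult) {
  min(guess, ls_max_alpha_mult * prev_alpha)
}
```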
cg_update = "prfr", the "PR-FR" (Polak-Ribiere/Fletcher-Reeves) conjugate gradient update suggested by Gilbert and Nocedal.
Hestenes-Stiefel (cg_update = "hs"), Conjugate Descent (cg_update = "cd"), Dai-Yuan (cg_update = "dy") and Liu-Storey (cg_update = "ls").
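For reference, the standard update formulas behind these option names, written in terms of the previous gradient g_old, the new gradient g_new and the previous search direction d (a sketch of the textbook definitions; the package's internals may differ):

```r
# Beta coefficients for the various conjugate gradient updates.
cg_beta <- function(update, g_new, g_old, d) {
  y <- g_new - g_old
  fr <- drop(crossprod(g_new)) / drop(crossprod(g_old))     # Fletcher-Reeves
  pr <- drop(crossprod(g_new, y)) / drop(crossprod(g_old))  # Polak-Ribiere
  switch(update,
    prfr = max(-fr, min(pr, fr)),  # Gilbert-Nocedal: clamp PR to [-FR, FR]
    hs   = drop(crossprod(g_new, y)) / drop(crossprod(d, y)),      # Hestenes-Stiefel
    cd   = -drop(crossprod(g_new)) / drop(crossprod(d, g_old)),    # Conjugate Descent
    dy   = drop(crossprod(g_new)) / drop(crossprod(d, y)),         # Dai-Yuan
    ls   = -drop(crossprod(g_new, y)) / drop(crossprod(d, g_old))  # Liu-Storey
  )
}
```

The "prfr" clamp is what gives the hybrid its convergence guarantee: it keeps the often faster Polak-Ribiere value except when it strays outside the Fletcher-Reeves bound.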