Expanded Replacement and Extension of the 'optim' Function

Provides a replacement and extension of the optim() function that calls several function minimization codes in R in a single statement. These methods handle smooth, possibly box-constrained functions of several or many parameters. Note that function 'optimr()' was prepared to simplify the incorporation of minimization codes going forward. Also implements some utility codes and some extra solvers, including safeguarded Newton methods. Many methods previously separate are now included here.
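
For example, a minimal call might look like this (a sketch using the standard
Rosenbrock test function; the method names are illustrative of solvers the
package can dispatch to):

    library(optimx)
    fr <- function(x) {  # Rosenbrock banana function
      100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
    }
    ans <- optimx(par = c(-1.2, 1), fn = fr,
                  method = c("Nelder-Mead", "BFGS"))
    summary(ans)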


News

NEWS about R package optimx

TODO

  o Rvmmin gives warning when called with no gradient. Use control
    "usenumDeriv=TRUE" to overcome this. 130411

  o The bdmsk vector is set clumsily for Rvmmin.

VERSION 2013-10-27

  o Fix obscure bug when Inf specified for one bound only for Rvmmin.

VERSION 2013-10-21

  o Keep hessOK TRUE even if we must symmetrize Hessian.

  o Fix issue that gr=NULL did not use "grfwd" for Rvmmin, Rcgmin
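
    A small sketch of the case this fix covers: with gr=NULL the
    forward-difference approximation "grfwd" should now be substituted
    internally (fsq is a stand-in objective):

           fsq <- function(x) sum((x - 1)^2)
           ans <- optimx(c(0, 0), fn = fsq, gr = NULL, method = "Rcgmin")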

VERSION 2013-08-02

  o Change startup to avoid warning messages for missing methods.
    Add function to check which methods available.

  o UOBYQA is present in optimx.run.R, but is not included in the list of
    "all.methods" in optimx.setup.R. It should be possible to run UOBYQA
    by including "uobyqa" in the list of methods in a direct call to optimx.run.R.

VERSION 2013-07-09

  o patch for missing "details" behaviour

  o point to example of use of "follow.on" control 

VERSION 2013-07-03

  o added control usenumDeriv
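
    A sketch of how this control might be used (fsq is a stand-in objective;
    usenumDeriv=TRUE requests a numDeriv-based numerical gradient when no
    gradient function is supplied):

           fsq <- function(x) sum((x - 2)^2)
           ans <- optimx(c(1, 5), fn = fsq, method = "Rvmmin",
                         control = list(usenumDeriv = TRUE))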

VERSION 2013-06-04

  o improved coef() function to extract parameters

VERSION 2013-05-03

  o removed optansout.R, get.result.R, get.best.R (no longer needed)

VERSION 2013-04-30

  o added optimx<-.coef

VERSION 2013-04-29

  o removed help pages documenting obsolete features

VERSION 2013-04-06

  o nmkb and hjkb had no ... arguments in call. Fixed.

  o L-BFGS-B (and possibly other optim() methods) return no gevals (count[[2]])
    on failure. Forced to NA in case of try-error.

VERSION 2013-04-05

  o Fixed maximization of a function when no gradient function is supplied.
    Also fixed a glitch (not tested or encountered) with a user-supplied
    Hessian function when maximizing.

  o Separate .Rd files for coef() and summary()

  o Use have.bounds to select bounds and unconstrained methods where there
    is a choice, to avoid duplication of tests. This has been done for
    hjk, nmk, Rcgmin, Rvmmin.

  o Revision of output description in optimx.Rd; some simplifications.

  o Parameter names preserved from starting vector, or "p1", "p2", etc. used.
    There is a test for these in the ox demo().
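
    For example (a minimal sketch), names on the starting vector should
    carry through to the result:

           fsq <- function(x) sum((x - 3)^2)
           ans <- optimx(c(a = 1, b = 2), fn = fsq, method = "Nelder-Mead")
           coef(ans)   # parameter columns named "a" and "b"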

VERSION 2013-04-03

  o in summary.optimx, order = NULL now specifies no re-ordering

  o Fixup of examples -- simplified, corrected call to details

VERSION 2013-04-02

  o in summary.optimx the order= argument may now be an unevaluated 
    expression (as before), a string representation of that expression, or 
    a character vector of names, e.g. these all work:
       order = value # unevaluated expression
       order = "value" # character string containing column name
       order = list(fevals, gevals) # unevaluated list
       order = c("fevals", "gevals") # character vector of column names
       order = list(round(value,3), fevals) # unevaluated list
       order = "list(round(value,3),fevals)" # expression as string

VERSION 2013-03-29

  o Note that asking for a method that is NOT in the results will return
    a row with rowname (the method name) as "NA". Note that this will NOT
    test TRUE with is.na() on the rowname. Use instead a test like

           (row.names(ans8missmeth["Rvmmin",])[[1]] == "NA")

  o kkt1 and kkt2 in results now returned TRUE/FALSE as in 2012 versions

  o Fixed error in kkt2 test (using old name evnhatend rather than hev)

  o Results are no longer sorted by optimx, but can be using the 'summary()'
    function. This can be called in a variety of ways. ?? see examples in??

  o The 'details' are always kept in a matrix form, even when only one
    method has been used.

  o 'convcode' is used in place of 'convergence' in result structure

  o There is a directory inst/interactive-test that contains ox.R test
    script that pauses so user can see interim results. 

  o As a result of several issues, the result structure is changed from
    that of optimx 2012 versions. It is now a data frame with a `details'
    attribute. Also an 'npar' attribute to give the number of parameters,
    and a 'maximize' attribute that is TRUE when the function is to be
    maximized.

VERSION 2013-03-28

  o print.optimx dropped. Not needed as print defaults to print.data.frame.

  o added summary.optimx which has order= and par.select= arguments

  o order= is an expression or list of expressions using the column names 
    of the "optimx" object as variables (plus the variable rownames).

  o par.select = FALSE now has same meaning as par.select = 0 (rather than 
    being an error)

  o [.optimx now subsets details attribute too

  o method column from details attribute dropped and moved to row names

VERSION 2013-03-27

  o created unit test framework with 2 unit tests in 
      inst/unitTests/runit.all.R.  
    To run: demo("unitTests")

VERSION 2013-03-25

  o [.optimx and as.data.frame.optimx added

  o coef.optimx fixed to reflect new ans.ret

VERSION 2013-03-22

  o ans.ret structure changed

VERSION 2013-03-21

  o maximize works with print(x, best.only=TRUE)

  o KKT checks can be switched off. Decided to NOT separate the code from
    optimx.run  UNTESTED

  o optimx.setup, optimx.run, optimx.check are NOT exported in NAMESPACE,
    but a knowledgeable user could use these codes directly, either from
    the source code or by modifying the package locally.

VERSION 2013-03-18

  o removed method= and columns= arguments from print.optimx and added
    best.only

  o removed print.optimx help page and moved it to the optimx help page

  o the row names of the optimx output now show the method and the method
    column has been removed

VERSION 2013-03-18

  o Removed all user accessible functions except optimx and added
    print.optimx.

VERSION 2013-03-16

  TODOS: Check the maximize flag works in all routines, 
        especially get.best.

        Splitting of optimx so preparation and loading
        is separate from running.

        all.methods = TRUE ??

  o  get.result now emits a warning if method requested is not in
     optimx solution

  o  attempt 2013-3-17 to include summary() in methods to replace
     trimpars()

OPEN ISSUES: (any date order)

091220 -- ?? kkt1 tolerance. Do we want to let this be an input control?

090612 -- ?? (Similar to above) Better choices for the tolerances in tests of equality for gradient / kkt tests?

091018 -- ?? masks?

090929 -- ?? control structure could use simplification

090729 -- ?? simplify structure of answers -- avoid repetition and make easier to access

090531 -- ?? would like SNewton back in to keep Ramsay and Co. on side

090601 -- ?? Do we want hessevals to count Hessian evaluations?

IMPLEMENTED: (reverse date order)

110212 -- & to && and | to || for controls

110212 -- Hessian changed from NULL to FALSE default

100328 -- check if maximize works for Powell (minqa) routines -- 100415

100329 -- make sure spg fixed on CRAN -- 100415

100329 -- maximize tested for all but minqa, though spg needs fixup on CRAN.

100215 -- Add newuoa and uobyqa to test minqa

100212 -- ?? setting scaletol in controls ??

091018 -- new control "starttest" so we can skip them DONE 091220

091218 -- scaling checks (before computing rhobeg) and warning

091215 -- mcontrol omission in nlm() call fixed

091026 -- Remove SANN because no real conv indicator

091018 - 090531 Use function code jointly with funtest and funcheck in initial checks -- not fully possible

091018 -- decided not to put optimx.dev into package to provide for local source packages etc.

090923 -- add bobyqa (from minqa)

090923 -- add Rcgmin for large scale problems

090729 -- put hessian argument back as an alternate for kkt

090729 -- should have kkt only carried out for n<50 (no gradient), n<500 (with gradient and Jacobian of gradient to get Hessian)

090531 -- KKT stuff

090531 -- control for keeping failed runs (save.failures)

090531 Decided to omit 'trust' from package methods for the time being.

090527 -- added follow.on

090511 What should be default method(s)? 090601: to provide compatibility
with optim(), Nelder-Mead is put first. The choice of BFGS second is open
to discussion. JN

A wrapper function to integrate major optimization packages in R

For now, we only consider algorithms for unconstrained and box-constrained optimization

This function can implement "multiple" algorithms

Input:

par = a single vector of starting values

fn = objective function (assumed to be sufficiently differentiable)

gr = name of a function to compute the (analytic) gradient of the objective function

hess = name of a function to compute the (analytic) Hessian of the objective function
(Editorial note: Capitalize Hessian after the name of Otto Hesse. JN)

method = a character vector denoting all the algorithms to be executed (in the
specified order). Complete names are not needed; partial matching is attempted.

hessian = logical variable that, if present, is equivalent to control$kkt. If TRUE,
it causes optimx to try to compute an approximation to the Hessian matrix at the
final set of parameters.

control = list of control information, in particular:

trace = an integer controlling output (note: different methods may use logicals).
trace = 0 gives no output; positive numbers give more output for greater values.

follow.on = TRUE or FALSE. If TRUE, and there are multiple methods, then the last
set of parameters from one method is used as the starting set for the next.

save.failures = TRUE if we wish to keep "answers" from runs where the method does
not return conv == 0; FALSE otherwise (default).

maximize = TRUE if we want to maximize rather than minimize a function (default
FALSE). 090601: Not yet implemented for nlm, nlminb, ucminf. However, there is a
check to avoid usage of these codes when maximize is TRUE.

all.methods = TRUE if we want to use all available (and suitable) methods; by
default FALSE, i.e., we do NOT run all methods.

sort.result = TRUE, that is, we sort the results in decreasing order of the final
function value.

kkt = TRUE to run Karush-Kuhn-Tucker (KKT) tests of results unless the problem is
large.

kkttol = 0.001 (was .Machine$double.eps^(1/4)): default tolerance used to check
for a small gradient and negative Hessian eigenvalues.

kkt2tol = 1E-6 (default was 10 * the default for kkttol): tolerance for the
eigenvalue ratio in the KKT test of a positive definite Hessian.

starttests = TRUE: by default we run tests of the function and parameters
(feasibility relative to bounds, analytic vs numerical gradient, scaling tests)
before we try optimization methods.

dowarn = TRUE: by default warnings generated by optimx are left enabled.

badval = (0.5)*.Machine$double.xmax: the value given to the function value when
try(fn()) fails.

scaletol = 3: to check parameters or their bounds we take logs of absolute values
and find the range, i.e., max - min. This should not exceed scaletol. A value of
3 gives magnitudes between 1 and 20 approximately.
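
Regarding the kkt, kkttol and kkt2tol controls above: as a rough illustration only
(this is not the package's internal code), the KKT tests amount to checking that
the gradient is near zero (kkt1) and that the Hessian is positive definite via an
eigenvalue-ratio test (kkt2). A sketch using the numDeriv package, which optimx
imports; the exact scalings here are assumptions:

    library(numDeriv)
    fsq   <- function(x) sum((x - 2)^2)
    xstar <- c(2, 2)                        # candidate solution
    g  <- grad(fsq, xstar)                  # numerical gradient
    H  <- hessian(fsq, xstar)               # numerical Hessian
    ev <- eigen(0.5 * (H + t(H)), symmetric = TRUE)$values
    kkt1 <- max(abs(g)) <= 0.001                # kkttol-style small-gradient test
    kkt2 <- ev[length(ev)] > 1e-6 * abs(ev[1])  # kkt2tol-style eigenvalue ratio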

Output:

ans = an object containing two sets of information:

essential output = a data frame of converged parameters, function value at
convergence, name of algorithm, convergence indicator, and function evaluation
and gradient evaluation counts.

detailed output = a list with one component for each algorithm, containing
parameters, function values, convergence code, number of function and gradient
evaluations, numerical gradient and Hessian at convergence, eigenvalues of that
Hessian approximation, CPU time, other information returned by the algorithm,
and the name of the algorithm. The detailed output can be accessed via the
attribute called `details'.
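
A sketch of a call that sets several of these controls and then extracts the
detailed output (fr is the Rosenbrock function from the first example above):

    ans <- optimx(c(-1.2, 1), fn = fr,
                  method = c("Nelder-Mead", "BFGS"),
                  control = list(trace = 0, kkt = TRUE, save.failures = TRUE))
    ans                    # essential output: one data-frame row per method
    attr(ans, "details")   # detailed output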

Authors: Ravi Varadhan & John Nash

Date: February 17, 2008

Changes: Ravi Varadhan - Date: May 29, 2009, John Nash - Latest: Oct 18, 2009

Reference manual


install.packages("optimx")

Version 2018-7.10 by John C Nash


Browse source code at https://github.com/cran/optimx


Authors: John C Nash [aut, cre], Ravi Varadhan [aut], Gabor Grothendieck [ctb]




Task views: Optimization and Mathematical Programming


GPL-2 license


Imports numDeriv

Suggests knitr, rmarkdown, setRNG, BB, ucminf, minqa, dfoptim, lbfgsb3, lbfgs, subplex


Imported by CJAMP, CatDyn, CensSpatial, Countr, LifeHist, PartCensReg, QuantumClone, REndo, ROI.plugin.optimx, StempCens, Synth, calibrar, ecd, gear, invGauss, ldhmm, marked, mrds, mvord, phenofit, rankdist, spfrontier.

Depended on by LIHNPSD, NormalGamma, embryogrowth, macc, midasr, monmlp, parfm, phenology, sgt, surrosurv.

Suggested by ACDm, RandomFields, afex, bbmle, dimRed, heemod, languageR, lava, lme4, nlmrt, regsem.

