Feed-Forward Neural Networks and Multinomial Log-Linear Models

Software for feed-forward neural networks with a single hidden layer, and for multinomial log-linear models.
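The two workhorse functions are nnet(), which fits a single-hidden-layer network, and multinom(), which fits multinomial log-linear models via the same underlying code. A minimal sketch follows; the use of the built-in iris data and the particular tuning values (size, decay, maxit) are illustrative assumptions, not package defaults.

library(nnet)

## Single-hidden-layer network: 'size' is the number of hidden units,
## 'decay' adds weight-decay regularisation, 'maxit' caps the optimiser.
set.seed(1)
net <- nnet(Species ~ ., data = iris, size = 2, decay = 1e-3, maxit = 200)
table(predicted = predict(net, iris, type = "class"), observed = iris$Species)

## Multinomial log-linear model for the same response.
fit <- multinom(Species ~ Sepal.Length + Sepal.Width, data = iris)
summary(fit)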


News

Software and datasets to support 'Modern Applied Statistics with S', fourth edition, by W. N. Venables and B. D. Ripley. Springer, 2002, ISBN 0-387-95457-0.

The list below documents software changes since the third edition.

  • no copying of datasets even in R.
  • model.frame method for multinom (even in R).
  • nnet now uses the C interface to optim.
  • nnet.Hess has been renamed nnetHess.
  • vcov.multinom now computes the Hessian analytically (thanks to David Firth).
  • predict methods for multinom, nnet now check newdata types.
  • model.frame.multinom now looks for the environment of the original formula.
  • multinom has a new 'model' argument defaulting to TRUE.
  • the multinom methods for add1, dropterm and anova now check for changes in the number of cases in use caused e.g. by na.action=na.omit.
  • added confint() method for multinom (illustrated in the example following this list).
  • added logLik() method for multinom.
  • summary() for multinom now defaults to correlation=FALSE.
  • nnet() reports on 'convergence'.
  • confint.multinom() works better with a non-default 'parm'.
  • multinom() and nnet(softmax=TRUE) give an explicit error message for one-category responses.
  • the logLik() method for multinom() returns an "nobs" attribute.
  • vcov() on a "multinom" object works for fits with na.action = "na.exclude".
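
Several of the multinom() methods listed above can be exercised directly on a fitted model. A minimal sketch, again assuming the built-in iris data (trace = FALSE merely silences the optimiser's iteration log):

library(nnet)

fit <- multinom(Species ~ Sepal.Length + Sepal.Width, data = iris, trace = FALSE)

logLik(fit)     ## log-likelihood, carries an "nobs" attribute
confint(fit)    ## Wald confidence intervals for the coefficients
vcov(fit)       ## covariance matrix from the analytic Hessian
predict(fit, newdata = head(iris), type = "probs")  ## newdata is type-checked before prediction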

Reference manual

The reference manual is available as a PDF download.

install.packages("nnet")

Version 7.3-12 by Brian Ripley


http://www.stats.ox.ac.uk/pub/MASS4/


Browse source code at https://github.com/cran/nnet


Authors: Brian Ripley [aut, cre, cph], William Venables [cph]


Documentation: PDF Manual


Task views: Econometrics, Machine Learning & Statistical Learning, Statistics for the Social Sciences


GPL-2 | GPL-3 license


Depends on stats, utils

Suggests MASS


Imported by BaBooN, BaM, BayesTree, Biocomb, CARRoT, CORElearn, CoImp, DAMisc, DChaos, EffectLiteR, EnsembleBase, Frames2, GMDH2, GPSCDF, Hmisc, IsingSampler, LCAvarsel, LUCIDus, MEclustnet, MNLR, MXM, MoEClust, Modeler, NeuralNetTools, NoiseFiltersR, OptimClassifier, Plasmode, RTransProb, RVAideMemoire, RecordLinkage, SIDES, SSDM, ShinyItemAnalysis, VIM, biomod2, blkbox, brglm2, car, chemmodlab, chemometrics, corHMM, cpt, effects, exprso, fRegression, factorplot, flexmix, forecast, galgo, gencve, glm.predict, glmdisc, gnm, hmi, hmm.discnp, hybridEnsemble, ipred, ipw, isni, jmv, kgschart, logisticRR, mDAG, mExplorer, mcca, mice, mlearning, networktools, ordinalForest, pubh, pvsR, radiant.model, rasclass, reinforcelearn, rminer, selac, semiArtificial, sigQC, simPop, sparsebnUtils, spectral.methods, spls, tsDyn, tsensembler.

Depended on by BART, BarcodingR, CBPS, GDAtools, HIest, HydeNet, ImpactIV, LOGICOIL, SQB, TBFmultinomial, abc, abn, bcROCsurface, dave, depmixS4, difNLR, elect, epiDisplay, gamlss.add, gamlss.mx, gfmR, introgress, partialOR, pocrm, roughrf, sodavis, synthpop.

Suggested by AER, AICcmodavg, ALEPlot, BiodiversityR, CLME, ChemometricsWithR, DynTxRegime, ExplainPrediction, GAparsimony, GSIF, HandTill2001, MASS, MachineShop, MatchIt, MuMIn, R2HTML, ROSE, Rcmdr, RcmdrPlugin.IPSUR, RcmdrPlugin.NMBU, RcmdrPlugin.pointG, SPreFuGED, SuperLearner, VRPM, aplore3, boostr, broom, buildmer, caret, caretEnsemble, catdata, causaldrf, discSurv, e1071, fscaret, generalhoslem, glmulti, hnp, huxtable, iBreakDown, insight, lda, mboost, mi, mlDNA, mlogit, mlr, mlrMBO, mlt, mlt.docreg, nnetpredint, ordinal, pdp, performanceEstimation, plot3logit, pmml, psychomix, rattle, relimp, seqHMM, shipunov, sparklyr, sperrorest, stablelearner, validann, vcdExtra, vip.

Enhanced by emmeans, margins, prediction, stargazer, texreg.


See at CRAN: https://cran.r-project.org/package=nnet