Conformal Time Series Forecasting Using State-of-the-Art Machine Learning Algorithms
Conformal time series forecasting using the caret infrastructure. It provides access to state-of-the-art machine learning models for forecasting applications. The hyperparameters of each model are selected based on time series cross-validation, and forecasting is done recursively.
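The package's own interface is not shown here; below is a minimal sketch of the workflow the description outlines, built with plain caret: lagged features, hyperparameters tuned with a "timeslice" time series cross-validation control, recursive one-step forecasting, and a rough split-conformal style interval from residual quantiles. The lag order, the learner ("glmnet"), and the window sizes are arbitrary choices for illustration.

## Build lagged features from a univariate series.
library(caret)

y <- as.numeric(AirPassengers)
p <- 12                                  # number of lags (arbitrary choice)
X <- embed(y, p + 1)                     # column 1 = y_t, columns 2..p+1 = lags 1..p
dat <- data.frame(y = X[, 1], X[, -1])

## Tune hyperparameters by time series cross-validation ("timeslice").
ctrl <- trainControl(method = "timeslice",
                     initialWindow = 60, horizon = 12, fixedWindow = TRUE)
fit <- train(y ~ ., data = dat, method = "glmnet", trControl = ctrl)

## Recursive forecasting: feed each prediction back in as the newest lag.
h <- 12
lags <- rev(tail(y, p))                  # most recent observation first
fc <- numeric(h)
for (i in seq_len(h)) {
  newx <- as.data.frame(t(lags))
  names(newx) <- names(dat)[-1]
  fc[i] <- predict(fit, newx)
  lags <- c(fc[i], lags[-p])
}

## Crude conformal-style band from absolute residual quantiles (in-sample,
## so only an approximation of a proper split-conformal interval).
res <- dat$y - predict(fit, dat)
q95 <- quantile(abs(res), 0.95)
cbind(lower = fc - q95, forecast = fc, upper = fc + q95)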
Robust Covariance Matrix Estimators
Object-oriented software for model-robust covariance matrix estimators. Starting out from the
basic robust Eicker-Huber-White sandwich estimator, the covariance methods include:
heteroscedasticity-consistent (HC)
covariances for cross-section data; heteroscedasticity- and autocorrelation-consistent (HAC)
covariances for time series data (such as Andrews' kernel HAC, Newey-West, and WEAVE estimators);
clustered covariances (one-way and multi-way); panel and panel-corrected covariances;
outer-product-of-gradients covariances; and (clustered) bootstrap covariances. All methods are
applicable to (generalized) linear model objects fitted by lm() and glm() but can also be adapted
to other classes through S3 methods. Details can be found in Zeileis et al. (2020).
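A brief illustration of the usage pattern described above: a fitted lm() object is passed to the HC, HAC, and Newey-West estimators, and coeftest() from the lmtest package is used for robust inference. The heteroscedastic toy data are made up for the example.

library(sandwich)
library(lmtest)

set.seed(1)
d <- data.frame(x = rnorm(200))
d$y <- 1 + 2 * d$x + rnorm(200, sd = 1 + abs(d$x))   # heteroscedastic errors

m <- lm(y ~ x, data = d)

vcovHC(m, type = "HC3")                       # heteroscedasticity-consistent (HC) covariance
vcovHAC(m)                                    # HAC covariance (Andrews-type kernel)
NeweyWest(m, lag = 4)                         # Newey-West HAC covariance
coeftest(m, vcov = vcovHC(m, type = "HC3"))   # robust tests on the coefficients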
Resampling Tools for Time Series Forecasting
A 'modeltime' extension providing forecast resampling tools that assess time-based model performance and stability for single time series, panel data, and cross-sectional time series analyses.
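The extension's own functions are not reproduced here; as a minimal sketch of the underlying idea, rolling-origin time splits can be built with rsample, pairing each training window with the period that follows it so accuracy can be tracked across forecast origins. The window sizes below are arbitrary.

library(rsample)

df <- data.frame(date  = seq(as.Date("2015-01-01"), by = "month", length.out = 96),
                 value = rnorm(96))

## 60-month training windows, each assessed on the following 12 months.
splits <- rolling_origin(df, initial = 60, assess = 12,
                         skip = 11, cumulative = FALSE)

## Date range of the first two training windows.
lapply(splits$splits[1:2], function(s) range(analysis(s)$date))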
Forecasting Time Series by Theta Models
Routines for forecasting univariate time series using Theta Models.
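For illustration only, the classical Theta method is also available as thetaf() in the forecast package; this is not necessarily the interface of the routines described above.

library(forecast)
fit <- thetaf(AirPassengers, h = 24)   # 24-step-ahead Theta forecast with prediction intervals
plot(fit)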
Smoothing Long-Memory Time Series
Estimates the nonparametric trend and its derivatives in equidistant time series with
long-memory errors. Estimation is carried out via local polynomial regression, using either a
bandwidth selected automatically by a built-in iterative plug-in algorithm or a bandwidth fixed
by the user. The smoothing methods of the package are described in Letmathe, S., Beran, J. and
Feng, Y. (2023).
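A generic illustration of local polynomial trend estimation with a plug-in bandwidth, using KernSmooth on simulated data; the package's own iterative plug-in algorithm for long-memory errors is not reproduced here, and dpill() assumes i.i.d. errors.

library(KernSmooth)

set.seed(42)
tt <- seq(0, 1, length.out = 500)
y  <- sin(2 * pi * tt) + 0.3 * as.numeric(arima.sim(list(ar = 0.7), n = 500))  # correlated noise

h  <- dpill(tt, y)                                # direct plug-in bandwidth (i.i.d. assumption)
tr <- locpoly(tt, y, bandwidth = h, degree = 1)   # local linear trend estimate

plot(tt, y, col = "grey")
lines(tr, lwd = 2)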
Multivariate Time Series Data Imputation
This is an EM-algorithm-based method for imputing missing values in multivariate normal time series. The imputation algorithm accounts for both spatial and temporal correlation structures. Temporal patterns can be modeled using an ARIMA(p,d,q) model, optionally with seasonal components, a non-parametric cubic spline, or generalized additive models with exogenous covariates. The algorithm is specifically tailored for climate data with missing measurements from several monitors across a given region.
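The package's interface is not shown here; the sketch below only illustrates the EM idea behind such imputation, alternating between estimating a multivariate normal mean and covariance and replacing missing entries by their conditional expectations. It omits the temporal (ARIMA/spline/GAM) component described above and the usual covariance correction term, so it is illustrative only.

set.seed(1)
X <- MASS::mvrnorm(300, mu = c(0, 1, 2),
                   Sigma = matrix(c(1, .6, .3, .6, 1, .5, .3, .5, 1), 3))
X[sample(length(X), 60)] <- NA                    # punch random holes

miss <- is.na(X)
Xc <- apply(X, 2, function(v) { v[is.na(v)] <- mean(v, na.rm = TRUE); v })  # mean start

for (iter in 1:50) {
  mu <- colMeans(Xc)
  S  <- cov(Xc)
  for (i in which(rowSums(miss) > 0 & rowSums(miss) < ncol(Xc))) {
    m <- miss[i, ]; o <- !m
    ## conditional mean of the missing coordinates given the observed ones
    Xc[i, m] <- mu[m] + S[m, o, drop = FALSE] %*%
      solve(S[o, o, drop = FALSE], Xc[i, o] - mu[o])
  }
}
head(Xc)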
Methods for Temporal Disaggregation and Interpolation of Time Series
Temporal disaggregation methods are used to disaggregate and
interpolate a low frequency time series to a higher frequency series, where
either the sum, the mean, the first or the last value of the resulting
high frequency series is consistent with the low frequency series. Temporal
disaggregation can be performed with or without one or more high frequency
indicator series. Contains the methods of Chow-Lin, Santos-Silva-Cardoso,
Fernandez, Litterman, Denton and Denton-Cholette, summarized in Sax and
Steiner (2013).
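A short example, assuming the td() interface and the swisspharma example data documented by the tempdisagg package: an annual series is disaggregated to quarters, with and without a quarterly indicator.

library(tempdisagg)
data(swisspharma)                       # example series shipped with the package

## Chow-Lin with a quarterly indicator series
m1 <- td(sales.a ~ imports.q, method = "chow-lin-maxlog")

## Denton-Cholette without an indicator, interpolating to quarters
m2 <- td(sales.a ~ 1, to = "quarterly", method = "denton-cholette")

predict(m1)                             # quarterly series consistent with the annual totals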
Deep Learning Model for Time Series Forecasting
RNNs are well suited to sequential data such as time series, speech, and text, but the vanishing gradient problem limits their ability to capture long-range dependencies. LSTM and GRU networks address this: they are RNN variants designed to learn both short-term and long-term dependencies, and their gated structure lets them retain information over long periods. An LSTM has a cell state and three gates (forget, input, and output), whereas a GRU has only two gates (reset and update). This package provides three functions for applying RNN, LSTM, and GRU models to any time series for forecasting. For method details, see Jaiswal, R. et al. (2022).
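A generic keras sketch of the LSTM architecture described above (swap layer_lstm() for layer_gru() to get the two-gate variant); this is not the package's own interface, and the window length and layer sizes are placeholders.

library(keras)

timesteps <- 12                          # length of each input window (placeholder)
model <- keras_model_sequential() %>%
  layer_lstm(units = 32, input_shape = c(timesteps, 1)) %>%
  layer_dense(units = 1)                 # one-step-ahead forecast

model %>% compile(loss = "mse", optimizer = "adam")

## x_train: array of shape (samples, timesteps, 1); y_train: the next value.
## model %>% fit(x_train, y_train, epochs = 50, batch_size = 16)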
Rmetrics - Autoregressive Conditional Heteroskedastic Modelling
Analyze and model heteroskedastic behavior in financial time series.
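A minimal GARCH(1,1) example on simulated returns, assuming this entry refers to the fGarch package; the specification parameters are arbitrary and shown only as a usage hint.

library(fGarch)

set.seed(123)
spec <- garchSpec(model = list(alpha = 0.1, beta = 0.8))   # GARCH(1,1) specification
x <- garchSim(spec, n = 1000)                              # simulated return series

fit <- garchFit(~ garch(1, 1), data = x, trace = FALSE)
summary(fit)
predict(fit, n.ahead = 10)               # conditional mean and standard deviation forecasts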
Measuring Information Flow Between Time Series with Shannon and Renyi Transfer Entropy
Measuring information flow between time series with Shannon and Rényi transfer entropy. See also Dimpfl and Peter (2013).
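A hedged sketch, assuming the transfer_entropy() interface of the package this entry describes (RTransferEntropy): Shannon and Rényi transfer entropy are estimated on simulated series in which x leads y by one step.

library(RTransferEntropy)

set.seed(1)
x <- rnorm(1000)
y <- 0.5 * c(0, head(x, -1)) + rnorm(1000, sd = 0.5)   # y depends on lagged x

transfer_entropy(x, y, lx = 1, ly = 1)                              # Shannon transfer entropy
transfer_entropy(x, y, lx = 1, ly = 1, entropy = "Renyi", q = 0.5)  # Rényi transfer entropy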