
Displaying 1 – 16 of 16


Periodic autoregression with exogenous variables and periodic variances

Jiří Anděl (1989)

Aplikace matematiky

The periodic autoregressive process with non-vanishing mean and with exogenous variables is investigated in the paper. The model is also assumed to have periodic variances. The statistical analysis is based on a Bayesian approach with a vague prior density. Estimators of the parameters and asymptotic tests of hypotheses are derived.
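The model class named in the abstract (a periodic autoregression with periodic means and variances) can be illustrated with a minimal simulation. Everything below — period length, coefficients, variances — is an illustrative assumption, not a value or procedure from the paper, and the exogenous-variable terms are omitted for brevity.

```python
import numpy as np

# Minimal sketch of a periodic AR(1) with periodic mean and variance:
#   X_t = mu_s + phi_s * (X_{t-1} - mu_{s'}) + sigma_s * eps_t,  s = t mod S.
# All parameter values are illustrative, not taken from the paper.
def simulate_par1(n, mu, phi, sigma, seed=0):
    S = len(phi)  # period length
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        s = t % S
        x[t] = (mu[s]
                + phi[s] * (x[t - 1] - mu[(t - 1) % S])
                + sigma[s] * rng.standard_normal())
    return x

# Two-season example: different level, persistence, and noise scale per season.
x = simulate_par1(500, mu=[1.0, -0.5], phi=[0.6, 0.3], sigma=[1.0, 2.0])
```

With |phi_s| < 1 in every season the simulated path stays stable, which is the regime the estimation theory in such models usually assumes.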

Periodic moving average process

Tomáš Cipra (1985)

Aplikace matematiky

Periodic moving average processes are representatives of the class of periodic models suitable for describing some seasonal time series and for constructing multivariate moving average models. Since attention has lately been concentrated mainly on periodic autoregressions, some methods of statistical analysis of periodic moving average processes are suggested in the paper. These methods include the estimation procedure (based on Durbin's construction of the parameter estimators...
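A periodic moving average of order one can be sketched in a few lines; the period and theta values below are illustrative assumptions, and this is only a simulation of the process class, not the estimation procedure the abstract refers to.

```python
import numpy as np

# Sketch of a periodic MA(1): X_t = eps_t + theta_s * eps_{t-1}, s = t mod S.
# The theta values are illustrative assumptions.
def simulate_pma1(n, theta, seed=0):
    S = len(theta)
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n + 1)  # one extra innovation for the lag
    return np.array([eps[t + 1] + theta[t % S] * eps[t] for t in range(n)])

# Period-3 example: the MA coefficient cycles through three seasonal values.
x = simulate_pma1(400, theta=[0.8, -0.4, 0.2])
```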

Plug-in estimators for higher-order transition densities in autoregression

Anton Schick, Wolfgang Wefelmeyer (2009)

ESAIM: Probability and Statistics

In this paper we obtain root-n consistency and functional central limit theorems in weighted L1-spaces for plug-in estimators of the two-step transition density in the classical stationary linear autoregressive model of order one, assuming essentially only that the innovation density has bounded variation. We also show that plugging in a properly weighted residual-based kernel estimator for the unknown innovation density improves on plugging in an unweighted residual-based kernel estimator....
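The plug-in idea described above can be sketched for the AR(1) model X_t = aX_{t-1} + e_t: estimate a, form a residual-based kernel estimator of the innovation density f, and plug it into the two-step transition density q₂(x, y) = ∫ f(z − ax) f(y − az) dz. The bandwidth, grid, and unweighted kernel estimator below are illustrative choices, not the weighted construction the paper analyzes.

```python
import numpy as np

def gauss_kernel(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

# Plug-in sketch for the two-step transition density of a stationary AR(1).
# Bandwidth and integration grid are illustrative assumptions.
def two_step_density(x_obs, x, y, bw=0.3):
    a = np.sum(x_obs[1:] * x_obs[:-1]) / np.sum(x_obs[:-1] ** 2)  # LS estimate
    resid = x_obs[1:] - a * x_obs[:-1]                            # residuals

    def fhat(u):  # unweighted residual-based kernel density estimator
        return gauss_kernel((u[:, None] - resid[None, :]) / bw).mean(axis=1) / bw

    z = np.linspace(resid.min() - 3.0, resid.max() + 3.0, 400)
    integrand = fhat(z - a * x) * fhat(y - a * z)
    return float((integrand * (z[1] - z[0])).sum())  # Riemann sum for the integral

rng = np.random.default_rng(0)
n = 300
xs = np.zeros(n)
for t in range(1, n):
    xs[t] = 0.5 * xs[t - 1] + rng.standard_normal()
q = two_step_density(xs, 0.0, 0.0)  # estimated density of X_{t+2} at 0 given X_t = 0
```

For a = 0.5 with standard normal innovations, the true two-step conditional law at X_t = 0 is N(0, 1 + a²), so the estimate should land near its density at 0.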

Poisson sampling for spectral estimation in periodically correlated processes

Vincent Monsan (1994)

Applicationes Mathematicae

We study estimation problems for periodically correlated, non-Gaussian processes. We estimate the correlation functions and the spectral densities from continuous-time samples. From a random-time sample, we construct three types of estimators for the spectral densities and prove their consistency.

Prediction of time series by statistical learning: general losses and fast rates

Pierre Alquier, Xiaoyin Li, Olivier Wintenberger (2013)

Dependence Modeling

We establish rates of convergence in statistical learning for time series forecasting. Using the PAC-Bayesian approach, slow rates of convergence √(d/n) for the Gibbs estimator under the absolute loss were given in a previous work [7], where n is the sample size and d the dimension of the set of predictors. Under the same weak dependence conditions, we extend this result to any convex Lipschitz loss function. We also identify a condition on the parameter space that ensures similar rates for the...

Prediction problems related to a first-order autoregressive process in the presence of outliers

Sugata Sen Roy, Sourav Chakraborty (2006)

Applicationes Mathematicae

Outliers in a time series often cause problems in fitting a suitable model to the data, so predictions based on such models are liable to be erroneous. In this paper we consider a stable first-order autoregressive process and suggest two methods of replacing an outlier with imputed values, then predicting on the basis of the corrected series. The asymptotic properties of both the process parameter estimators and the predictors are also studied.
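The general workflow — fit a stable AR(1), flag an outlier, impute it, then predict from the cleaned series — can be sketched as below. The flagging rule and conditional-mean imputation are one generic scheme chosen for illustration; they are not claimed to be either of the two methods proposed in the paper.

```python
import numpy as np

def fit_ar1(x):
    # Least-squares estimate of the AR(1) coefficient.
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

def impute_outliers(x, threshold=4.0):
    # Flag points whose standardized one-step residual is extreme and
    # replace them by the one-step conditional mean (illustrative scheme).
    phi = fit_ar1(x)
    resid = x[1:] - phi * x[:-1]
    z = resid / resid.std()
    y = x.copy()
    for t in np.where(np.abs(z) > threshold)[0] + 1:
        y[t] = phi * y[t - 1]  # conditional-mean imputation
    return y, phi

# Simulate a stable AR(1) and contaminate it with one additive outlier.
rng = np.random.default_rng(1)
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
x[150] += 12.0  # additive outlier
y, phi = impute_outliers(x)
```

A one-step-ahead prediction from the cleaned series is then simply `phi * y[-1]`, and the point of the paper's asymptotic analysis is how such predictors behave as the sample grows.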

Probabilistic properties of a Markov-switching periodic GARCH process

Billel Aliat, Fayçal Hamdi (2019)

Kybernetika

In this paper, we propose an extension of a periodic GARCH (PGARCH) model to a Markov-switching periodic GARCH (MS-PGARCH), and provide some probabilistic properties of this class of models. In particular, we address the question of strictly periodically...
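In an MS-PGARCH model the volatility recursion coefficients depend both on the season s = t mod S and on a hidden Markov regime. A minimal simulation sketch follows; the transition matrix and all GARCH coefficients are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of an MS-PGARCH(1,1): conditional variance
#   h_t = omega[k, s] + alpha[k, s] * X_{t-1}^2 + beta[k, s] * h_{t-1},
# where k is the current state of a hidden Markov chain and s = t mod S.
def simulate_ms_pgarch(n, omega, alpha, beta, P, seed=0):
    rng = np.random.default_rng(seed)
    K, S = omega.shape      # number of regimes x period length
    state = 0
    h = np.ones(n)          # conditional variances
    x = np.zeros(n)
    for t in range(1, n):
        state = rng.choice(K, p=P[state])  # Markov regime transition
        s = t % S
        h[t] = omega[state, s] + alpha[state, s] * x[t - 1] ** 2 + beta[state, s] * h[t - 1]
        x[t] = np.sqrt(h[t]) * rng.standard_normal()
    return x, h

# Two regimes (calm / turbulent) x two seasons; illustrative parameters.
omega = np.array([[0.1, 0.2], [0.5, 1.0]])
alpha = np.array([[0.05, 0.10], [0.15, 0.20]])
beta = np.array([[0.80, 0.70], [0.60, 0.50]])
P = np.array([[0.95, 0.05], [0.10, 0.90]])   # regime transition matrix
x, h = simulate_ms_pgarch(1000, omega, alpha, beta, P)
```

Since alpha + beta stays below one in every regime-season pair here, the simulated variance process remains well behaved; the paper's probabilistic analysis concerns precisely when such stability holds.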
