tuning_univariate_algorithms.Rmd
This short tutorial shows how to tune the parameters of univariate GARCH models. The function used for this type of model is garch_reg(),
and the parameters you can tune are the following:
- arch_order: an integer giving the order of the ARCH part of the variance model.
- garch_order: an integer giving the order of the GARCH part of the variance model.
- ar_order: an integer giving the order of the autoregressive part of the mean model.
- ma_order: an integer giving the order of the moving average part of the mean model.
First, we load the packages:
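A minimal set for the steps below, assuming these packages are installed:

library(garchmodels)  # garch_reg()
library(tidymodels)   # parsnip, recipes, workflows, tune, dplyr, tidyr, ...
library(timetk)       # future_frame(), time_series_cv(), plot_time_series_cv_plan()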
We will use the rIBM dataset, a tibble containing a date column and a column with the daily returns of the IBM company from 2007 to 2020. We will extend this dataset three dates into the forecast horizon and create our training and future datasets.
rIBM
#> # A tibble: 3,523 x 2
#>    date       daily_returns
#>    <date>             <dbl>
#>  1 2007-01-03      0.000926
#>  2 2007-01-04      0.0107
#>  3 2007-01-05     -0.00905
#>  4 2007-01-08      0.0152
#>  5 2007-01-09      0.0118
#>  6 2007-01-10     -0.0118
#>  7 2007-01-11     -0.00243
#>  8 2007-01-12      0.00699
#>  9 2007-01-16      0.0149
#> 10 2007-01-17     -0.00793
#> # ... with 3,513 more rows
rIBM_extended <- rIBM %>%
    future_frame(.length_out = 3, .bind_data = TRUE)
rIBM_train <- rIBM_extended %>% drop_na()
rIBM_future <- rIBM_extended %>% filter(is.na(daily_returns))
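As a quick optional check, rIBM_future should hold exactly the three extended dates, with returns still unknown:

nrow(rIBM_future)                      # 3: the extended forecast horizon
all(is.na(rIBM_future$daily_returns))  # TRUE: these are the dates to forecast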
Next, we define the model specification and mark the parameters we are going to tune. We also need to set the tune_by
argument to either "sigmaFor" (the sigma forecasts) or "seriesFor" (the series forecasts): the chosen forecasts are the ones used during the tuning process to calculate the metrics.
# Model Spec
model_spec <- garchmodels::garch_reg(mode        = "regression",
                                     arch_order  = tune(),
                                     garch_order = tune(),
                                     tune_by     = "sigmaFor") %>%
    set_engine("rugarch")
The next step is to create a recipe in which we specify the formula we will use:
# Recipe spec
recipe_spec <- recipe(daily_returns ~ date, data = rIBM_train)
We put everything together in a workflow:
# Workflow
wflw <- workflow() %>%
    add_recipe(recipe_spec) %>%
    add_model(model_spec)
Next, we need to generate the resamples. Since this is a time series problem, the order of the observations matters, so we will use timetk::time_series_cv()
and visualize the resulting plan:
resamples <- timetk::time_series_cv(rIBM_train,
                                    date_var    = date,
                                    initial     = "6 years",
                                    assess      = "24 months",
                                    skip        = "24 months",
                                    cumulative  = TRUE,
                                    slice_limit = 3)
timetk::plot_time_series_cv_plan(resamples, .date_var = date, .value = daily_returns)
Next, we use the tune_grid()
function to apply the workflow to the resamples in parallel.
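Note that allow_par = TRUE only parallelizes the computation if a backend has been registered beforehand; otherwise tuning runs sequentially. A minimal sketch using doParallel (assuming it is installed):

library(doParallel)
cl <- parallel::makePSOCKcluster(parallel::detectCores() - 1)
registerDoParallel(cl)
# ... run tune_grid() below, then free the workers with parallel::stopCluster(cl)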
tune_results <- tune_grid(
    object     = wflw,
    resamples  = resamples,
    param_info = parameters(wflw),
    grid       = 5,
    control    = control_grid(verbose = TRUE, allow_par = TRUE, parallel_over = "everything")
)
#>
#> ugarchfilter-->error: parameters names do not match specification
#> Expected Parameters are: mu ar1 ma1 omega
#>
#> ugarchfilter-->error: parameters names do not match specification
#> Expected Parameters are: mu ar1 ma1 omega alpha1 beta1
#>
#> ugarchfilter-->error: parameters names do not match specification
#> Expected Parameters are: mu ar1 ma1 omega
#>
#> ugarchfilter-->error: parameters names do not match specification
#> Expected Parameters are: mu ar1 ma1 omega
Finally, we can display the best results, ranked by RMSE:
tune_results %>% show_best(metric = "rmse")
#> # A tibble: 3 x 8
#>   arch_order garch_order .metric .estimator   mean     n  std_err .config
#>        <int>       <int> <chr>   <chr>       <dbl> <int>    <dbl> <chr>
#> 1          1           1 rmse    standard   0.0195     2 0.000152 Preprocessor1~
#> 2          3           2 rmse    standard   0.0212     3 0.00167  Preprocessor1~
#> 3          2           2 rmse    standard   0.0212     3 0.00167  Preprocessor1~
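From here, the usual tidymodels finishing steps apply. As a hedged sketch (the exact shape of the output depends on garchmodels' predict method for garch_reg fits), you could finalize the workflow with the best parameters, refit on the training data, and predict over the future dates:

best_params <- tune_results %>% select_best(metric = "rmse")

wflw_fit <- wflw %>%
    finalize_workflow(best_params) %>%
    fit(data = rIBM_train)

predict(wflw_fit, new_data = rIBM_future)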