R/parsnip-prophet_boost.R
boost_prophet.Rd
boost_prophet()
is a way to generate a specification of a Boosted PROPHET model
before fitting and allows the model to be created using
different packages. Currently the supported combinations are prophet
paired with either catboost or lightgbm.
boost_prophet(
  mode = "regression",
  growth = NULL,
  changepoint_num = NULL,
  changepoint_range = NULL,
  seasonality_yearly = NULL,
  seasonality_weekly = NULL,
  seasonality_daily = NULL,
  season = NULL,
  prior_scale_changepoints = NULL,
  prior_scale_seasonality = NULL,
  prior_scale_holidays = NULL,
  logistic_cap = NULL,
  logistic_floor = NULL,
  tree_depth = NULL,
  learn_rate = NULL,
  mtry = NULL,
  trees = NULL,
  min_n = NULL,
  sample_size = NULL,
  loss_reduction = NULL
)
mode | A single character string for the type of model. The only possible value for this model is "regression". |
---|---|
growth | String 'linear' or 'logistic' to specify a linear or logistic trend. |
changepoint_num | Number of potential changepoints to include for modeling trend. |
changepoint_range | Adjusts the flexibility of the trend component by limiting to a percentage of data before the end of the time series. 0.80 means that a changepoint cannot exist after the first 80% of the data. |
seasonality_yearly | One of "auto", TRUE or FALSE. Toggles on/off a seasonal component that models year-over-year seasonality. |
seasonality_weekly | One of "auto", TRUE or FALSE. Toggles on/off a seasonal component that models week-over-week seasonality. |
seasonality_daily | One of "auto", TRUE or FALSE. Toggles on/off a seasonal component that models day-over-day seasonality. |
season | 'additive' (default) or 'multiplicative'. |
prior_scale_changepoints | Parameter modulating the flexibility of the automatic changepoint selection. Large values will allow many changepoints, small values will allow few changepoints. |
prior_scale_seasonality | Parameter modulating the strength of the seasonality model. Larger values allow the model to fit larger seasonal fluctuations, smaller values dampen the seasonality. |
prior_scale_holidays | Parameter modulating the strength of the holiday components model, unless overridden in the holidays input. |
logistic_cap | When growth is logistic, the upper-bound for "saturation". |
logistic_floor | When growth is logistic, the lower-bound for "saturation". |
tree_depth | The maximum depth of the tree (i.e. number of splits). |
learn_rate | The rate at which the boosting algorithm adapts from iteration-to-iteration. |
mtry | The number of predictors that will be randomly sampled at each split when creating the tree models. |
trees | The number of trees contained in the ensemble. |
min_n | The minimum number of data points in a node that is required for the node to be split further. |
sample_size | The amount of data exposed to the fitting routine. |
loss_reduction | The reduction in the loss function required to split further. |
The data given to the function are not saved and are only used
to determine the mode of the model. For boost_prophet()
, the
mode will always be "regression".
The model can be created with the fit()
function using the
following engines:
"prophet_catboost" (default) - Connects to prophet::prophet()
and catboost::catboost.train()
"prophet_lightgbm" - Connects to prophet::prophet()
and lightgbm::lgb.train()
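As a minimal sketch (engine names from above; the tuning values are purely illustrative), an engine is selected with set_engine():

```r
library(boostime)
library(parsnip)

# Illustrative values -- any main argument can be set here
model_spec <- boost_prophet(
    changepoint_num = 25,
    trees           = 300
) %>%
    set_engine("prophet_lightgbm")  # or "prophet_catboost" (the default)
```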
Main Arguments
The main arguments (tuning parameters) for the PROPHET model are:
growth
: String 'linear' or 'logistic' to specify a linear or logistic trend.
changepoint_num
: Number of potential changepoints to include for modeling trend.
changepoint_range
: Adjusts how close to the end of the time series
the last changepoint can be located.
season
: 'additive' (default) or 'multiplicative'.
prior_scale_changepoints
: Parameter modulating the flexibility of the
automatic changepoint selection. Large values will allow many changepoints,
small values will allow few changepoints.
prior_scale_seasonality
: Parameter modulating the strength of the
seasonality model. Larger values allow the model to fit larger seasonal
fluctuations, smaller values dampen the seasonality.
prior_scale_holidays
: Parameter modulating the strength of the holiday components model,
unless overridden in the holidays input.
The main arguments (tuning parameters) for the Catboost/LightGBM model are:
tree_depth
: The maximum depth of the tree (i.e. number of splits).
learn_rate
: The rate at which the boosting algorithm adapts from iteration-to-iteration.
mtry
: The number of predictors that will be randomly sampled at each split when creating the tree models.
trees
: The number of trees contained in the ensemble.
min_n
: The minimum number of data points in a node that is required for the node to be split further.
sample_size
: The amount of data exposed to the fitting routine.
loss_reduction
: The reduction in the loss function required to split further.
These arguments are converted to their specific names at the time that the model is fit.
Other options and arguments can be
set using set_engine()
(See Engine Details below).
If parameters need to be modified, update()
can be used
in lieu of recreating the object from scratch.
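For example (the values are illustrative), a parameter can be revised with update() rather than rebuilding the specification:

```r
library(boostime)
library(parsnip)

model_spec <- boost_prophet(learn_rate = 0.1) %>%
    set_engine("prophet_catboost")

# Revise a single parameter in place
model_spec <- update(model_spec, learn_rate = 0.05)
```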
The standardized parameter names in boostime
can be mapped to their original
names in each engine:
Model 1: PROPHET:
boostime | prophet |
growth | growth ('linear') |
changepoint_num | n.changepoints (25) |
changepoint_range | changepoints.range (0.8) |
seasonality_yearly | yearly.seasonality ('auto') |
seasonality_weekly | weekly.seasonality ('auto') |
seasonality_daily | daily.seasonality ('auto') |
season | seasonality.mode ('additive') |
prior_scale_changepoints | changepoint.prior.scale (0.05) |
prior_scale_seasonality | seasonality.prior.scale (10) |
prior_scale_holidays | holidays.prior.scale (10) |
logistic_cap | df$cap (NULL) |
logistic_floor | df$floor (NULL) |
Model 2: Catboost / LightGBM:
boostime | catboost::catboost.train | lightgbm::lgb.train |
tree_depth | depth | max_depth |
learn_rate | learning_rate | learning_rate |
mtry | rsm | feature_fraction |
trees | iterations | num_iterations |
min_n | min_data_in_leaf | min_data_in_leaf |
loss_reduction | None | min_gain_to_split |
sample_size | subsample | bagging_fraction |
Other options can be set using set_engine()
.
prophet_catboost
Model 1: PROPHET (prophet::prophet
):
## function (df = NULL, growth = "linear", changepoints = NULL, n.changepoints = 25,
##     changepoint.range = 0.8, yearly.seasonality = "auto", weekly.seasonality = "auto",
##     daily.seasonality = "auto", holidays = NULL, seasonality.mode = "additive",
##     seasonality.prior.scale = 10, holidays.prior.scale = 10, changepoint.prior.scale = 0.05,
##     mcmc.samples = 0, interval.width = 0.8, uncertainty.samples = 1000,
##     fit = TRUE, ...)
Parameter Notes:
df
: This is supplied via the parsnip / boostime fit()
interface
(so don't provide this manually). See Fit Details (below).
holidays
: A data.frame of holidays can be supplied via set_engine()
uncertainty.samples
: The default is set to 0 because the prophet
uncertainty intervals are not used as part of the Modeltime Workflow.
You can override this setting if you plan to use prophet's uncertainty tools.
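A minimal sketch covering both notes above (the holiday table is a hypothetical example of the format prophet expects):

```r
library(boostime)
library(parsnip)

# Hypothetical holiday data frame
holidays_tbl <- data.frame(
    holiday = "new_years",
    ds      = as.Date(c("2020-01-01", "2021-01-01"))
)

model_spec <- boost_prophet() %>%
    set_engine("prophet_catboost",
               holidays            = holidays_tbl,
               uncertainty.samples = 1000)  # re-enable prophet's intervals
```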
Logistic Growth and Saturation Levels:
For growth = "logistic"
, simply add numeric values for logistic_cap
and / or
logistic_floor
. There is no need to add additional columns
for "cap" and "floor" to your data frame.
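A sketch with illustrative saturation levels (choose cap/floor values that bound your own series):

```r
library(boostime)
library(parsnip)

# Cap and floor values below are illustrative only
model_spec <- boost_prophet(
    growth         = "logistic",
    logistic_cap   = 11000,
    logistic_floor = 0
) %>%
    set_engine("prophet_catboost")
```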
Limitations:
prophet::add_seasonality()
is not currently implemented. It's used to
specify non-standard seasonalities using fourier series. An alternative is to use
step_fourier()
and supply custom seasonalities as Extra Regressors.
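A sketch of the step_fourier() alternative, assuming the timetk and recipes packages and a toy data frame with a date column:

```r
library(recipes)
library(timetk)

# Toy data frame (illustrative)
df <- data.frame(
    date  = seq(as.Date("2020-01-01"), by = "day", length.out = 100),
    value = rnorm(100)
)

# Add a half-yearly seasonality as fourier terms (extra regressors)
rec <- recipe(value ~ date, data = df) %>%
    step_fourier(date, period = 365.25 / 2, K = 2)
```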
Model 2: Catboost (catboost::catboost.train
):
## function (learn_pool, test_pool = NULL, params = list())
Parameter Notes:
Catboost uses a params = list() to capture model arguments. Parsnip / boostime
automatically sends any args provided as ...
inside of set_engine()
to
the params = list(...)
.
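For instance (the catboost options shown are assumptions about arguments you might tune, not requirements):

```r
library(boostime)
library(parsnip)

# Extra args in set_engine() are forwarded to catboost's params list
model_spec <- boost_prophet(trees = 500) %>%
    set_engine("prophet_catboost",
               l2_leaf_reg = 3,    # assumed catboost option
               random_seed = 123)
```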
prophet_lightgbm
Model 1: PROPHET (prophet::prophet
):
## function (df = NULL, growth = "linear", changepoints = NULL, n.changepoints = 25,
##     changepoint.range = 0.8, yearly.seasonality = "auto", weekly.seasonality = "auto",
##     daily.seasonality = "auto", holidays = NULL, seasonality.mode = "additive",
##     seasonality.prior.scale = 10, holidays.prior.scale = 10, changepoint.prior.scale = 0.05,
##     mcmc.samples = 0, interval.width = 0.8, uncertainty.samples = 1000,
##     fit = TRUE, ...)
Parameter Notes:
df
: This is supplied via the parsnip / boostime fit()
interface
(so don't provide this manually). See Fit Details (below).
holidays
: A data.frame of holidays can be supplied via set_engine()
uncertainty.samples
: The default is set to 0 because the prophet
uncertainty intervals are not used as part of the Modeltime Workflow.
You can override this setting if you plan to use prophet's uncertainty tools.
Logistic Growth and Saturation Levels:
For growth = "logistic"
, simply add numeric values for logistic_cap
and / or
logistic_floor
. There is no need to add additional columns
for "cap" and "floor" to your data frame.
Limitations:
prophet::add_seasonality()
is not currently implemented. It's used to
specify non-standard seasonalities using fourier series. An alternative is to use
step_fourier()
and supply custom seasonalities as Extra Regressors.
Model 2: LightGBM (lightgbm::lgb.train
):
## function (params = list(), data, nrounds = 10L, valids = list(), obj = NULL,
##     eval = NULL, verbose = 1L, record = TRUE, eval_freq = 1L, init_model = NULL,
##     colnames = NULL, categorical_feature = NULL, early_stopping_rounds = NULL,
##     callbacks = list(), reset_data = FALSE, ...)
Parameter Notes:
LightGBM uses a params = list() to capture model arguments. Parsnip / boostime
automatically sends any args provided as ...
inside of set_engine()
to
the params = list(...)
.
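For instance (the lightgbm options shown are assumptions about arguments you might tune, not requirements):

```r
library(boostime)
library(parsnip)

# Extra args in set_engine() are forwarded to lightgbm's params list
model_spec <- boost_prophet(trees = 500) %>%
    set_engine("prophet_lightgbm",
               lambda_l2  = 0.1,   # assumed lightgbm option
               num_leaves = 31)
```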
Date and Date-Time Variable
It's a requirement to have a date or date-time variable as a predictor.
The fit()
interface accepts date and date-time features and handles them internally.
Univariate (No Extra Regressors):
For univariate analysis, you must include a date or date-time feature. Simply use:
Formula Interface (recommended): fit(y ~ date)
will ignore xreg's.
Multivariate (Extra Regressors)
Extra Regressors parameter is populated using the fit()
or fit_xy()
function:
Only factor, ordered factor, and numeric data will be used as xregs.
Date and date-time variables are not used as xregs.
character data should be converted to factor.
Xreg Example: Suppose you have 3 features:
y
(target),
date
(time stamp),
month.lbl
(labeled month as an ordered factor).
The month.lbl
is an exogenous regressor that can be passed to boost_prophet()
using
fit()
:
fit(y ~ date + month.lbl)
will pass month.lbl
on as an exogenous regressor.
Note that date or date-time class values are excluded from xreg
.
library(dplyr)
library(lubridate)
library(parsnip)
library(rsample)
library(timetk)
library(boostime)

# Data
m750 <- m4_monthly %>% filter(id == "M750")

m750
#> # A tibble: 306 x 3
#>    id    date       value
#>    <fct> <date>     <dbl>
#>  1 M750  1990-01-01  6370
#>  2 M750  1990-02-01  6430
#>  3 M750  1990-03-01  6520
#>  4 M750  1990-04-01  6580
#>  5 M750  1990-05-01  6620
#>  6 M750  1990-06-01  6690
#>  7 M750  1990-07-01  6000
#>  8 M750  1990-08-01  5450
#>  9 M750  1990-09-01  6480
#> 10 M750  1990-10-01  6820
#> # ... with 296 more rows

# Split Data 80/20
splits <- initial_time_split(m750, prop = 0.8)

# ---- PROPHET ----

# Model Spec
model_spec <- boost_prophet(
    learn_rate = 0.1
) %>%
    set_engine("prophet_catboost")

# Fit Spec
model_fit <- model_spec %>%
    fit(log(value) ~ date + as.numeric(date) + month(date, label = TRUE),
        data = training(splits))
#> (catboost per-iteration training log truncated)
total: 1.33s remaining: 1.8s #> 424: learn: 0.0126290 total: 1.33s remaining: 1.79s #> 425: learn: 0.0126290 total: 1.33s remaining: 1.79s #> 426: learn: 0.0126290 total: 1.33s remaining: 1.78s #> 427: learn: 0.0126290 total: 1.33s remaining: 1.78s #> 428: learn: 0.0126290 total: 1.33s remaining: 1.77s #> 429: learn: 0.0126290 total: 1.33s remaining: 1.76s #> 430: learn: 0.0126290 total: 1.33s remaining: 1.76s #> 431: learn: 0.0126109 total: 1.33s remaining: 1.75s #> 432: learn: 0.0126109 total: 1.33s remaining: 1.75s #> 433: learn: 0.0125960 total: 1.34s remaining: 1.74s #> 434: learn: 0.0124661 total: 1.34s remaining: 1.74s #> 435: learn: 0.0123375 total: 1.35s remaining: 1.74s #> 436: learn: 0.0123375 total: 1.35s remaining: 1.74s #> 437: learn: 0.0123274 total: 1.35s remaining: 1.74s #> 438: learn: 0.0123274 total: 1.35s remaining: 1.73s #> 439: learn: 0.0123274 total: 1.35s remaining: 1.72s #> 440: learn: 0.0122848 total: 1.36s remaining: 1.72s #> 441: learn: 0.0122848 total: 1.36s remaining: 1.72s #> 442: learn: 0.0122848 total: 1.36s remaining: 1.71s #> 443: learn: 0.0122802 total: 1.36s remaining: 1.71s #> 444: learn: 0.0122783 total: 1.36s remaining: 1.7s #> 445: learn: 0.0122783 total: 1.36s remaining: 1.69s #> 446: learn: 0.0122783 total: 1.36s remaining: 1.69s #> 447: learn: 0.0122783 total: 1.36s remaining: 1.68s #> 448: learn: 0.0122742 total: 1.37s remaining: 1.68s #> 449: learn: 0.0122742 total: 1.37s remaining: 1.67s #> 450: learn: 0.0122742 total: 1.37s remaining: 1.67s #> 451: learn: 0.0122364 total: 1.38s remaining: 1.67s #> 452: learn: 0.0122364 total: 1.38s remaining: 1.66s #> 453: learn: 0.0121785 total: 1.38s remaining: 1.66s #> 454: learn: 0.0121324 total: 1.39s remaining: 1.66s #> 455: learn: 0.0121307 total: 1.39s remaining: 1.66s #> 456: learn: 0.0121307 total: 1.39s remaining: 1.65s #> 457: learn: 0.0121306 total: 1.39s remaining: 1.65s #> 458: learn: 0.0121306 total: 1.39s remaining: 1.64s #> 459: learn: 0.0121273 total: 1.39s 
remaining: 1.64s #> 460: learn: 0.0121273 total: 1.39s remaining: 1.63s #> 461: learn: 0.0121136 total: 1.4s remaining: 1.62s #> 462: learn: 0.0120536 total: 1.4s remaining: 1.62s #> 463: learn: 0.0120536 total: 1.4s remaining: 1.61s #> 464: learn: 0.0120276 total: 1.4s remaining: 1.61s #> 465: learn: 0.0120276 total: 1.4s remaining: 1.61s #> 466: learn: 0.0119840 total: 1.41s remaining: 1.61s #> 467: learn: 0.0119840 total: 1.42s remaining: 1.62s #> 468: learn: 0.0119828 total: 1.42s remaining: 1.61s #> 469: learn: 0.0119828 total: 1.42s remaining: 1.61s #> 470: learn: 0.0119669 total: 1.45s remaining: 1.62s #> 471: learn: 0.0119669 total: 1.45s remaining: 1.62s #> 472: learn: 0.0119669 total: 1.45s remaining: 1.61s #> 473: learn: 0.0119480 total: 1.45s remaining: 1.61s #> 474: learn: 0.0119480 total: 1.45s remaining: 1.6s #> 475: learn: 0.0119480 total: 1.45s remaining: 1.6s #> 476: learn: 0.0119480 total: 1.45s remaining: 1.59s #> 477: learn: 0.0119297 total: 1.46s remaining: 1.59s #> 478: learn: 0.0119297 total: 1.46s remaining: 1.58s #> 479: learn: 0.0118906 total: 1.47s remaining: 1.59s #> 480: learn: 0.0118867 total: 1.47s remaining: 1.59s #> 481: learn: 0.0118867 total: 1.47s remaining: 1.58s #> 482: learn: 0.0118867 total: 1.47s remaining: 1.58s #> 483: learn: 0.0118822 total: 1.48s remaining: 1.58s #> 484: learn: 0.0118822 total: 1.48s remaining: 1.57s #> 485: learn: 0.0118813 total: 1.48s remaining: 1.57s #> 486: learn: 0.0118806 total: 1.48s remaining: 1.56s #> 487: learn: 0.0118778 total: 1.49s remaining: 1.56s #> 488: learn: 0.0118743 total: 1.49s remaining: 1.55s #> 489: learn: 0.0118489 total: 1.49s remaining: 1.55s #> 490: learn: 0.0118488 total: 1.5s remaining: 1.55s #> 491: learn: 0.0118488 total: 1.5s remaining: 1.54s #> 492: learn: 0.0118448 total: 1.5s remaining: 1.54s #> 493: learn: 0.0118447 total: 1.5s remaining: 1.53s #> 494: learn: 0.0118429 total: 1.5s remaining: 1.53s #> 495: learn: 0.0118418 total: 1.5s remaining: 1.53s #> 496: learn: 
0.0118418 total: 1.5s remaining: 1.52s #> 497: learn: 0.0118418 total: 1.5s remaining: 1.52s #> 498: learn: 0.0118322 total: 1.51s remaining: 1.51s #> 499: learn: 0.0118322 total: 1.51s remaining: 1.51s #> 500: learn: 0.0118322 total: 1.51s remaining: 1.5s #> 501: learn: 0.0118041 total: 1.51s remaining: 1.5s #> 502: learn: 0.0118041 total: 1.51s remaining: 1.5s #> 503: learn: 0.0118041 total: 1.51s remaining: 1.49s #> 504: learn: 0.0118041 total: 1.51s remaining: 1.49s #> 505: learn: 0.0118019 total: 1.52s remaining: 1.49s #> 506: learn: 0.0118019 total: 1.52s remaining: 1.48s #> 507: learn: 0.0118019 total: 1.52s remaining: 1.47s #> 508: learn: 0.0118019 total: 1.52s remaining: 1.47s #> 509: learn: 0.0117987 total: 1.53s remaining: 1.47s #> 510: learn: 0.0117363 total: 1.53s remaining: 1.47s #> 511: learn: 0.0117363 total: 1.53s remaining: 1.46s #> 512: learn: 0.0117357 total: 1.54s remaining: 1.46s #> 513: learn: 0.0117340 total: 1.54s remaining: 1.46s #> 514: learn: 0.0117340 total: 1.54s remaining: 1.45s #> 515: learn: 0.0117256 total: 1.55s remaining: 1.45s #> 516: learn: 0.0117211 total: 1.55s remaining: 1.45s #> 517: learn: 0.0117194 total: 1.55s remaining: 1.44s #> 518: learn: 0.0117194 total: 1.55s remaining: 1.44s #> 519: learn: 0.0117194 total: 1.55s remaining: 1.43s #> 520: learn: 0.0117194 total: 1.55s remaining: 1.43s #> 521: learn: 0.0117176 total: 1.55s remaining: 1.42s #> 522: learn: 0.0117176 total: 1.55s remaining: 1.42s #> 523: learn: 0.0117047 total: 1.56s remaining: 1.42s #> 524: learn: 0.0117033 total: 1.56s remaining: 1.41s #> 525: learn: 0.0117016 total: 1.56s remaining: 1.41s #> 526: learn: 0.0116564 total: 1.57s remaining: 1.41s #> 527: learn: 0.0116564 total: 1.57s remaining: 1.41s #> 528: learn: 0.0116437 total: 1.58s remaining: 1.4s #> 529: learn: 0.0116414 total: 1.58s remaining: 1.4s #> 530: learn: 0.0116322 total: 1.58s remaining: 1.4s #> 531: learn: 0.0116322 total: 1.58s remaining: 1.39s #> 532: learn: 0.0116189 total: 1.58s 
remaining: 1.39s #> 533: learn: 0.0116046 total: 1.62s remaining: 1.42s #> 534: learn: 0.0115822 total: 1.63s remaining: 1.41s #> 535: learn: 0.0115706 total: 1.63s remaining: 1.41s #> 536: learn: 0.0115597 total: 1.63s remaining: 1.41s #> 537: learn: 0.0115574 total: 1.63s remaining: 1.4s #> 538: learn: 0.0115574 total: 1.63s remaining: 1.4s #> 539: learn: 0.0115574 total: 1.63s remaining: 1.39s #> 540: learn: 0.0115574 total: 1.63s remaining: 1.39s #> 541: learn: 0.0115574 total: 1.63s remaining: 1.38s #> 542: learn: 0.0115574 total: 1.63s remaining: 1.38s #> 543: learn: 0.0115560 total: 1.64s remaining: 1.37s #> 544: learn: 0.0115511 total: 1.64s remaining: 1.37s #> 545: learn: 0.0115511 total: 1.65s remaining: 1.37s #> 546: learn: 0.0115504 total: 1.65s remaining: 1.36s #> 547: learn: 0.0115169 total: 1.66s remaining: 1.37s #> 548: learn: 0.0115154 total: 1.66s remaining: 1.36s #> 549: learn: 0.0115137 total: 1.66s remaining: 1.36s #> 550: learn: 0.0115121 total: 1.66s remaining: 1.35s #> 551: learn: 0.0115091 total: 1.67s remaining: 1.35s #> 552: learn: 0.0115077 total: 1.67s remaining: 1.35s #> 553: learn: 0.0115077 total: 1.67s remaining: 1.34s #> 554: learn: 0.0114880 total: 1.67s remaining: 1.34s #> 555: learn: 0.0114880 total: 1.67s remaining: 1.33s #> 556: learn: 0.0114880 total: 1.67s remaining: 1.33s #> 557: learn: 0.0114880 total: 1.67s remaining: 1.32s #> 558: learn: 0.0114800 total: 1.68s remaining: 1.32s #> 559: learn: 0.0114800 total: 1.68s remaining: 1.32s #> 560: learn: 0.0114197 total: 1.68s remaining: 1.32s #> 561: learn: 0.0114197 total: 1.68s remaining: 1.31s #> 562: learn: 0.0114197 total: 1.68s remaining: 1.31s #> 563: learn: 0.0114143 total: 1.69s remaining: 1.3s #> 564: learn: 0.0113536 total: 1.69s remaining: 1.3s #> 565: learn: 0.0113419 total: 1.69s remaining: 1.3s #> 566: learn: 0.0113400 total: 1.7s remaining: 1.29s #> 567: learn: 0.0113307 total: 1.7s remaining: 1.29s #> 568: learn: 0.0113307 total: 1.7s remaining: 1.28s #> 569: 
learn: 0.0113307 total: 1.7s remaining: 1.28s #> 570: learn: 0.0113304 total: 1.7s remaining: 1.28s #> 571: learn: 0.0113304 total: 1.7s remaining: 1.27s #> 572: learn: 0.0113304 total: 1.7s remaining: 1.27s #> 573: learn: 0.0113304 total: 1.7s remaining: 1.26s #> 574: learn: 0.0113291 total: 1.71s remaining: 1.26s #> 575: learn: 0.0113291 total: 1.71s remaining: 1.25s #> 576: learn: 0.0113283 total: 1.71s remaining: 1.25s #> 577: learn: 0.0113283 total: 1.71s remaining: 1.25s #> 578: learn: 0.0113232 total: 1.72s remaining: 1.25s #> 579: learn: 0.0113232 total: 1.72s remaining: 1.24s #> 580: learn: 0.0113217 total: 1.72s remaining: 1.24s #> 581: learn: 0.0113098 total: 1.73s remaining: 1.24s #> 582: learn: 0.0112639 total: 1.73s remaining: 1.24s #> 583: learn: 0.0112537 total: 1.73s remaining: 1.23s #> 584: learn: 0.0111878 total: 1.74s remaining: 1.23s #> 585: learn: 0.0111878 total: 1.74s remaining: 1.23s #> 586: learn: 0.0111878 total: 1.74s remaining: 1.22s #> 587: learn: 0.0111878 total: 1.74s remaining: 1.22s #> 588: learn: 0.0111831 total: 1.74s remaining: 1.21s #> 589: learn: 0.0111831 total: 1.74s remaining: 1.21s #> 590: learn: 0.0111831 total: 1.74s remaining: 1.21s #> 591: learn: 0.0110985 total: 1.75s remaining: 1.21s #> 592: learn: 0.0110985 total: 1.75s remaining: 1.2s #> 593: learn: 0.0110982 total: 1.75s remaining: 1.2s #> 594: learn: 0.0110982 total: 1.75s remaining: 1.19s #> 595: learn: 0.0110929 total: 1.76s remaining: 1.19s #> 596: learn: 0.0110929 total: 1.76s remaining: 1.19s #> 597: learn: 0.0110929 total: 1.76s remaining: 1.18s #> 598: learn: 0.0110929 total: 1.76s remaining: 1.18s #> 599: learn: 0.0110917 total: 1.76s remaining: 1.17s #> 600: learn: 0.0110846 total: 1.76s remaining: 1.17s #> 601: learn: 0.0110831 total: 1.77s remaining: 1.17s #> 602: learn: 0.0110629 total: 1.77s remaining: 1.16s #> 603: learn: 0.0110629 total: 1.77s remaining: 1.16s #> 604: learn: 0.0110629 total: 1.77s remaining: 1.15s #> 605: learn: 0.0110535 total: 
1.77s remaining: 1.15s #> 606: learn: 0.0110494 total: 1.78s remaining: 1.15s #> 607: learn: 0.0110480 total: 1.78s remaining: 1.15s #> 608: learn: 0.0110399 total: 1.78s remaining: 1.14s #> 609: learn: 0.0110260 total: 1.78s remaining: 1.14s #> 610: learn: 0.0109539 total: 1.8s remaining: 1.15s #> 611: learn: 0.0109539 total: 1.81s remaining: 1.15s #> 612: learn: 0.0109444 total: 1.81s remaining: 1.14s #> 613: learn: 0.0109294 total: 1.82s remaining: 1.14s #> 614: learn: 0.0109294 total: 1.82s remaining: 1.14s #> 615: learn: 0.0109294 total: 1.82s remaining: 1.13s #> 616: learn: 0.0109294 total: 1.82s remaining: 1.13s #> 617: learn: 0.0109294 total: 1.82s remaining: 1.13s #> 618: learn: 0.0109294 total: 1.82s remaining: 1.12s #> 619: learn: 0.0109258 total: 1.82s remaining: 1.12s #> 620: learn: 0.0109149 total: 1.83s remaining: 1.12s #> 621: learn: 0.0108999 total: 1.83s remaining: 1.11s #> 622: learn: 0.0108999 total: 1.83s remaining: 1.11s #> 623: learn: 0.0108999 total: 1.83s remaining: 1.11s #> 624: learn: 0.0108983 total: 1.83s remaining: 1.1s #> 625: learn: 0.0108983 total: 1.84s remaining: 1.1s #> 626: learn: 0.0108983 total: 1.84s remaining: 1.09s #> 627: learn: 0.0108979 total: 1.84s remaining: 1.09s #> 628: learn: 0.0108979 total: 1.84s remaining: 1.08s #> 629: learn: 0.0108938 total: 1.85s remaining: 1.08s #> 630: learn: 0.0108938 total: 1.85s remaining: 1.08s #> 631: learn: 0.0108938 total: 1.85s remaining: 1.08s #> 632: learn: 0.0108026 total: 1.85s remaining: 1.07s #> 633: learn: 0.0108015 total: 1.86s remaining: 1.07s #> 634: learn: 0.0108011 total: 1.86s remaining: 1.07s #> 635: learn: 0.0107992 total: 1.86s remaining: 1.06s #> 636: learn: 0.0107559 total: 1.87s remaining: 1.06s #> 637: learn: 0.0107380 total: 1.88s remaining: 1.06s #> 638: learn: 0.0107380 total: 1.88s remaining: 1.06s #> 639: learn: 0.0107380 total: 1.88s remaining: 1.06s #> 640: learn: 0.0107380 total: 1.88s remaining: 1.05s #> 641: learn: 0.0107380 total: 1.88s remaining: 1.05s 
#> 642: learn: 0.0107380 total: 1.88s remaining: 1.04s #> 643: learn: 0.0107247 total: 1.88s remaining: 1.04s #> 644: learn: 0.0107219 total: 1.89s remaining: 1.04s #> 645: learn: 0.0106998 total: 1.89s remaining: 1.04s #> 646: learn: 0.0106868 total: 1.89s remaining: 1.03s #> 647: learn: 0.0106868 total: 1.9s remaining: 1.03s #> 648: learn: 0.0106867 total: 1.9s remaining: 1.03s #> 649: learn: 0.0106848 total: 1.9s remaining: 1.02s #> 650: learn: 0.0106696 total: 1.9s remaining: 1.02s #> 651: learn: 0.0106696 total: 1.9s remaining: 1.01s #> 652: learn: 0.0106687 total: 1.9s remaining: 1.01s #> 653: learn: 0.0106679 total: 1.93s remaining: 1.02s #> 654: learn: 0.0106679 total: 1.93s remaining: 1.01s #> 655: learn: 0.0106679 total: 1.93s remaining: 1.01s #> 656: learn: 0.0106670 total: 1.93s remaining: 1.01s #> 657: learn: 0.0106670 total: 1.93s remaining: 1s #> 658: learn: 0.0106647 total: 1.93s remaining: 999ms #> 659: learn: 0.0106645 total: 1.93s remaining: 995ms #> 660: learn: 0.0106487 total: 1.94s remaining: 995ms #> 661: learn: 0.0106487 total: 1.94s remaining: 991ms #> 662: learn: 0.0106471 total: 1.94s remaining: 986ms #> 663: learn: 0.0106374 total: 1.94s remaining: 984ms #> 664: learn: 0.0106374 total: 1.95s remaining: 980ms #> 665: learn: 0.0106265 total: 1.95s remaining: 978ms #> 666: learn: 0.0106233 total: 1.95s remaining: 974ms #> 667: learn: 0.0106229 total: 1.95s remaining: 971ms #> 668: learn: 0.0106229 total: 1.95s remaining: 967ms #> 669: learn: 0.0106229 total: 1.95s remaining: 963ms #> 670: learn: 0.0106229 total: 1.95s remaining: 958ms #> 671: learn: 0.0106229 total: 1.96s remaining: 954ms #> 672: learn: 0.0106229 total: 1.96s remaining: 950ms #> 673: learn: 0.0106221 total: 1.97s remaining: 953ms #> 674: learn: 0.0106221 total: 1.97s remaining: 949ms #> 675: learn: 0.0106221 total: 1.97s remaining: 945ms #> 676: learn: 0.0106221 total: 1.97s remaining: 941ms #> 677: learn: 0.0106219 total: 1.97s remaining: 936ms #> 678: learn: 0.0106217 
total: 1.98s remaining: 935ms #> 679: learn: 0.0106217 total: 1.98s remaining: 931ms #> 680: learn: 0.0105648 total: 1.98s remaining: 928ms #> 681: learn: 0.0105616 total: 1.98s remaining: 925ms #> 682: learn: 0.0105610 total: 1.99s remaining: 923ms #> 683: learn: 0.0105610 total: 1.99s remaining: 919ms #> 684: learn: 0.0105610 total: 1.99s remaining: 915ms #> 685: learn: 0.0105534 total: 2.03s remaining: 930ms #> 686: learn: 0.0105534 total: 2.03s remaining: 927ms #> 687: learn: 0.0105527 total: 2.03s remaining: 922ms #> 688: learn: 0.0105396 total: 2.04s remaining: 920ms #> 689: learn: 0.0104704 total: 2.04s remaining: 918ms #> 690: learn: 0.0104622 total: 2.05s remaining: 916ms #> 691: learn: 0.0104622 total: 2.05s remaining: 912ms #> 692: learn: 0.0104622 total: 2.05s remaining: 908ms #> 693: learn: 0.0104615 total: 2.05s remaining: 905ms #> 694: learn: 0.0104614 total: 2.05s remaining: 901ms #> 695: learn: 0.0104614 total: 2.06s remaining: 898ms #> 696: learn: 0.0104234 total: 2.06s remaining: 897ms #> 697: learn: 0.0104232 total: 2.06s remaining: 894ms #> 698: learn: 0.0104221 total: 2.06s remaining: 890ms #> 699: learn: 0.0104221 total: 2.07s remaining: 885ms #> 700: learn: 0.0104221 total: 2.07s remaining: 882ms #> 701: learn: 0.0104221 total: 2.07s remaining: 878ms #> 702: learn: 0.0104124 total: 2.07s remaining: 875ms #> 703: learn: 0.0104109 total: 2.07s remaining: 871ms #> 704: learn: 0.0104104 total: 2.07s remaining: 867ms #> 705: learn: 0.0104104 total: 2.07s remaining: 863ms #> 706: learn: 0.0104104 total: 2.07s remaining: 859ms #> 707: learn: 0.0104104 total: 2.07s remaining: 855ms #> 708: learn: 0.0103858 total: 2.08s remaining: 853ms #> 709: learn: 0.0103858 total: 2.08s remaining: 849ms #> 710: learn: 0.0103858 total: 2.08s remaining: 845ms #> 711: learn: 0.0103858 total: 2.08s remaining: 841ms #> 712: learn: 0.0103858 total: 2.08s remaining: 837ms #> 713: learn: 0.0103854 total: 2.08s remaining: 833ms #> 714: learn: 0.0103834 total: 2.08s 
remaining: 830ms #> 715: learn: 0.0102855 total: 2.08s remaining: 827ms #> 716: learn: 0.0102855 total: 2.08s remaining: 823ms #> 717: learn: 0.0102832 total: 2.09s remaining: 820ms #> 718: learn: 0.0102827 total: 2.09s remaining: 817ms #> 719: learn: 0.0102824 total: 2.09s remaining: 814ms #> 720: learn: 0.0102680 total: 2.1s remaining: 812ms #> 721: learn: 0.0102680 total: 2.1s remaining: 809ms #> 722: learn: 0.0102670 total: 2.1s remaining: 805ms #> 723: learn: 0.0102670 total: 2.1s remaining: 801ms #> 724: learn: 0.0102670 total: 2.1s remaining: 797ms #> 725: learn: 0.0102670 total: 2.1s remaining: 794ms #> 726: learn: 0.0102670 total: 2.1s remaining: 791ms #> 727: learn: 0.0102624 total: 2.11s remaining: 788ms #> 728: learn: 0.0102624 total: 2.11s remaining: 784ms #> 729: learn: 0.0101983 total: 2.12s remaining: 783ms #> 730: learn: 0.0101973 total: 2.12s remaining: 779ms #> 731: learn: 0.0101973 total: 2.12s remaining: 775ms #> 732: learn: 0.0101973 total: 2.12s remaining: 772ms #> 733: learn: 0.0101719 total: 2.13s remaining: 771ms #> 734: learn: 0.0101719 total: 2.13s remaining: 767ms #> 735: learn: 0.0101719 total: 2.13s remaining: 763ms #> 736: learn: 0.0101719 total: 2.13s remaining: 759ms #> 737: learn: 0.0101719 total: 2.13s remaining: 755ms #> 738: learn: 0.0101719 total: 2.13s remaining: 752ms #> 739: learn: 0.0101364 total: 2.13s remaining: 749ms #> 740: learn: 0.0101364 total: 2.13s remaining: 746ms #> 741: learn: 0.0101364 total: 2.13s remaining: 742ms #> 742: learn: 0.0101284 total: 2.14s remaining: 741ms #> 743: learn: 0.0101282 total: 2.14s remaining: 737ms #> 744: learn: 0.0101237 total: 2.14s remaining: 734ms #> 745: learn: 0.0101237 total: 2.14s remaining: 730ms #> 746: learn: 0.0101237 total: 2.14s remaining: 726ms #> 747: learn: 0.0101087 total: 2.15s remaining: 725ms #> 748: learn: 0.0101078 total: 2.15s remaining: 722ms #> 749: learn: 0.0100898 total: 2.16s remaining: 720ms #> 750: learn: 0.0100898 total: 2.16s remaining: 717ms #> 751: 
learn: 0.0100897 total: 2.16s remaining: 713ms #> 752: learn: 0.0100897 total: 2.16s remaining: 710ms #> 753: learn: 0.0100771 total: 2.17s remaining: 709ms #> 754: learn: 0.0100771 total: 2.17s remaining: 705ms #> 755: learn: 0.0100771 total: 2.17s remaining: 701ms #> 756: learn: 0.0100771 total: 2.17s remaining: 698ms #> 757: learn: 0.0100771 total: 2.17s remaining: 694ms #> 758: learn: 0.0100771 total: 2.17s remaining: 690ms #> 759: learn: 0.0100757 total: 2.17s remaining: 687ms #> 760: learn: 0.0100757 total: 2.18s remaining: 683ms #> 761: learn: 0.0100753 total: 2.18s remaining: 681ms #> 762: learn: 0.0100660 total: 2.19s remaining: 679ms #> 763: learn: 0.0100457 total: 2.19s remaining: 675ms #> 764: learn: 0.0100457 total: 2.19s remaining: 672ms #> 765: learn: 0.0100424 total: 2.19s remaining: 668ms #> 766: learn: 0.0100424 total: 2.19s remaining: 665ms #> 767: learn: 0.0100424 total: 2.19s remaining: 661ms #> 768: learn: 0.0100339 total: 2.19s remaining: 659ms #> 769: learn: 0.0100339 total: 2.19s remaining: 655ms #> 770: learn: 0.0100070 total: 2.21s remaining: 656ms #> 771: learn: 0.0100069 total: 2.21s remaining: 653ms #> 772: learn: 0.0100004 total: 2.22s remaining: 652ms #> 773: learn: 0.0099977 total: 2.22s remaining: 649ms #> 774: learn: 0.0099906 total: 2.23s remaining: 647ms #> 775: learn: 0.0099906 total: 2.23s remaining: 644ms #> 776: learn: 0.0099906 total: 2.23s remaining: 640ms #> 777: learn: 0.0099906 total: 2.23s remaining: 636ms #> 778: learn: 0.0099834 total: 2.23s remaining: 634ms #> 779: learn: 0.0099825 total: 2.23s remaining: 631ms #> 780: learn: 0.0099825 total: 2.24s remaining: 628ms #> 781: learn: 0.0099825 total: 2.24s remaining: 624ms #> 782: learn: 0.0099781 total: 2.24s remaining: 622ms #> 783: learn: 0.0099750 total: 2.25s remaining: 619ms #> 784: learn: 0.0099736 total: 2.25s remaining: 616ms #> 785: learn: 0.0099593 total: 2.25s remaining: 613ms #> 786: learn: 0.0099276 total: 2.26s remaining: 611ms #> 787: learn: 0.0099158 
total: 2.26s remaining: 607ms #> 788: learn: 0.0099157 total: 2.26s remaining: 604ms #> 789: learn: 0.0099157 total: 2.26s remaining: 601ms #> 790: learn: 0.0098844 total: 2.27s remaining: 601ms #> 791: learn: 0.0098844 total: 2.28s remaining: 598ms #> 792: learn: 0.0098106 total: 2.29s remaining: 597ms #> 793: learn: 0.0098106 total: 2.29s remaining: 594ms #> 794: learn: 0.0098046 total: 2.29s remaining: 590ms #> 795: learn: 0.0097968 total: 2.29s remaining: 587ms #> 796: learn: 0.0097615 total: 2.3s remaining: 586ms #> 797: learn: 0.0097613 total: 2.31s remaining: 584ms #> 798: learn: 0.0097605 total: 2.31s remaining: 580ms #> 799: learn: 0.0097605 total: 2.31s remaining: 577ms #> 800: learn: 0.0097605 total: 2.31s remaining: 574ms #> 801: learn: 0.0097605 total: 2.31s remaining: 570ms #> 802: learn: 0.0097605 total: 2.31s remaining: 567ms #> 803: learn: 0.0097471 total: 2.32s remaining: 565ms #> 804: learn: 0.0097471 total: 2.32s remaining: 561ms #> 805: learn: 0.0097334 total: 2.33s remaining: 560ms #> 806: learn: 0.0097319 total: 2.33s remaining: 557ms #> 807: learn: 0.0097319 total: 2.33s remaining: 554ms #> 808: learn: 0.0096868 total: 2.33s remaining: 551ms #> 809: learn: 0.0096867 total: 2.34s remaining: 548ms #> 810: learn: 0.0096585 total: 2.34s remaining: 546ms #> 811: learn: 0.0096584 total: 2.35s remaining: 544ms #> 812: learn: 0.0096529 total: 2.35s remaining: 541ms #> 813: learn: 0.0096529 total: 2.35s remaining: 538ms #> 814: learn: 0.0096529 total: 2.35s remaining: 535ms #> 815: learn: 0.0096529 total: 2.35s remaining: 531ms #> 816: learn: 0.0096529 total: 2.35s remaining: 528ms #> 817: learn: 0.0096529 total: 2.36s remaining: 525ms #> 818: learn: 0.0096529 total: 2.36s remaining: 521ms #> 819: learn: 0.0096529 total: 2.36s remaining: 518ms #> 820: learn: 0.0096529 total: 2.36s remaining: 514ms #> 821: learn: 0.0096526 total: 2.36s remaining: 511ms #> 822: learn: 0.0096461 total: 2.36s remaining: 508ms #> 823: learn: 0.0096461 total: 2.36s 
remaining: 505ms #> 824: learn: 0.0096461 total: 2.36s remaining: 501ms #> 825: learn: 0.0096380 total: 2.37s remaining: 499ms #> 826: learn: 0.0096305 total: 2.38s remaining: 497ms #> 827: learn: 0.0096305 total: 2.38s remaining: 493ms #> 828: learn: 0.0096265 total: 2.38s remaining: 491ms #> 829: learn: 0.0096259 total: 2.38s remaining: 488ms #> 830: learn: 0.0096259 total: 2.38s remaining: 485ms #> 831: learn: 0.0095395 total: 2.41s remaining: 487ms #> 832: learn: 0.0095373 total: 2.42s remaining: 484ms #> 833: learn: 0.0095373 total: 2.42s remaining: 481ms #> 834: learn: 0.0095367 total: 2.42s remaining: 477ms #> 835: learn: 0.0095367 total: 2.42s remaining: 474ms #> 836: learn: 0.0095367 total: 2.42s remaining: 471ms #> 837: learn: 0.0095343 total: 2.42s remaining: 468ms #> 838: learn: 0.0095330 total: 2.42s remaining: 465ms #> 839: learn: 0.0095320 total: 2.43s remaining: 462ms #> 840: learn: 0.0094688 total: 2.44s remaining: 462ms #> 841: learn: 0.0094688 total: 2.45s remaining: 459ms #> 842: learn: 0.0094566 total: 2.45s remaining: 456ms #> 843: learn: 0.0094566 total: 2.45s remaining: 453ms #> 844: learn: 0.0094426 total: 2.46s remaining: 451ms #> 845: learn: 0.0094281 total: 2.47s remaining: 449ms #> 846: learn: 0.0094268 total: 2.47s remaining: 446ms #> 847: learn: 0.0094268 total: 2.47s remaining: 443ms #> 848: learn: 0.0094115 total: 2.48s remaining: 441ms #> 849: learn: 0.0094085 total: 2.48s remaining: 437ms #> 850: learn: 0.0093546 total: 2.49s remaining: 436ms #> 851: learn: 0.0093545 total: 2.49s remaining: 433ms #> 852: learn: 0.0093536 total: 2.5s remaining: 430ms #> 853: learn: 0.0093533 total: 2.5s remaining: 427ms #> 854: learn: 0.0093533 total: 2.5s remaining: 424ms #> 855: learn: 0.0093533 total: 2.5s remaining: 421ms #> 856: learn: 0.0093487 total: 2.5s remaining: 418ms #> 857: learn: 0.0093487 total: 2.5s remaining: 414ms #> 858: learn: 0.0093466 total: 2.5s remaining: 411ms #> 859: learn: 0.0093466 total: 2.5s remaining: 408ms #> 860: 
learn: 0.0093466 total: 2.51s remaining: 405ms #> 861: learn: 0.0093458 total: 2.51s remaining: 401ms #> 862: learn: 0.0093458 total: 2.51s remaining: 398ms #> 863: learn: 0.0093204 total: 2.51s remaining: 396ms #> 864: learn: 0.0093204 total: 2.51s remaining: 393ms #> 865: learn: 0.0092664 total: 2.52s remaining: 390ms #> 866: learn: 0.0092538 total: 2.53s remaining: 388ms #> 867: learn: 0.0092536 total: 2.53s remaining: 385ms #> 868: learn: 0.0092536 total: 2.53s remaining: 382ms #> 869: learn: 0.0092475 total: 2.53s remaining: 379ms #> 870: learn: 0.0092475 total: 2.54s remaining: 376ms #> 871: learn: 0.0092234 total: 2.54s remaining: 373ms #> 872: learn: 0.0092234 total: 2.54s remaining: 369ms #> 873: learn: 0.0092180 total: 2.54s remaining: 367ms #> 874: learn: 0.0092062 total: 2.55s remaining: 364ms #> 875: learn: 0.0091984 total: 2.55s remaining: 361ms #> 876: learn: 0.0091984 total: 2.55s remaining: 358ms #> 877: learn: 0.0091972 total: 2.56s remaining: 355ms #> 878: learn: 0.0091972 total: 2.56s remaining: 352ms #> 879: learn: 0.0091972 total: 2.56s remaining: 349ms #> 880: learn: 0.0091972 total: 2.56s remaining: 346ms #> 881: learn: 0.0091972 total: 2.56s remaining: 342ms #> 882: learn: 0.0091972 total: 2.56s remaining: 339ms #> 883: learn: 0.0091941 total: 2.56s remaining: 336ms #> 884: learn: 0.0091937 total: 2.56s remaining: 333ms #> 885: learn: 0.0091937 total: 2.56s remaining: 330ms #> 886: learn: 0.0091824 total: 2.57s remaining: 327ms #> 887: learn: 0.0091824 total: 2.57s remaining: 324ms #> 888: learn: 0.0091820 total: 2.57s remaining: 321ms #> 889: learn: 0.0091807 total: 2.57s remaining: 318ms #> 890: learn: 0.0091331 total: 2.58s remaining: 315ms #> 891: learn: 0.0091326 total: 2.58s remaining: 312ms #> 892: learn: 0.0091305 total: 2.61s remaining: 313ms #> 893: learn: 0.0091305 total: 2.61s remaining: 310ms #> 894: learn: 0.0091305 total: 2.61s remaining: 306ms #> 895: learn: 0.0091305 total: 2.61s remaining: 303ms #> 896: learn: 0.0091305 
total: 2.61s remaining: 300ms #> 897: learn: 0.0091305 total: 2.62s remaining: 297ms #> 898: learn: 0.0091305 total: 2.62s remaining: 294ms #> 899: learn: 0.0091304 total: 2.62s remaining: 291ms #> 900: learn: 0.0091238 total: 2.62s remaining: 288ms #> 901: learn: 0.0091238 total: 2.62s remaining: 285ms #> 902: learn: 0.0091040 total: 2.63s remaining: 282ms #> 903: learn: 0.0091040 total: 2.63s remaining: 279ms #> 904: learn: 0.0091040 total: 2.63s remaining: 276ms #> 905: learn: 0.0091040 total: 2.63s remaining: 273ms #> 906: learn: 0.0091040 total: 2.63s remaining: 269ms #> 907: learn: 0.0091040 total: 2.63s remaining: 266ms #> 908: learn: 0.0091040 total: 2.63s remaining: 263ms #> 909: learn: 0.0091037 total: 2.63s remaining: 260ms #> 910: learn: 0.0091036 total: 2.63s remaining: 257ms #> 911: learn: 0.0091019 total: 2.63s remaining: 254ms #> 912: learn: 0.0091019 total: 2.63s remaining: 251ms #> 913: learn: 0.0090876 total: 2.63s remaining: 248ms #> 914: learn: 0.0090876 total: 2.64s remaining: 245ms #> 915: learn: 0.0090876 total: 2.64s remaining: 242ms #> 916: learn: 0.0090876 total: 2.64s remaining: 239ms #> 917: learn: 0.0090867 total: 2.64s remaining: 236ms #> 918: learn: 0.0090867 total: 2.64s remaining: 233ms #> 919: learn: 0.0090867 total: 2.64s remaining: 230ms #> 920: learn: 0.0090867 total: 2.64s remaining: 227ms #> 921: learn: 0.0090864 total: 2.65s remaining: 224ms #> 922: learn: 0.0090864 total: 2.65s remaining: 221ms #> 923: learn: 0.0090864 total: 2.65s remaining: 218ms #> 924: learn: 0.0090864 total: 2.65s remaining: 215ms #> 925: learn: 0.0090864 total: 2.65s remaining: 212ms #> 926: learn: 0.0090864 total: 2.65s remaining: 209ms #> 927: learn: 0.0090836 total: 2.65s remaining: 206ms #> 928: learn: 0.0090836 total: 2.65s remaining: 203ms #> 929: learn: 0.0090831 total: 2.66s remaining: 200ms #> 930: learn: 0.0090831 total: 2.66s remaining: 197ms #> 931: learn: 0.0090776 total: 2.66s remaining: 194ms #> 932: learn: 0.0090441 total: 2.66s 
remaining: 191ms
#> ... (catboost per-iteration training log truncated) ...
#> 999: learn: 0.0087907 total: 2.84s remaining: 0us

model_fit
#> parsnip model object
#>
#> Fit time: 5.9s
#> Prophet Model w/ Catboost Error Specification
#> ---
#> Model 1: PROPHET
#> $growth
#> [1] "linear"
#>
#> $n.changepoints
#> [1] 25
#>
#> $changepoint.range
#> [1] 0.8
#>
#> $yearly.seasonality
#> [1] "auto"
#>
#> $weekly.seasonality
#> [1] "auto"
#>
#> $daily.seasonality
#> [1] "auto"
#>
#> $holidays
#> NULL
#>
#> $seasonality.mode
#> [1] "additive"
#>
#> $seasonality.prior.scale
#> [1] 10
#>
#> $changepoint.prior.scale
#> [1] 0.05
#>
#> $holidays.prior.scale
#> [1] 10
#>
#> $mcmc.samples
#> [1] 0
#>
#> $interval.width
#> [1] 0.8
#>
#> $uncertainty.samples
#> [1] 1000
#>
#> ... (remaining Prophet internals truncated: 25 automatically selected
#> changepoints spanning 1990-09-01 through 2006-03-01, the yearly
#> seasonality specification, the Stan fit parameters k, m, delta, beta,
#> sigma_obs, the fitted trend vector, and the monthly training history
#> from 1990-01-01 to 2010-04-01) ...
#>
#> ---
#> Model 2: Catboost Errors
#>
#> CatBoost model (1000 trees)
#> Loss function: RMSE
#> Fit to 2 features
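For reference, a specification like the one printed above can be assembled from the arguments documented in the usage section. This is only a sketch: the engine name `"prophet_catboost"`, the formula, and the `training_data` object are assumptions for illustration, not taken from this page.

```r
library(boostime)   # assumed home of boost_prophet(); engine name below is an assumption
library(parsnip)

# A boosted PROPHET spec: PROPHET models trend/seasonality,
# CatBoost models the residual errors
model_spec <- boost_prophet(
  mode              = "regression",
  growth            = "linear",  # linear trend, as in the fit shown above
  changepoint_num   = 25,        # matches $n.changepoints above
  changepoint_range = 0.80,      # matches $changepoint.range above
  season            = "additive",
  trees             = 1000       # matches the 1000-tree CatBoost model above
) %>%
  set_engine("prophet_catboost")

# Fitting follows the usual parsnip pattern (hypothetical data/columns):
# model_fit <- fit(model_spec, value ~ date, data = training_data)
```

The printed object then shows both components: the PROPHET settings under "Model 1" and the CatBoost error model under "Model 2", as in the output above.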