R/parsnip-gen_additive_reg.R
gen_additive_reg.Rd
Interface for Generalized Additive Models (GAM)
gen_additive_reg(
  mode = "regression",
  markov_chains = NULL,
  chain_iter = NULL,
  warmup_iter = NULL,
  adapt_delta = NULL
)
Argument | Description
---|---
mode | A single character string for the type of model.
markov_chains | Number of Markov chains (defaults to 4).
chain_iter | Total number of iterations per chain, including warmup (defaults to 2000).
warmup_iter | A positive integer specifying the number of warmup (aka burn-in) iterations per chain. Warmup iterations are also used for step-size adaptation, so warmup samples should not be used for inference. The number of warmup iterations should not exceed chain_iter; the default is chain_iter / 2.
adapt_delta | The target average acceptance probability of Stan's HMC/NUTS sampler during adaptation. Values closer to 1 force smaller step sizes and can help avoid divergent transitions.
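For example, a specification that sets these sampler arguments explicitly might look like the sketch below (the values shown are illustrative choices, not additional defaults):

library(parsnip)
library(bayesmodels)

# Bayesian GAM specification with explicit MCMC settings
gam_spec <- gen_additive_reg(
  mode          = "regression",
  markov_chains = 4,     # number of Markov chains
  chain_iter    = 2000,  # total iterations per chain, including warmup
  warmup_iter   = 1000,  # warmup iterations (here chain_iter / 2)
  adapt_delta   = 0.95   # target acceptance probability for HMC/NUTS
) %>%
  set_engine("stan")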
A parsnip model specification.
Available Engines:
stan: Connects to brms::brm()
stan
This engine uses brms::brm() and has the following parameters, which can be modified through the parsnip::set_engine() function.
function(formula, data, family = gaussian(), prior = NULL, autocor = NULL,
  data2 = NULL, cov_ranef = NULL, sample_prior = "no", sparse = NULL,
  knots = NULL, stanvars = NULL, stan_funs = NULL, fit = NA,
  save_pars = NULL, save_ranef = NULL, save_mevars = NULL,
  save_all_pars = NULL, inits = "random", chains = 4, iter = 2000,
  warmup = floor(iter/2), thin = 1, cores = getOption("mc.cores", 1),
  threads = NULL, normalize = getOption("brms.normalize", TRUE),
  control = NULL, algorithm = getOption("brms.algorithm", "sampling"),
  backend = getOption("brms.backend", "rstan"),
  future = getOption("future", FALSE), silent = 1, seed = NA,
  save_model = NULL, stan_model_args = list(), file = NULL,
  file_refit = "never", empty = FALSE, rename = TRUE, ...)
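As a sketch, arguments of brms::brm() such as the response family, the number of cores, or a random seed can be supplied through set_engine(); this assumes, as is usual for parsnip engines, that engine arguments are forwarded to the underlying fitting function:

library(parsnip)
library(bayesmodels)

gam_stan_spec <- gen_additive_reg(mode = "regression", markov_chains = 4) %>%
  set_engine(
    "stan",
    family = Gamma(link = "log"),  # response distribution for brms::brm()
    cores  = 2,                    # run chains in parallel
    seed   = 123                   # reproducible sampling
  )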
BRMS Formula Interface
Fitting GAMs is accomplished using smooth terms in the model formula, including:

brms::s(): GAM spline smooths
brms::t2(): GAM tensor product smooths (see the tensor product sketch after the example below)

These are applied within the fit() function, for example:
fit(value ~ s(date_mon, k = 12) + s(date_num), data = df)
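A tensor product smooth is written the same way; the following one-liner is an illustrative sketch assuming two numeric predictors from the same data frame:

fit(value ~ t2(date_num, date_mon), data = df)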
if (FALSE) {
library(tidymodels)
library(bayesmodels)
library(modeltime)
library(tidyverse)
library(timetk)
library(lubridate)

m750_extended <- m750 %>%
  group_by(id) %>%
  future_frame(.length_out = 24, .bind_data = TRUE) %>%
  mutate(lag_24 = lag(value, 24)) %>%
  ungroup() %>%
  mutate(date_num = as.numeric(date)) %>%
  mutate(date_month = month(date))

m750_train <- m750_extended %>%
  drop_na()

m750_future <- m750_extended %>%
  filter(is.na(value))

model_fit_gam <- gen_additive_reg(mode = "regression", markov_chains = 2) %>%
  set_engine("stan", family = Gamma(link = "log")) %>%
  fit(value ~ date + s(date_month, k = 12) + s(lag_24), data = m750_train)
}
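Once fitted, forecasts can be produced with the usual parsnip prediction interface; this short sketch assumes the example above has been run and reuses its m750_future data frame:

# Forecast over the 24-month future frame
predict(model_fit_gam, new_data = m750_future)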