etna.models.nn.ChronosBoltModel#
- class ChronosBoltModel(path_or_url: str, encoder_length: int = 2048, device: str = 'cpu', dtype: dtype = torch.float32, limit_prediction_length: bool = False, batch_size: int = 128, cache_dir: Path = PosixPath('/home/runner/.etna/chronos-models/chronos-bolt'))[source]#
Bases: ChronosBaseModel
Class for pretrained chronos-bolt models.
This model is only for zero-shot forecasting: it doesn’t support training on data during fit.
Official implementation: amazon-science/chronos-forecasting
Note
This model requires the chronos extension to be installed. Read more about this at the installation page.
Init Chronos Bolt model.
- Parameters:
path_or_url (str) – Path to the model. It can be a huggingface repository, a local path or an external url.
If huggingface repository, the available models are:
'amazon/chronos-bolt-tiny'
'amazon/chronos-bolt-mini'
'amazon/chronos-bolt-small'
'amazon/chronos-bolt-base'
During the first initialization the model is downloaded from huggingface and saved to the local cache_dir. On all following initializations the model will be loaded from cache_dir. See the pretrained_model_name_or_path parameter of transformers.PreTrainedModel.from_pretrained().
If local path, the model will not be saved to the local cache_dir.
If external url, it must be a zip archive with the same name as the model directory inside. The model will be downloaded to cache_dir.
encoder_length (int) – Number of last timestamps to use as a context.
device (str) – Device type. See the device_map parameter of transformers.PreTrainedModel.from_pretrained().
dtype (dtype) – Torch dtype of computation. See the torch_dtype parameter of transformers.PreTrainedModel.from_pretrained().
limit_prediction_length (bool) – Whether to cancel prediction if prediction_length is greater than the built-in prediction length of the model.
batch_size (int) – Batch size. It can be useful when inference is done on GPU.
cache_dir (Path) – Local path where the model from huggingface is saved during the first model initialization. On all following class initializations the appropriate model version will be loaded from this path. See the cache_dir parameter of transformers.PreTrainedModel.from_pretrained().
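A minimal initialization sketch (assuming the chronos extension is installed). The repository name comes from the list above; the encoder_length, device, dtype and batch_size values shown are only illustrative defaults:
>>> import torch
>>> from etna.models.nn import ChronosBoltModel
>>> model = ChronosBoltModel(
...     path_or_url="amazon/chronos-bolt-tiny",  # downloaded and cached on first use
...     encoder_length=2048,
...     device="cpu",
...     dtype=torch.float32,
...     batch_size=128,
... )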
Methods
fit(ts) – Fit model.
forecast(ts, prediction_size[, ...]) – Make autoregressive forecasts.
get_model() – Get model.
list_models() – Return a list of available pretrained chronos-bolt models.
load(path) – Load the model.
params_to_tune() – Get default grid for tuning hyperparameters.
predict(ts, prediction_size[, ...]) – Make predictions using true values as autoregression context (teacher forcing).
save(path) – Save the model.
set_params(**params) – Return new object instance with modified parameters.
to_dict() – Collect all information about etna object in dict.
Attributes
This class stores its __init__ parameters as attributes.
context_size – Context size for model.
- fit(ts: TSDataset)[source]#
Fit model.
For this model, fit does nothing.
- Parameters:
ts (TSDataset) – Dataset with features.
- Returns:
Model after fit
- forecast(ts: TSDataset, prediction_size: int, prediction_interval: bool = False, quantiles: Sequence[float] = (0.025, 0.975), return_components: bool = False) TSDataset [source]#
Make autoregressive forecasts.
- Parameters:
ts (TSDataset) – Dataset with features.
prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.
prediction_interval (bool) – If True, returns prediction interval for forecast.
quantiles (Sequence[float]) – Levels of prediction distribution. By default 2.5% and 97.5% are taken to form a 95% prediction interval.
return_components (bool) – If True, additionally returns forecast components.
- Returns:
Dataset with predictions.
- Return type:
TSDataset
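In practice, forecasts from this model are usually obtained through a Pipeline, which prepares the required context window before calling this method. A minimal sketch, assuming a TSDataset named ts has already been built; the horizon value is only illustrative:
>>> from etna.models.nn import ChronosBoltModel
>>> from etna.pipeline import Pipeline
>>> model = ChronosBoltModel(path_or_url="amazon/chronos-bolt-tiny")
>>> pipeline = Pipeline(model=model, horizon=7)
>>> pipeline = pipeline.fit(ts)  # a no-op for this zero-shot model, but required by the pipeline API
>>> forecast_ts = pipeline.forecast(prediction_interval=True, quantiles=[0.025, 0.975])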
- get_model() ChronosModelForForecasting | ChronosBoltModelForForecasting [source]#
Get model.
- Return type:
ChronosModelForForecasting | ChronosBoltModelForForecasting
- classmethod load(path: Path)[source]#
Load the model.
- Parameters:
path (Path) – Path to load object from.
- params_to_tune() Dict[str, BaseDistribution] [source]#
Get default grid for tuning hyperparameters.
This grid is empty.
- Returns:
Grid to tune.
- Return type:
Dict[str, BaseDistribution]
- predict(ts: TSDataset, prediction_size: int, prediction_interval: bool = False, quantiles: Sequence[float] = (0.025, 0.975), return_components: bool = False) TSDataset [source]#
Make predictions using true values as autoregression context (teacher forcing).
- Parameters:
ts (TSDataset) – Dataset with features.
prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.
prediction_interval (bool) – If True, returns prediction interval for forecast.
quantiles (Sequence[float]) – Levels of prediction distribution. By default 2.5% and 97.5% are taken to form a 95% prediction interval.
return_components (bool) – If True, additionally returns forecast components.
- Returns:
Dataset with predictions.
- Return type:
TSDataset
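A sketch of the direct call, assuming ts already contains the true target values for the timestamps being predicted plus enough preceding context; the prediction_size value is only illustrative:
>>> # predict the last 7 timestamps of ts; true target values, rather than the
>>> # model's own forecasts, serve as the context for each predicted timestamp
>>> # (teacher forcing), and only those 7 timestamps are kept in the result
>>> in_sample_ts = model.predict(ts, prediction_size=7)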
- save(path: Path)[source]#
Save the model. This method doesn’t save model’s weights.
During load, weights are loaded from the path where they were saved during init.
- Parameters:
path (Path) – Path to save object to.
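A save/load round-trip sketch; the file name is hypothetical, and, as noted above, the weights are re-obtained on load from the original path_or_url / cache_dir rather than from the saved file:
>>> from pathlib import Path
>>> model.save(Path("chronos_bolt_model.zip"))  # hypothetical file name; weights are not stored here
>>> loaded_model = ChronosBoltModel.load(Path("chronos_bolt_model.zip"))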
- set_params(**params: dict) Self [source]#
Return new object instance with modified parameters.
The method also allows changing parameters of nested objects within the current object. For example, it is possible to change parameters of a model inside a Pipeline.
Nested parameters are expected to be in a <component_1>.<...>.<parameter> form, where components are separated by a dot.
- Parameters:
**params (dict) – Estimator parameters
- Returns:
New instance with changed parameters
- Return type:
Self
Examples
>>> from etna.pipeline import Pipeline
>>> from etna.models import NaiveModel
>>> from etna.transforms import AddConstTransform
>>> model = NaiveModel(lag=1)
>>> transforms = [AddConstTransform(in_column="target", value=1)]
>>> pipeline = Pipeline(model, transforms=transforms, horizon=3)
>>> pipeline.set_params(**{"model.lag": 3, "transforms.0.value": 2})
Pipeline(model = NaiveModel(lag = 3, ), transforms = [AddConstTransform(in_column = 'target', value = 2, inplace = True, out_column = None, )], horizon = 3, )