PyTorch Forecasting

Latest version: v1.0.0

1.0.0

Breaking Changes

- Upgraded to PyTorch 2.0 and Lightning 2.0. This brings several changes, such as the configuration of trainers; see the [lightning upgrade guide](https://lightning.ai/docs/pytorch/latest/upgrade/migration_guide.html). For PyTorch Forecasting this particularly means that if you are developing your own models, the class method `epoch_end` has been renamed to `on_epoch_end`; `model.summarize()` must be replaced with `ModelSummary(model, max_depth=-1)`; and `Tuner(trainer)` is now its own class, so `trainer.tuner` needs replacing. (1280)
- Changed the `predict()` interface to return a named tuple - see tutorials.
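
The renames above can be sketched as follows; this is a hypothetical illustration of the migration steps, not pytorch-forecasting code:

```python
# Before (lightning 1.x style): the trainer called `epoch_end`.
class MyModelV1:
    def epoch_end(self, outputs):
        pass  # aggregate epoch results here

# After (lightning 2.0 style): the hook is now named `on_epoch_end`.
class MyModelV2:
    def on_epoch_end(self):
        pass  # aggregate epoch results here

# The other two replacements, shown as comments only:
#   model.summarize()            ->  ModelSummary(model, max_depth=-1)
#                                    (lightning.pytorch.utilities.model_summary)
#   trainer.tuner.lr_find(model) ->  Tuner(trainer).lr_find(model)
#                                    (lightning.pytorch.tuner)
```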

Changes

- The `predict()` method now uses the Lightning predict functionality and allows writing results to disk (1280).
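
As a rough illustration of the named-tuple pattern the new `predict()` interface follows (the field names here are assumptions for illustration; see the tutorials for the actual interface):

```python
from collections import namedtuple

# Hypothetical stand-in for the object returned by `predict()`;
# the real field names may differ - consult the tutorials.
Prediction = namedtuple("Prediction", ["output", "x", "index", "decoder_lengths", "y"])

result = Prediction(output=[0.1, 0.2], x=None, index=None, decoder_lengths=None, y=None)

# Fields are accessed by name instead of by tuple position:
out = result.output  # [0.1, 0.2]
```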

Fixed

- Fixed the robust scaler when quantiles are 0.0 and 1.0, i.e. minimum and maximum (1142)
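
To illustrate the edge case, here is a minimal numpy sketch of robust scaling (not the library's implementation): with `lower=0.0` and `upper=1.0` the scale degenerates to the full range max - min.

```python
import numpy as np

def robust_scale(x, lower=0.25, center=0.5, upper=0.75):
    # Center on the `center` quantile, divide by the inter-quantile range.
    q_low, q_center, q_high = np.quantile(x, [lower, center, upper])
    return (x - q_center) / (q_high - q_low)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
# With lower=0.0 and upper=1.0 the scale is simply max(x) - min(x):
scaled = robust_scale(x, lower=0.0, center=0.5, upper=1.0)
```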

0.10.3

Fixed

- Removed pandoc from dependencies due to an issue with `poetry install` (1126)
- Added metric attributes for torchmetrics, resulting in better multi-GPU performance (1126)

Added

- "robust" encoder method can be customized by setting "center", "lower" and "upper" quantiles (1126)

0.10.2

Added

- DeepVar network (923)
- Enable quantile loss for N-HiTS (926)
- MQF2 loss (multivariate quantile loss) (949)
- Non-causal attention for TFT (949)
- Tweedie loss (949)
- ImplicitQuantileNetworkDistributionLoss (995)
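
Quantile (pinball) loss, which the N-HiTS change above enables, penalizes under- and over-prediction asymmetrically. A generic numpy sketch of the standard formula, not the library's implementation:

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    # Under-prediction (y_true > y_pred) is weighted by q,
    # over-prediction by (1 - q).
    err = y_true - y_pred
    return np.mean(np.maximum(q * err, (q - 1) * err))

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 1.5, 1.5])
# At q = 0.5 the pinball loss is half the mean absolute error:
loss = pinball_loss(y_true, y_pred, q=0.5)
```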

Fixed

- Fix learning scale schedule (912)
- Fix TFT list/tuple issue at interpretation (924)
- Allowed encoder length down to zero for EncoderNormalizer if transformation is not needed (949)
- Fix Aggregation and CompositeMetric resets (949)

Changed

- Dropped Python 3.6 support, added Python 3.10 support (479)
- Refactored dataloader sampling - moved samplers to pytorch_forecasting.data.samplers module (479)
- Changed transformation format for Encoders to dict from tuple (949)

Contributors

- jdb78

0.10.1

Fixed

- Fix with creating tensors on correct devices (908)
- Fix with MultiLoss when calculating gradient (908)

Contributors

- jdb78

0.10.0

Added

- Added new `N-HiTS` network that has consistently beaten `N-BEATS` (890)
- Allow using [torchmetrics](https://torchmetrics.readthedocs.io/) as loss metrics (776)
- Enable fitting `EncoderNormalizer()` with limited data history using `max_length` argument (782)
- More flexible `MultiEmbedding()` with convenience `output_size` and `input_size` properties (829)
- Fix concatenation of attention (902)

Fixed

- Fix pip install via github (798)

Contributors

- jdb78
- christy
- lukemerrick
- Seon82

0.9.2

Added

- Added support for running `lightning.trainer.test` (759)

Fixed

- Fix unintended mutation of `x_cont` (732).
- Compatibility with pytorch-lightning 1.5 (758)

Contributors

- eavae
- danielgafni
- jdb78
