Pytorch-forecasting

Latest version: v1.0.0

0.8.1

Added

- Add "Release Notes" section to docs (237)
- Enable usage of lag variables for any model (252)
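
For illustration, a minimal sketch of the new `lags` argument (252); the DataFrame and column names (`series`, `time_idx`, `volume`) are invented for this example:

```python
import numpy as np
import pandas as pd
from pytorch_forecasting import TimeSeriesDataSet

# Tiny synthetic frame purely for illustration: one series, 100 time steps.
data = pd.DataFrame(
    {
        "series": "a",
        "time_idx": np.arange(100),
        "volume": np.random.rand(100),
    }
)

dataset = TimeSeriesDataSet(
    data,
    time_idx="time_idx",
    target="volume",
    group_ids=["series"],
    max_encoder_length=30,
    max_prediction_length=7,
    time_varying_unknown_reals=["volume"],
    lags={"volume": [1, 7]},  # add lag-1 and lag-7 copies of "volume" as extra features
)
```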

Changed

- Require PyTorch>=1.7 (245)

Fixed

- Fix issue for multi-target forecasting when decoder length varies within a single batch (249)
- Enable longer subsequences for `min_prediction_idx` that were previously wrongly excluded (250)
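
For context on the `min_prediction_idx` fix (250): the argument can be overridden via `from_dataset` to build a validation set that starts predicting only after a cutoff. A rough sketch reusing the `data` and `dataset` from the previous example; the cutoff value is arbitrary:

```python
training_cutoff = data["time_idx"].max() - 7

validation = TimeSeriesDataSet.from_dataset(
    dataset,                                 # reuse the encoders/scalers of the training dataset
    data,
    min_prediction_idx=training_cutoff + 1,  # subsequences predicting before this index are dropped
)
```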

Contributors

- jdb78

---

0.8.0

Added

- Added support for multiple targets in the TimeSeriesDataSet (199) and amended tutorials; see the sketch after this list
- Temporal Fusion Transformer and DeepAR with support for multiple targets (199)
- Check for non-finite values in TimeSeriesDataSet and better validate scaler argument (220)
- LSTM and GRU implementations that can handle zero-length sequences (235)
- Helpers for implementing auto-regressive models (236)
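
A hedged sketch of the multi-target support (199): `target` accepts a list of column names and `target_normalizer` a MultiNormalizer with one entry per target. It assumes a DataFrame `data` that, in addition to `series` and `time_idx`, contains invented `demand` and `price` columns:

```python
from pytorch_forecasting import TimeSeriesDataSet
from pytorch_forecasting.data import GroupNormalizer, MultiNormalizer

dataset = TimeSeriesDataSet(
    data,
    time_idx="time_idx",
    target=["demand", "price"],  # multiple targets passed as a list
    group_ids=["series"],
    max_encoder_length=30,
    max_prediction_length=7,
    time_varying_unknown_reals=["demand", "price"],
    target_normalizer=MultiNormalizer(
        [GroupNormalizer(groups=["series"]), GroupNormalizer(groups=["series"])]
    ),  # one normalizer per target
)
```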

Changed

- The dataloader's `y` returned by TimeSeriesDataSet is now a tuple of (target(s), weight), which is potentially breaking for custom model or metric implementations. Most implementations are unaffected because the relevant hooks in BaseModel and MultiHorizonMetric were adjusted accordingly. (199)
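
In concrete terms (a small sketch, continuing the assumed `dataset` from above), a batch from the dataloader now unpacks like this:

```python
dataloader = dataset.to_dataloader(train=True, batch_size=64)

x, y = next(iter(dataloader))  # x: dictionary of encoder/decoder tensors
target, weight = y             # since 0.8.0, y is a (target(s), weight) tuple;
                               # weight is None unless a weight column was configured
```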

Fixed

- Fixed autocorrelation for PyTorch 1.7 (220)
- Ensure reproducibility by replacing Python `set()` with `dict.fromkeys()`, mostly in TimeSeriesDataSet (221); see the sketch after this list
- Ensure that BetaDistributionLoss does not lead to an infinite loss if actuals are 0 or 1 (233)
- Fix for GroupNormalizer if scaling by group (223)
- Fix for TimeSeriesDataSet when using `min_prediction_idx` (226)
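
The reproducibility fix (221) comes down to a general Python pattern rather than anything library-specific: iteration order over a `set()` of strings can change between interpreter runs, whereas `dict.fromkeys()` deduplicates while preserving insertion order. A generic illustration, not the library's actual code:

```python
columns = ["price", "volume", "price", "demand"]

# set() order can vary between runs (string hash randomization), which made
# derived column orderings non-deterministic.
unique_unordered = set(columns)

# dict.fromkeys() keeps first-seen order, so downstream code is reproducible.
unique_ordered = list(dict.fromkeys(columns))  # ['price', 'volume', 'demand']
```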

Contributors

- jdb78
- JustinNeumann
- reumar
- rustyconover

---

0.7.1

Added

- Tutorial on how to implement a new architecture covering basic and advanced use cases (188)
- Additional and improved documentation - particularly of implementation details (188)

Changed (breaking for new model implementations)

- Made multiple private methods public (particularly logging methods) (188)
- Moved `get_mask` method from BaseModel into utils module (188)
- Use the `self.training` attribute instead of a label to indicate whether the model is training or validating (188)
- Use `sample((n,))` of PyTorch distributions instead of the deprecated `sample_n(n)` method (188)
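
The last item reflects the standard `torch.distributions` API; a minimal illustration:

```python
import torch
from torch.distributions import Normal

dist = Normal(loc=torch.zeros(3), scale=torch.ones(3))

# deprecated: samples = dist.sample_n(100)
samples = dist.sample((100,))  # shape: (100, 3)
```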

---

0.7.0

Added

- Beta distribution loss for probabilistic models such as DeepAR (160)
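
A hedged sketch of combining the new loss with DeepAR: BetaDistributionLoss models targets on the unit interval, so the dataset's target normalizer typically uses a logit transformation. The `training` dataset and hyperparameters below are assumptions, not prescriptions:

```python
from pytorch_forecasting import DeepAR
from pytorch_forecasting.metrics import BetaDistributionLoss

# `training` is assumed to be a TimeSeriesDataSet whose target lies in (0, 1),
# e.g. built with target_normalizer=GroupNormalizer(groups=["series"], transformation="logit")
model = DeepAR.from_dataset(
    training,
    learning_rate=1e-2,
    hidden_size=32,
    rnn_layers=2,
    loss=BetaDistributionLoss(),  # probabilistic forecasts as a Beta distribution
)
```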

Changed

- BREAKING: Simplified how transforms (such as logit or log) are applied before and after the encoder. Some transformations are included by default, but a tuple of forward and reverse transform functions can be passed for arbitrary transformations. This requires using the `transformation` keyword in target normalizers instead of, e.g., `log_scale` (185)
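
Roughly, the before/after of this breaking change (parameter values are illustrative; the custom-function variant is sketched from the wording above and not verified against the exact signature):

```python
from pytorch_forecasting.data import GroupNormalizer

# before 0.7.0 (no longer supported):
# normalizer = GroupNormalizer(groups=["series"], log_scale=True)

# from 0.7.0 on: pick a built-in transformation by name ...
normalizer = GroupNormalizer(groups=["series"], transformation="log")

# ... or, per the entry above, pass a forward/reverse pair of functions for
# arbitrary transformations, e.g. (torch.log1p, torch.expm1).
```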

Fixed

- Fix incorrect target position if `len(static_reals) > 0`, which led to leakage (184)
- Fix prediction for completely unseen series (172)

Contributors

- jdb78
- JakeForsey

---

0.6.1

Added

- Using GRU cells with DeepAR (153)
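
A brief sketch of selecting GRU cells (153); `training` and the hyperparameters are placeholders:

```python
from pytorch_forecasting import DeepAR

model = DeepAR.from_dataset(
    training,
    cell_type="GRU",  # the default cell type is "LSTM"
    hidden_size=32,
    rnn_layers=2,
)
```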

Fixed

- GPU fix for variable sequence length (169)
- Fix incorrect syntax for warning when removing series (167)
- Fix issue when using unknown group ids in validation or test dataset (172); see the sketch after this list
- Run non-failing CI on PRs from forks (166, 156)
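
Related to the unknown-group-id fix (172): categorical encoders can be configured to tolerate categories that never occurred during training. A hedged sketch with invented column names; whether this is needed depends on the dataset setup:

```python
from pytorch_forecasting import TimeSeriesDataSet
from pytorch_forecasting.data import NaNLabelEncoder

dataset = TimeSeriesDataSet(
    data,
    time_idx="time_idx",
    target="volume",
    group_ids=["series"],
    max_encoder_length=30,
    max_prediction_length=7,
    time_varying_unknown_reals=["volume"],
    categorical_encoders={
        # allow group ids in validation/test data that were not seen in training
        "series": NaNLabelEncoder(add_nan=True)
    },
)
```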

Docs

- Improved model selection guidance and explanations on how TimeSeriesDataSet works (148)
- Clarify how to use with conda (168)

Contributors

- jdb78
- JakeForsey

---

0.6.0

Added

- DeepAR by Amazon (115)
  - First autoregressive model in PyTorch Forecasting
  - Distribution losses: normal, negative binomial and log-normal distributions; see the sketch after this list
  - Currently missing: handling lag variables and a tutorial (planned for 0.6.1)
- Improved documentation on TimeSeriesDataSet and how to implement a new network (145)
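
As a rough illustration of the distribution losses listed above (a sketch assuming a `training` TimeSeriesDataSet with a non-negative count target; hyperparameters are placeholders):

```python
from pytorch_forecasting import DeepAR
from pytorch_forecasting.metrics import NegativeBinomialDistributionLoss

# NormalDistributionLoss is the default; the negative binomial variant suits count data
# (and typically expects a target normalizer that does not center the data),
# LogNormalDistributionLoss suits positive, right-skewed targets.
model = DeepAR.from_dataset(
    training,
    hidden_size=32,
    rnn_layers=2,
    loss=NegativeBinomialDistributionLoss(),
)
```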

Changed

- Internals of encoders and how they store center and scale (115)

Fixed

- Update to PyTorch 1.7 and PyTorch Lightning 1.0.5, which came with breaking changes for CUDA handling and for optimizers (PyTorch Forecasting's Ranger version) (143, 137, 115)

Contributors

- jdb78
- JakeForesey

---
