Inference-tools

Latest version: v0.13.0

0.13.0

Minimum Python version increase
- The minimum Python version for `inference-tools` has been increased to `3.9`.

Backwards-incompatible changes in `inference.mcmc`
- Specification of parameter bounds has been standardised across `HamiltonianChain`, `PcaChain` and `EnsembleSampler`. On initialisation, each now accepts an optional `bounds` keyword argument, which takes either an instance of `inference.mcmc.Bounds` or a sequence of two `numpy.ndarray`. Depending on which sampler is being used, the name of the keyword argument and/or the format in which the bounds are specified may have changed in this release, which will break code written for previous releases (see the sketch after this list).
- Standardised the naming of some instance attributes across the MCMC samplers. These attributes typically would not be accessed directly by a user, but the changes also affect how chain data are saved and loaded, meaning that chain data saved using previous releases may not load correctly.
- Standardised the returned type of the `get_probabilities`, `get_parameter` and `get_sample` methods of MCMC samplers to be a `numpy.ndarray`. Previously, these methods returned either a list or a `numpy.ndarray` depending on which sampler was used.
- Removed the `burn` and `thin` instance attributes of MCMC samplers, which were used to set global burn and thin values. Burn and thin values must now be passed explicitly to `MarkovChain` methods, which avoids the potentially error-prone behaviour of burning / thinning being applied implicitly even when the `burn` and `thin` keyword arguments were not specified. `burn` and `thin` have been added to the `MarkovChain` base class as properties which raise an error when accessed or set, preventing bugs where these attributes are set but silently have no effect because they are no longer used by any methods.
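
As a rough sketch of the standardised interface described above (the toy log-posterior, the `Bounds` keyword names and the exact sampler arguments shown here are assumptions for illustration, not taken from the library documentation):

```python
import numpy as np
from inference.mcmc import HamiltonianChain, Bounds

def log_posterior(theta: np.ndarray) -> float:
    # toy 2D Gaussian log-posterior, used purely for illustration
    return -0.5 * float(np.sum(theta**2))

lower = np.array([-5.0, -5.0])
upper = np.array([5.0, 5.0])

# bounds can be given either as a Bounds instance...
chain = HamiltonianChain(
    posterior=log_posterior,
    start=np.array([0.5, 0.5]),
    bounds=Bounds(lower=lower, upper=upper),
)
# ...or as a sequence of two arrays
chain = HamiltonianChain(
    posterior=log_posterior,
    start=np.array([0.5, 0.5]),
    bounds=(lower, upper),
)

chain.advance(5000)

# burn / thin are now passed explicitly to the accessor methods,
# all of which return a numpy.ndarray regardless of the sampler used
sample = chain.get_sample(burn=1000, thin=5)
probabilities = chain.get_probabilities(burn=1000)
```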

Other changes
- Converted `MarkovChain` to an abstract base class for the MCMC samplers. All sampler classes now inherit from `MarkovChain`, so it can be used for type checking / hinting when expecting one of the samplers as an input (see the sketch after this list). Some additional standardisation across the samplers has allowed more functionality to be moved from the individual classes to the base class, improving code re-use.
- Improved type-hinting across various modules.
- All project metadata has been moved to `pyproject.toml`, and `setup.cfg` has been removed.
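
As an illustration of type hinting against the base class, a hypothetical helper that accepts any of the samplers might look like the following (a minimal sketch; the helper itself is not part of the library):

```python
from inference.mcmc import MarkovChain

def advance_and_sample(chain: MarkovChain, n_steps: int = 10_000):
    """Advance any inference-tools MCMC sampler and return a burned / thinned sample."""
    if not isinstance(chain, MarkovChain):
        raise TypeError("chain must be one of the inference.mcmc samplers")
    chain.advance(n_steps)
    return chain.get_sample(burn=n_steps // 10, thin=2)
```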

0.12.0

- Added a new module `inference.approx` for approximate inference tools.
- It currently contains the `get_conditionals` and `conditional_sample` functions for evaluating and sampling from conditional distributions.

0.11.0

- Improved numerical efficiency of the leapfrog update in `HamiltonianChain`.
- Fixed some errors so that mass-scaling in `HamiltonianChain` now works correctly, and renamed the `inv_mass` keyword argument to `inverse_mass` (see the sketch after this list).
- Improved input validation for `EnsembleSampler`.
- The `inference.gp` and `inference.mcmc` modules were becoming too large, and have now been refactored into sub-packages.
- Support for importing from the old module names `inference.gp_tools` and `inference.pdf_tools` has been removed. The current names `inference.gp` and `inference.pdf` should be used instead.
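
A brief sketch of the renamed keyword argument (the toy anisotropic posterior is a placeholder, and passing a per-parameter array to `inverse_mass` is an assumption here, not confirmed by the release notes):

```python
import numpy as np
from inference.mcmc import HamiltonianChain

def log_posterior(theta: np.ndarray) -> float:
    # toy 2D Gaussian where the first parameter has four times the variance of the second
    return -0.5 * (theta[0] ** 2 / 4.0 + theta[1] ** 2)

chain = HamiltonianChain(
    posterior=log_posterior,
    start=np.zeros(2),
    inverse_mass=np.array([4.0, 1.0]),  # previously the `inv_mass` keyword argument
)
chain.advance(2000)
```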

0.10.0

- Added the `HeteroscedasticNoise` covariance kernel to the `inference.covariance` module, which allows the noise variance of each data point to be varied independently.

0.9.2

- Fixed a bug where the `WhiteNoise` kernel would cause a crash when working with data having more than one dimension.

0.9.1

- Fixed a bug in the `ChangePoint` covariance kernel which was causing `GpRegressor` to incorrectly assess the number of covariance hyper-parameters, and subsequently crash during hyper-parameter optimisation.
