Pyro-ppl

Latest version: v1.9.0

1.1.0

New Features
- [pyro.infer.ReweightedWakeSleep](http://docs.pyro.ai/en/latest/inference_algos.html#pyro.infer.rws.ReweightedWakeSleep) implements the Reweighted Wake Sleep algorithm [(Le et al. 2019)](https://arxiv.org/abs/1805.10469). Contributed by [Siddharth Narayanaswamy](https://github.com/iffsid) and [Tuan Anh Le](https://github.com/tuananhle7).
- [pyro.infer.TraceTMC_ELBO](http://docs.pyro.ai/en/latest/inference_algos.html#pyro.infer.tracetmc_elbo.TraceTMC_ELBO) implements the Tensor Monte Carlo marginal likelihood estimator [(Aitchison 2019)](https://papers.nips.cc/paper/8936-tensor-monte-carlo-particle-methods-for-the-gpu-era.pdf), a generalization of the importance-weighted autoencoder objective.
- [pyro.infer.EnergyDistance](http://docs.pyro.ai/en/latest/inference_algos.html#pyro.infer.energy_distance.EnergyDistance) implements a likelihood-free inference algorithm based on Szekely's energy statistics, a multidimensional generalization of CRPS [(Gneiting & Raftery 2007)](https://www.stat.washington.edu/raftery/Research/PDF/Gneiting2007jasa.pdf).
- `pyro.contrib.cevae` implements the Causal Effect VAE of [(Louizos et al. 2017)](http://papers.nips.cc/paper/7223-causal-effect-inference-with-deep-latent-variable-models.pdf). See [examples/contrib/cevae/synthetic.py](https://github.com/pyro-ppl/pyro/blob/dev/examples/contrib/cevae/synthetic.py) for an end-to-end usage example.
- [pyro.deterministic](http://docs.pyro.ai/en/latest/primitives.html#pyro.deterministic) is a new primitive for recording deterministic values in the trace (see the short example after this list).
- [pyro.nn.to_pyro_module_()](http://docs.pyro.ai/en/latest/nn.html#pyro.nn.module.to_pyro_module_) recursively converts a regular `nn.Module` to a [PyroModule](http://docs.pyro.ai/en/latest/nn.html#pyro.nn.module.PyroModule) in place.

- A default implementation for [Distribution.expand()](http://docs.pyro.ai/en/latest/distributions.html#pyro.distributions.torch_distribution.TorchDistributionMixin.expand) that is available to all Pyro distributions that subclass from `TorchDistribution`, making it easier to create custom distributions.
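
As a minimal sketch of the `pyro.deterministic` primitive mentioned above (the toy model and site names are made up for illustration):

```python
import torch
import pyro
import pyro.distributions as dist

def model(data):
    loc = pyro.sample("loc", dist.Normal(0., 1.))
    # record a derived quantity in the trace; it does not affect the joint density
    pyro.deterministic("loc_squared", loc ** 2)
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, 1.), obs=data)
```

The recorded value then appears in `poutine.trace(model).get_trace(data)` under the name `"loc_squared"`.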

New distributions and transforms
- [MultivariateStudentT](http://docs.pyro.ai/en/latest/distributions.html#multivariatestudentt) is a heavy-tailed multivariate distribution.
- [Stable](http://docs.pyro.ai/en/latest/distributions.html#stable) implements a Lévy α-stable distribution with a reparameterized `.rsample()` method but no `.log_prob()`. It can be fit using [EnergyDistance](http://docs.pyro.ai/en/latest/inference_algos.html#pyro.infer.energy_distance.EnergyDistance) inference (see the sketch after this list).
- [ZeroInflatedNegativeBinomial](http://docs.pyro.ai/en/latest/distributions.html#zeroinflatednegativebinomial) is a distribution for count data.
- [LowerCholeskyAffine](http://docs.pyro.ai/en/latest/distributions.html#lowercholeskyaffine) is a multivariate affine transform.
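
A minimal sketch of drawing reparameterized samples from `Stable` (parameter values are arbitrary; since no `.log_prob()` is provided, fitting would go through `EnergyDistance` rather than an ELBO):

```python
import pyro.distributions as dist

# stability (alpha), skew (beta), scale, loc -- values chosen arbitrarily
d = dist.Stable(stability=1.5, skew=0.0, scale=1.0, loc=0.0)
x = d.rsample(sample_shape=(1000,))  # reparameterized samples, usable in sample-based objectives
```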


Other Changes / Bug Fixes
- `pyro.util.save_visualization` has been deprecated, and dependency on `graphviz` is removed.
- [2197](https://github.com/pyro-ppl/pyro/pull/2197) fixed a naming bug in [PyroModule](http://docs.pyro.ai/en/latest/nn.html#pyro.nn.module.PyroModule) that affected multiple sub-PyroModules with conflicting names.
- [2192](https://github.com/pyro-ppl/pyro/pull/2192) fixed a bug in the planar normalizing flow implementation.
- [2188](https://github.com/pyro-ppl/pyro/issues/2188) made error messages for incorrect arguments to effect handlers more informative.

1.0.0

The objective of this release is to stabilize Pyro's interface and thereby make it safer to build high level components on top of Pyro.

Stability statement
- Behavior of documented APIs will remain stable across minor releases, except for bug fixes and features marked EXPERIMENTAL or DEPRECATED.
- Serialization formats will remain stable across patch releases, but may change across minor releases (e.g. if you save a model in 1.0.0, it will be safe to load it in 1.0.1, but not in 1.1.0).
- Undocumented APIs, features marked EXPERIMENTAL or DEPRECATED, and anything in `pyro.contrib` may change at any time (though we aim for stability).
- All deprecated features throw a `FutureWarning` and specify possible work-arounds. Features marked as deprecated will not be maintained, and are likely to be removed in a future release.
- If you want more stability for a particular feature, [contribute](https://github.com/pyro-ppl/pyro/blob/dev/CONTRIBUTING.md) a unit test.

New features
- [pyro.infer.Predictive](http://docs.pyro.ai/en/1.0.0/inference_algos.html#pyro.infer.predictive.Predictive) is a new utility for serving models, supporting jit tracing and serialization (see the sketch after this list).
- [pyro.distributions.transforms](http://docs.pyro.ai/en/1.0.0/distributions.html#transforms) has many new transforms, and includes helper functions to easily create a variety of normalizing flows. The transforms library has also been reorganized.
- [pyro.contrib.timeseries](http://docs.pyro.ai/en/1.0.0/contrib.timeseries.html) is an experimental new module with fast Gaussian Process inference for univariate and multivariate time series and state space models.
- [pyro.nn.PyroModule](http://docs.pyro.ai/en/1.0.0/nn.html#module-pyro.nn.module) is an experimental new interface that adds Pyro effects to an `nn.Module`. `PyroModule` is already used internally by `AutoGuide`, `EasyGuide`, `pyro.contrib.gp`, `pyro.contrib.timeseries`, and elsewhere.
- `FoldedDistribution` is a new distribution factory, essentially equivalent to applying `AbsTransform()` via `TransformedDistribution` but providing a `.log_prob()` method.
- A new tutorial illustrates the usage of [pyro.contrib.oed](http://docs.pyro.ai/en/1.0.0/contrib.oed.html) in the context of adaptive election polling.
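
A minimal, self-contained sketch of serving a model with `Predictive` (toy model and training loop; assumes the 1.x import paths such as `pyro.infer.autoguide`):

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO, Predictive
from pyro.infer.autoguide import AutoDiagonalNormal
from pyro.optim import Adam

def model(data=None):
    loc = pyro.sample("loc", dist.Normal(0., 1.))
    with pyro.plate("data", 100):
        return pyro.sample("obs", dist.Normal(loc, 1.), obs=data)

guide = AutoDiagonalNormal(model)
data = model()  # simulate a toy dataset from the prior

svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
for _ in range(100):
    svi.step(data)

# draw posterior-predictive samples; the result is a dict keyed by sample-site name
predictive = Predictive(model, guide=guide, num_samples=200)
samples = predictive()
print(samples["obs"].shape)
```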

Breaking changes

- Autoguides have slightly changed interfaces:
  - `AutoGuide` and `EasyGuide` are now `nn.Module`s and can be serialized separately from the param store. This enables serving via [torch.jit.trace_module](https://pytorch.org/docs/1.0.0/jit.html#torch.jit.trace_module).
  - The `Auto*Normal` family of autoguides now have `init_scale` arguments, and `init_loc_fn` has better support. Autoguides no longer support initialization by writing directly to the param store.
- Many transforms have been renamed to enforce a consistent interface, such as the renaming of `InverseAutoregressiveFlow` to `AffineAutoregressive`.
- `pyro.generic` has been moved to a separate project [pyroapi](https://github.com/pyro-ppl/pyro-api).
- [poutine.do](http://docs.pyro.ai/en/1.0.0/poutine.html#pyro.poutine.do) has slightly changed semantics to follow Single World Intervention Graph semantics.
- `pyro.contrib.glmm` has been moved to `pyro.contrib.oed.glmm` and will eventually be replaced by [BRMP](https://github.com/pyro-ppl/brmp).
- Existing `DeprecationWarning`s have been promoted to `FutureWarning`s.

Deprecated features
- `pyro.random_module`: The `pyro.random_module` primitive has been deprecated in favor of [PyroModule](http://docs.pyro.ai/en/1.0.0/nn.html#pyro.nn.module.PyroModule), which can be used to create Bayesian modules from `torch.nn.Module` instances (a short sketch follows this list).
- `SVI.run`: The `SVI.run` method is deprecated; users are encouraged to call the [.step](http://docs.pyro.ai/en/1.0.0/inference_algos.html#pyro.infer.svi.SVI.step) method directly to run inference. For drawing samples from the posterior distribution, we recommend the [Predictive](http://docs.pyro.ai/en/1.0.0/inference_algos.html#module-pyro.infer.predictive.Predictive) utility class, or using the `trace` and `replay` effect handlers directly.
- `TracePredictive`: The `TracePredictive` class is deprecated in favor of [Predictive](http://docs.pyro.ai/en/1.0.0/inference_algos.html#module-pyro.infer.predictive.Predictive), which can be used to gather samples from the posterior and predictive distributions in SVI and MCMC.
- `mcmc.predictive`: This utility function has been absorbed into the more general [Predictive](http://docs.pyro.ai/en/1.0.0/inference_algos.html#module-pyro.infer.predictive.Predictive) class.
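
As a rough sketch of the recommended replacement for `pyro.random_module`: a Bayesian linear model built with `PyroModule` and `PyroSample` (the priors and sizes here are illustrative):

```python
import torch.nn as nn
import pyro.distributions as dist
from pyro.nn import PyroModule, PyroSample

class BayesianLinear(PyroModule):
    def __init__(self, in_features, out_features):
        super().__init__()
        # wrap an ordinary nn.Linear as a PyroModule and place priors on its parameters
        self.linear = PyroModule[nn.Linear](in_features, out_features)
        self.linear.weight = PyroSample(
            dist.Normal(0., 1.).expand([out_features, in_features]).to_event(2))
        self.linear.bias = PyroSample(
            dist.Normal(0., 1.).expand([out_features]).to_event(1))

    def forward(self, x):
        # weight and bias are sampled from their priors on each invocation
        return self.linear(x)
```

Such a module behaves like a regular `nn.Module`, but its parameters become Pyro sample sites, so it can be used directly inside a model and paired with an autoguide.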

0.5.1

Patches `0.5.0` with the following bug fixes:
- Removes an f-string, which is only supported in Python 3.6+, so that Python 3.5 remains supported.
- Fixes an incompatibility with recent tqdm releases that broke multiple progress bars in notebook environments (affecting MCMC with multiple chains).

0.5.0

New features

- [pyro.factor](http://docs.pyro.ai/en/stable/primitives.html#pyro.factor) adds an arbitrary log-probability factor to a probabilistic model (see the sketch after this list).
- Conditional MADE Autoregressive Network available in [pyro.nn](http://docs.pyro.ai/en/stable/nn.html#pyro.nn.auto_reg_nn.ConditionalAutoRegressiveNN).
- Tutorial on [adaptive experiment design](http://pyro.ai/examples/working_memory.html) for studying working memory.
- KL divergence for `Delta` and `Independent` distributions.
- A fast `n log(n)` implementation of the [Continuous Ranked Probability Score (CRPS)](https://www.stat.washington.edu/raftery/Research/PDF/Gneiting2007jasa.pdf) for sample sets: [pyro.ops.stats.crps_empirical](http://docs.pyro.ai/en/dev/ops.html#pyro.ops.stats.crps_empirical)
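
A minimal sketch of `pyro.factor` contributing an extra term to a model's log joint density (the penalty shown is made up for illustration):

```python
import torch
import pyro
import pyro.distributions as dist

def model(data):
    z = pyro.sample("z", dist.Normal(0., 1.))
    # add an arbitrary log-probability term; here a soft constraint pulling z toward the data mean
    pyro.factor("soft_constraint", -0.5 * (z - data.mean()) ** 2)
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(z, 1.), obs=data)
```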



Code changes and bug fixes

- Moved `pyro.generic` to a separate [pyro-api](https://github.com/pyro-ppl/pyro-api) package.
- Minor changes to ensure compatibility with [pyro-api](https://github.com/pyro-ppl/pyro-api), a generic modeling and inference API for dispatch to different Pyro backends.
- Improved numerical stability of the `MixtureOfDiagNormals` distribution using a `logsumexp` operation.
- Improved the U-turn check condition in NUTS for better sampling efficiency.
- Reorganized the `constraints` and `transforms` modules to match `torch.distributions`.
- Fixed AutoGuide initialization strategies, resolving a bug in `init_to_median`.

0.4.1

**New Features:**

- `*HMM.filter()` methods (e.g. [DiscreteHMM.filter()](http://docs.pyro.ai/en/dev/distributions.html#pyro.distributions.DiscreteHMM.filter)) for forecasting (see the sketch below).
- Support for Independent(Normal) observations in GaussianHMM.
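
A rough sketch of the `.filter()` method on `DiscreteHMM` (shapes and parameters are arbitrary, and the time length is assumed to be inferred from the observation sequence; the posterior over the final hidden state is the starting point for forecasts):

```python
import torch
import pyro.distributions as dist

state_dim, obs_dim, num_steps = 3, 4, 10

initial_logits = torch.zeros(state_dim)                 # uniform initial state
transition_logits = torch.randn(state_dim, state_dim)   # unnormalized transition logits
observation_dist = dist.Categorical(logits=torch.randn(state_dim, obs_dim))

hmm = dist.DiscreteHMM(initial_logits, transition_logits, observation_dist)

data = torch.randint(0, obs_dim, (num_steps,))  # a toy observation sequence
posterior = hmm.filter(data)                    # posterior over the final hidden state
```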

**Fixes:**
- Fix for HMC / NUTS to handle errors arising from numerical issues when computing Cholesky decomposition.

0.4

Pyro 0.2 supports PyTorch 0.4. See PyTorch [release notes](https://github.com/pytorch/pytorch/releases/tag/v0.4.0) for comprehensive changes. The most important change is that `Variable` and `Tensor` have been merged, so you can now simplify:

```diff
- pyro.param("my_param", Variable(torch.ones(1), requires_grad=True))
+ pyro.param("my_param", torch.ones(1))
```


PyTorch distributions

PyTorch's [torch.distributions](http://pytorch.org/docs/0.4.0/distributions.html) library is now Pyro’s main source for distribution implementations. The Pyro team helped create this library by collaborating with Adam Paszke, Alican Bozkurt, Vishwak Srinivasan, Rachit Singh, Brooks Paige, Jan-Willem Van De Meent, and many other contributors and reviewers. See the [Pyro wrapper docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pytorch-distributions) for wrapped PyTorch distributions and the [Pyro distribution docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro-distributions) for Pyro-specific distributions.

Constrained parameters

Parameters can now be constrained easily using notation like:

```python
from torch.distributions import constraints

pyro.param("sigma", torch.ones(10), constraint=constraints.positive)
```

See the [torch.distributions.constraints](http://pytorch.org/docs/0.4.0/distributions.html#module-torch.distributions.constraints) library and all of our Pyro [tutorials](http://pyro.ai/examples) for example usage.

Arbitrary tensor shapes

Arbitrary tensor shapes and batching are now supported in Pyro. This includes support for nested batching via `iarange` and support for batched multivariate distributions. The `iarange` context and `irange` generator are now much more flexible and can be combined freely. With power comes complexity, so check out our [tensor shapes tutorial](http://pyro.ai/examples/tensor_shapes.html) (hint: you’ll need to use [`.expand_by()`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro.distributions.TorchDistributionMixin.expand_by) and [`.independent()`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro.distributions.TorchDistributionMixin.independent)).
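
A minimal sketch of combining these pieces with the 0.2-era API (names and sizes are arbitrary; `.independent()` was later renamed `.to_event()`, and `iarange` later became `plate`):

```python
import torch
import pyro
import pyro.distributions as dist

def model():
    # a 3-dimensional normal replicated across a batch of 10 data points
    d = dist.Normal(torch.zeros(3), torch.ones(3)).expand_by([10]).independent(1)
    with pyro.iarange("data", 10):
        x = pyro.sample("x", d)  # x.shape == (10, 3)
    return x
```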

Parallel enumeration

Discrete enumeration can now be parallelized. This makes it especially easy and cheap to enumerate out discrete latent variables. Check out the [Gaussian Mixture Model tutorial](http://pyro.ai/examples/gmm.html) for example usage. To use parallel enumeration, you'll need to first configure sites, then use the `TraceEnum_ELBO` loss:

```python
def model(...):
    ...

@config_enumerate(default="parallel")  # configures sites for enumeration
def guide(...):
    with pyro.iarange("foo", 10):
        x = pyro.sample("x", dist.Bernoulli(0.5).expand_by([10]))
        ...

svi = SVI(model, guide, Adam({}),
          loss=TraceEnum_ELBO(max_iarange_nesting=1))  # specify the enumerating loss
svi.step()
```


Markov chain Monte Carlo via HMC and NUTS

This release adds experimental support for gradient-based Markov Chain Monte Carlo inference via Hamiltonian Monte Carlo [`pyro.infer.HMC`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/mcmc.html#module-pyro.infer.mcmc.hmc) and the No U-Turn Sampler [`pyro.infer.NUTS`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/mcmc.html#module-pyro.infer.mcmc.nuts). See the [docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/mcmc.html) and [example](https://github.com/uber/pyro/blob/dev/examples/baseball.py) for details.
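
A rough sketch of running NUTS; note that it uses the present-day `pyro.infer.MCMC` interface rather than the exact 0.2-era API, and the model and data are toy placeholders:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import MCMC, NUTS

def model(data):
    loc = pyro.sample("loc", dist.Normal(0., 10.))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, 1.), obs=data)

data = torch.randn(50) + 3.0
kernel = NUTS(model)
mcmc = MCMC(kernel, num_samples=500, warmup_steps=200)
mcmc.run(data)
samples = mcmc.get_samples()  # dict of posterior samples, e.g. samples["loc"]
```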

Gaussian Processes

A new Gaussian Process module [pyro.contrib.gp](http://pyro-ppl.readthedocs.io/en/0.2.0-release/contrib.gp.html) provides a framework for learning with Gaussian Processes. To get started, take a look at our [Gaussian Process Tutorial](http://pyro.ai/examples/gp.html). Thanks to Du Phan for this extensive contribution!
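
A minimal sketch of setting up GP regression with `pyro.contrib.gp` (toy data; kernel, noise, and sizes are illustrative):

```python
import torch
import pyro.contrib.gp as gp

X = torch.linspace(0., 5., 20)
y = torch.sin(X) + 0.1 * torch.randn(20)

kernel = gp.kernels.RBF(input_dim=1)
gpr = gp.models.GPRegression(X, y, kernel, noise=torch.tensor(0.1))
# gpr exposes .model and .guide methods, so it can be trained with SVI
```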

Automatic guide generation

Guides can now be created automatically with the [pyro.contrib.autoguide](http://pyro-ppl.readthedocs.io/en/0.2.0-release/contrib.autoguide.html) library. These work only for models with simple structure (no `irange` or `iarange`), and are easy to use:

```python
from pyro.contrib.autoguide import AutoDiagonalNormal

def model(...):
    ...

guide = AutoDiagonalNormal(model)
svi = SVI(model, guide, ...)
```


Validation

Model validation is now available via three toggles:
```python
pyro.enable_validation()
pyro.infer.enable_validation()
# Turns on validation for PyTorch distributions.
pyro.distributions.enable_validation()
```

These can also be used temporarily as context managers:

```python
# Run with validation in the first step.
with pyro.validation_enabled(True):
    svi.step()

# Avoid validation on subsequent steps (may miss NaN errors).
with pyro.validation_enabled(False):
    for i in range(1000):
        svi.step()
```


Rejection sampling variational inference (RSVI)

We've added support for vectorized rejection sampling in a new `Rejector` distribution. See [docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro.distributions.Rejector) or [`RejectionStandardGamma` class](https://github.com/uber/pyro/blob/0.4.0/pyro/distributions/testing/rejection_gamma.py#L12) for example usage.
