Starting with this release, NumPyro can be installed alongside the latest `jax` and `jaxlib` releases (their version restrictions have been relaxed). In addition, NumPyro now uses the default JAX platform, so if you installed JAX with GPU/TPU support, those devices will be used by default.
## New Features
- New distributions: [SoftLaplace](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.continuous.SoftLaplace), [Weibull](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.continuous.Weibull), [BetaProportion](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.continuous.BetaProportion), [NegativeBinomial](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.conjugate.NegativeBinomial), [NegativeBinomial2](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.conjugate.NegativeBinomial2), [ZeroInflatedDistribution](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.discrete.ZeroInflatedDistribution), [ZeroInflatedPoisson](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.discrete.ZeroInflatedPoisson), [ZeroInflatedNegativeBinomial2](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.conjugate.ZeroInflatedNegativeBinomial2), [FoldedDistribution](http://num.pyro.ai/en/latest/distributions.html#numpyro.distributions.distribution.FoldedDistribution)
- Support for DeepMind's [Optax](https://github.com/deepmind/optax) optimizers in [SVI](https://num.pyro.ai/en/latest/svi.html#numpyro.infer.svi.SVI)
- New ELBO objective: [TraceGraph_ELBO](http://num.pyro.ai/en/latest/svi.html#numpyro.infer.elbo.TraceGraph_ELBO) for non-reparameterized latent variables (e.g. discrete latent variables)
- A new wrapper [NestedSampler](https://num.pyro.ai/en/latest/contrib.html#numpyro.contrib.nested_sampling.NestedSampler) to leverage the nested sampling package [jaxns](https://github.com/Joshuaalbert/jaxns) for NumPyro models
- Implement `cdf` and `icdf` methods for many distributions
- New [cond](http://num.pyro.ai/en/latest/primitives.html#numpyro.contrib.control_flow.cond) primitive.
- New [infer_discrete](http://num.pyro.ai/en/latest/funsor.html#numpyro.contrib.funsor.discrete.infer_discrete) handler to sample discrete sites under enumeration. Check out the [annotation example](http://num.pyro.ai/en/latest/examples/annotation.html) for example usage.
- A structured mass matrix can be specified via the [dense_mass](http://num.pyro.ai/en/latest/mcmc.html#numpyro.infer.hmc.NUTS) argument of the HMC/NUTS constructor #963
- New examples:
  + [Thompson sampling for Bayesian optimization with GPs](http://num.pyro.ai/en/latest/examples/thompson_sampling.html)
  + [Latent Dirichlet Allocation for topic modeling](http://num.pyro.ai/en/latest/examples/prodlda.html): a great example to illustrate the usage of Flax/Haiku in NumPyro
## Enhancements and Bug Fixes
- Documentation and examples have been greatly enhanced to make features more accessible
- Fix chain detection for various CPU device strings #1077
- Fix AutoNormal's `quantiles` method for models with non-scalar latent sites #1066
- Fix LocScaleReparam with `center=1` #1059
- Enhance auto guides to support models with deterministic sites #1022
- Support mutable states in Flax and Haiku modules #1016
- Fix a bug in auto guides that occurs when the guide is used in Predictive #1013
- Support decorator syntax for effect handlers #1009
- Implement sparse Poisson log probability #1003
- Support `total_count=0` in Multinomial distribution #1000
- Add a flag to control the mass matrix regularization behavior in mass matrix adaptation #998
- Add experimental Dockerfiles #996
- Allow setting the max tree depth of the NUTS sampler during the warmup phase #984
- Fix mixed-up dimensions in the `ExpandedDistribution.sample` method #972
- MCMC objects can now be pickled #968
This release is made possible by great contributions and feedback from the Pyro community: ahoho, kpj, gustavehug, AndrewCSQ, jatentaki, tcbegley, dominikstrb, justinrporter, dirmeier, irustandi, MarcoGorelli, lumip, and many others. Thank you!