Edward

Latest version: v1.3.5


1.0.9

- There is now one data object to rule them all: a Python dictionary. (see 156)
- Distribution objects can be of arbitrary shape. For example, a 5 x 2 matrix of Normal random variables is declared with `x = Normal([5, 2])`. (see 138)
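
The batch-shape idea above can be sketched outside Edward. This is a NumPy illustration (not Edward's API): a 5 x 2 "matrix of Normal random variables" is a batch of independent Normal factors whose draws and log-density share the `(5, 2)` shape.

```python
import numpy as np

# A 5 x 2 "matrix of Normal random variables" is a batch of 10
# independent standard Normal factors whose draws have shape (5, 2).
rng = np.random.default_rng(0)
x = rng.standard_normal(size=(5, 2))

# the joint log-density factorizes over all 5 * 2 entries
log_prob = (-0.5 * np.log(2 * np.pi) - 0.5 * x**2).sum()
print(x.shape, log_prob)
```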

Documentation
- All of Edward is documented. (see 148)
- Edward now follows [TensorFlow style guidelines](https://www.tensorflow.org/versions/r0.9/how_tos/style_guide.html).
- A tutorial on black box variational inference is available. (see 153)
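
The black box variational inference tutorial centers on the score-function gradient estimator. Here is a minimal NumPy sketch of that estimator (illustrative code, not Edward's implementation), fitting the mean of a Normal(mu, 1) variational family to an unnormalized Normal(3, 1) target:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_p(z):
    # unnormalized target density: Normal(3, 1)
    return -0.5 * (z - 3.0) ** 2

def log_q(z, mu):
    # unnormalized variational family: Normal(mu, 1)
    return -0.5 * (z - mu) ** 2

mu, lr = 0.0, 0.05
for _ in range(500):
    z = rng.normal(mu, 1.0, size=200)
    score = z - mu                      # d/dmu of log q(z; mu)
    grad = np.mean((log_p(z) - log_q(z, mu)) * score)
    mu += lr * grad                     # stochastic ascent on the ELBO
print(mu)  # converges to ~3.0
```

The gradient uses only evaluations of `log_p`, which is what makes the method "black box": no model-specific derivations are required.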

Miscellaneous
- We now use the special functions and their automatic differentiation available in TensorFlow, e.g., `tf.lgamma`, `tf.digamma`, `tf.lbeta`.
- Sampling via NumPy/SciPy is done using a `tf.py_func` wrapper, speeding up sampling and avoiding internal overhead from the previous solution. (see 160)
- Sampling via reparameterizable distributions now follows the convention of `tf.contrib.distributions`. (see 161)
- Fixed a bug where `Variational` made a class-level copy of its `layers` object. (see 119)
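
The reparameterization convention mentioned above writes a sample as a deterministic transform of parameter-free noise. A NumPy sketch of that transform for the Normal (illustrative, not the library code):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 1.5, 0.5

# z ~ Normal(mu, sigma) expressed as z = mu + sigma * eps with
# eps ~ Normal(0, 1), so gradients with respect to mu and sigma
# can flow through the sample itself.
eps = rng.standard_normal(100_000)
z = mu + sigma * eps
print(z.mean(), z.std())  # close to 1.5 and 0.5
```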

1.0.8

- distributions can now be specified with parameters, simplifying the use of inference networks, alternative parameterizations, and much of the internals for developing new inference algorithms; see 126
- TensorFlow session is now a global variable and can simply be accessed with `get_session()`; see 117
- added Laplace approximation
- added a utility function to calculate the Hessian of TensorFlow tensors
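
The Laplace approximation pairs naturally with the Hessian utility: approximate a density by a Gaussian centered at its mode, with variance given by the inverse Hessian of the negative log-density at that mode. A self-contained SciPy sketch with illustrative numbers (not Edward's implementation), on a target whose mode and curvature are known analytically:

```python
import numpy as np
from scipy.optimize import minimize_scalar

a, b = 4.0, 2.0

def neg_log_p(z):
    # negative unnormalized Gamma(a + 1, b) log-density, for z > 0
    return -(a * np.log(z) - b * z)

# 1. find the mode of the density
mode = minimize_scalar(neg_log_p, bounds=(1e-6, 10.0), method="bounded").x

# 2. curvature at the mode via a central finite difference
h = 1e-4
hess = (neg_log_p(mode + h) - 2 * neg_log_p(mode) + neg_log_p(mode - h)) / h**2

# 3. Gaussian approximation: Normal(mode, 1 / hess)
var = 1.0 / hess
print(mode, var)  # analytically: mode = a / b = 2.0, var = a / b**2 = 1.0
```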

1.0.7

- hotfix to get `from . import models` working

1.0.6

- website with revamped documentation: http://edwardlib.org. See details in 108
- criticism of probabilistic models with `ed.evaluate()` and `ed.ppc()`. See details in 107
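
`ed.ppc()` performs posterior predictive checks, whose underlying recipe can be sketched in NumPy (illustrative code, not the `ed.ppc` API): compare a test statistic on the observed data against its distribution under data replicated from the fitted model.

```python
import numpy as np

rng = np.random.default_rng(3)

y = rng.normal(0.0, 1.0, size=50)        # "observed" data
mu_hat, sigma_hat = y.mean(), y.std()    # fitted model: Normal(mu_hat, sigma_hat)

# test statistic on the observed data and on replicated datasets
T_obs = y.max()
T_rep = np.array([rng.normal(mu_hat, sigma_hat, size=y.size).max()
                  for _ in range(1000)])

# predictive p-value: how often the replicated statistic is at least
# as extreme as the observed one; values near 0 or 1 signal misfit
p_value = (T_rep >= T_obs).mean()
print(p_value)
```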

1.0.5

- enabled Keras as neural network specification
- samples in the variational model can now leverage TensorFlow-based samplers, not only SciPy-based samplers
- let user optionally specify `sess` when using `inference`
- mean-field variational inference can now take advantage of analytically tractable KL terms for standard normal priors
- data can additionally be a list of `np.ndarray`s or a list of `tf.placeholder`s
- added mixture density network as example
- enabled dimensions of distribution output to match with input dimensions
- renamed `log_gamma`, `log_beta`, and `multivariate_log_beta` to `lgamma` and `lbeta` to follow the convention in the TensorFlow API
- let `PointMass` be a variational factor
- fixed `Multinomial` variational factor
- added continuous integration for unit tests
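
The analytically tractable KL term mentioned above is the closed-form divergence from a Normal(mu, sigma^2) factor to the standard normal prior. A NumPy sketch verifying the formula against a Monte Carlo estimate:

```python
import numpy as np

def kl_to_std_normal(mu, sigma):
    # KL(Normal(mu, sigma^2) || Normal(0, 1)), in closed form
    return 0.5 * (mu**2 + sigma**2 - 1.0 - np.log(sigma**2))

# Monte Carlo check: KL is E_q[log q(z) - log p(z)] under q = Normal(mu, sigma)
rng = np.random.default_rng(4)
mu, sigma = 0.7, 1.3
z = rng.normal(mu, sigma, size=200_000)
log_q = -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((z - mu) / sigma) ** 2
log_p = -0.5 * np.log(2 * np.pi) - 0.5 * z**2

analytic = kl_to_std_normal(mu, sigma)
estimate = np.mean(log_q - log_p)
print(analytic, estimate)  # the two agree closely
```

Using the closed form in the ELBO removes the Monte Carlo noise of this term entirely.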

1.0.4

- interface-wise, you now import models (probability models or variational models) using

  ```python
  from edward.models import PythonModel, Variational, Normal
  ```

  By default you can also do something like `ed.StanModel(model_file=model_file)`.
- variational distributions now default to initializing with only one factor

