GPflow

Latest version: v2.9.1

1.4.1

Features, bug fixes and improvements
------------------------------------

* Minor performance improvement: pass the Cholesky factor to `gauss_kl` (933)
* Documentation revamp:
  * Organize documentation and notebooks (924)
  * GPflow glossary document (923)
* Fix AutoBuild metaclass for method signatures (907)
* CircleCI instead of TravisCI (892)
* Broadcasting conditionals via `tf.broadcast_to` (895)
* Support TensorFlow >= 1.12 (877)
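
The `gauss_kl` change can be sketched in plain NumPy: a KL divergence between q = N(q_mu, q_sqrt q_sqrtᵀ) and p = N(0, K) that accepts the Cholesky factor of K directly, so a caller that already holds it avoids a redundant decomposition. The function and argument names below are illustrative, not GPflow's actual signature.

```python
import numpy as np

def gauss_kl(q_mu, q_sqrt, K_cholesky):
    """KL[q || p] for q = N(q_mu, q_sqrt @ q_sqrt.T) and p = N(0, K),
    where K is supplied via its lower Cholesky factor L = chol(K)."""
    M = q_mu.shape[0]
    L = K_cholesky
    # Mahalanobis term: q_mu^T K^{-1} q_mu = ||L^{-1} q_mu||^2
    alpha = np.linalg.solve(L, q_mu)
    mahalanobis = float(alpha @ alpha)
    # trace term: tr(K^{-1} S) = ||L^{-1} q_sqrt||_F^2
    trace = np.sum(np.linalg.solve(L, q_sqrt) ** 2)
    # log-determinants read off the triangular factors' diagonals
    logdet_K = 2.0 * np.sum(np.log(np.diag(L)))
    logdet_S = 2.0 * np.sum(np.log(np.abs(np.diag(q_sqrt))))
    return 0.5 * (trace + mahalanobis - M + logdet_K - logdet_S)
```

Passing `q_sqrt = K_cholesky` with a zero mean makes q equal to p, so the KL should come out as zero, which is a handy sanity check.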

1.3.0

Main features and improvements
---------------------------------

* Cleaning up stationary kernel implementations: now defined in terms of `K_r` or `K_r2`. (827)
* Support broadcasting over arbitrarily many leading dimensions for kernels and `conditional`. (829)
* Analytic expectation of the cross-covariance between different RBF kernels. (754)
* New MixedKernelSeparateMof feature class for multi-output GPs. (830)
* `sample_conditional` now returns the mean and variance as well as samples, and can generate more than one sample. (836)
* Support monitoring with `ScipyOptimizer`. (856)
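
The stationary-kernel cleanup (827) follows a pattern worth sketching: the shared base computes scaled squared distances once, and each concrete kernel only overrides `K_r2` (or `K_r`). This is a minimal NumPy sketch of that idea, not GPflow's implementation.

```python
import numpy as np

class RBF:
    """Stationary kernel defined through K_r2, i.e. as a function of the
    scaled squared distance r^2 = ||x - x'||^2 / lengthscale^2."""

    def __init__(self, variance=1.0, lengthscale=1.0):
        self.variance = variance
        self.lengthscale = lengthscale

    def K_r2(self, r2):
        # The only kernel-specific piece; other stationary kernels would
        # override just this method on a shared base class.
        return self.variance * np.exp(-0.5 * r2)

    def K(self, X, X2=None):
        X2 = X if X2 is None else X2
        Xs, X2s = X / self.lengthscale, X2 / self.lengthscale
        # squared distances via ||a - b||^2 = ||a||^2 - 2 a.b + ||b||^2
        r2 = (np.sum(Xs**2, 1)[:, None] - 2.0 * Xs @ X2s.T
              + np.sum(X2s**2, 1)[None, :])
        return self.K_r2(np.maximum(r2, 0.0))  # clip tiny negative round-off
```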

Bug fixes
---------

* Fix bug in ndiag_mc for multi-dimensional kwargs. (813)
* Fix parameter.trainable to be a property. (814)
* Remove references to six module. (816)
* Fix `tf.control_dependencies` in likelihoods. (821)
* Fix `active_dims` for slice type. (840)

1.2.0

Main features and improvements
---------------------------------

- Added `SoftMax` likelihood (799)
- Added likelihoods where expectations are evaluated with Monte Carlo, `MonteCarloLikelihood` (799)
- GPflow monitor refactoring, check `monitor-tensorboard.ipynb` for details (792)
- Sped up testing on Travis using utility functions for configuration in notebooks (789)
- Support Python 3.5.2 in typing checks (Ubuntu 16.04 default python3) (787)
- Corrected scaling in the Student's t likelihood variance (777)
- Removed jitter before taking the Cholesky of the covariance in the NatGrad optimizer (768)
- Added GPflow logger. Created option for setting logger level in `gpflowrc` (764)
- Improved quadrature for likelihoods. Unified quadrature method introduced - `ndiagquad` (736), (747)
- Added support for multi-output GPs, check `multioutput.ipynb` for details (724)
  * Multi-output features
  * Multi-output kernels
  * Multi-dispatch for conditional
  * Multi-dispatch for Kuu and Kuf
- Support Exponential distribution as prior (717)
- Added notebook demonstrating advanced usage of GPflow, such as combining a GP with a neural network (712)
- Minibatch shape is `None` by default to allow dynamic change of data size (704)
- The epsilon parameter of the Robustmax likelihood is now trainable (635)
- GPflow model saver (660)
  * Supports native GPflow models and provides an interface for defining custom savers for users' models
  * Stores GPflow structures and pythonic types as numpy structured arrays and serializes them using HDF5
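
The unified quadrature idea behind `ndiagquad` (736, 747) can be illustrated with Gauss-Hermite quadrature: approximate an expectation of an arbitrary function under independent Gaussians, elementwise. The name and signature below are simplified assumptions, not GPflow's actual API.

```python
import numpy as np

def ndiagquad(func, H, mu, var):
    """Elementwise E_{f ~ N(mu, var)}[func(f)] via H-point Gauss-Hermite
    quadrature, computed independently for each (mu, var) pair."""
    mu, var = np.asarray(mu), np.asarray(var)
    x, w = np.polynomial.hermite.hermgauss(H)  # nodes/weights for e^{-x^2}
    # change of variables f = mu + sqrt(2 var) x maps the Gaussian
    # expectation onto the Gauss-Hermite weight function
    f = mu[..., None] + np.sqrt(2.0 * var[..., None]) * x
    return (func(f) * w).sum(axis=-1) / np.sqrt(np.pi)
```

Gauss-Hermite is exact for polynomial integrands, so the first two moments make a good test of the implementation.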

Bug fixes
---------

- Fixed bug in `params_as_tensors_for` (751)
- Fixed GPflow SciPy optimizer to pass options to _actual_ scipy optimizer correctly (738)

1.1.1

* pytest is now part of the dependency list

1.1.0

Main features and improvements

* Actions framework: a new approach to running optimization that gives full control over the optimization schedule.
* Natural gradient optimizer.
* New expectations framework.
* Analytic kernel expectations for the RBF, Linear and Sum kernels.
* Randomness in default GPflow naming.
* Pretty pandas printing for GPflow models and parameters.
* `backward_tensor` methods for transformations.
* `gauss_kl` works with `K` matrices of shape `L x M x M`.
* `params_as_tensors_for` can accept multiple parameterized objects.

Bug fixes

* Adam-like optimizers work with TensorFlow 1.5 and higher.
* The free vector for the LowerTriangular transform wasn't being handled consistently. The LowerTriangular transform no longer expects the 'free form' variable to have a flat shape.
* The LowerTriangular transformation no longer squeezes all dimensions.
* Corrected the PCA implementation.
* Gradient of `sqrt(RBF(Z, Z))` is no longer NaN.
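
The convention the LowerTriangular fixes settle on (a flat 'free form' vector packed into, and unpacked from, a lower-triangular matrix) can be sketched in NumPy; the helper names here are illustrative, not GPflow's.

```python
import numpy as np

def lower_triangular_forward(free):
    """Unpack a flat free vector of length M(M+1)/2 into an M x M
    lower-triangular matrix."""
    M = int((np.sqrt(8 * free.size + 1) - 1) // 2)
    L = np.zeros((M, M))
    L[np.tril_indices(M)] = free
    return L

def lower_triangular_backward(L):
    """Pack a lower-triangular matrix back into its flat free vector,
    the exact inverse of the forward map."""
    return L[np.tril_indices(L.shape[0])]
```

Handling both directions with the same index set is what keeps the shapes consistent, which is the property the bug fixes above restore.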

1.0.0

Main features and improvements

* Clear model design, aligned with the tree-like structure of GPflow models.
* GPflow trainable parameters are no longer packed into one TensorFlow variable.
* Integrating bare TensorFlow and Keras models with GPflow is now straightforward.
* A GPflow parameter wraps multiple tensors: an unconstrained variable, a constrained tensor and a prior tensor.
* Parameters are built into the TensorFlow graph immediately: as soon as you create a parameter instance, the necessary tensors are created in the default graph.
* New AutoFlow implementation: the `autoflow` decorator replaces the old mechanism.
* GPflow optimizers match TensorFlow optimizer names. For example, `gpflow.train.GradientDescentOptimizer` mimics `tf.train.GradientDescentOptimizer`; they even share the same instantiation signature.
* Native support for SciPy optimizers: `gpflow.train.ScipyOptimizer`.
* Advanced HMC implementation: `gpflow.train.HMC`. It works entirely within TensorFlow's memory scope.
* A tensor-conversion decorator and context manager for cases where the user needs to implicitly convert parameters to TensorFlow tensors: `gpflow.params_as_tensors` and `gpflow.params_as_tensors_for`.
* GPflow parameters and parameterized objects provide convenient methods and properties for building and initializing their tensors. Check `initializables`, `initializable_feeds`, `feeds` and other properties and methods.
* Shapes of parameters and data holders can change without rebuilding the TensorFlow graph.
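
The "one parameter, multiple tensors" design can be sketched in NumPy with softplus as the positivity transform: the optimizer updates an unconstrained number freely, while the model always reads a valid constrained value. Class and attribute names are illustrative, not GPflow's API.

```python
import numpy as np

class PositiveParameter:
    """One logical parameter backed by two representations: an
    unconstrained value an optimizer may push anywhere on the real line,
    and a constrained (positive) value obtained through softplus."""

    def __init__(self, value):
        # backward transform: softplus^{-1}(y) = log(e^y - 1)
        self.unconstrained = np.log(np.expm1(value))

    @property
    def value(self):
        # forward transform: softplus(x) = log(1 + e^x) > 0 for any real x
        return np.log1p(np.exp(self.unconstrained))
```

However far the unconstrained representation is driven during optimization, the constrained value stays strictly positive, which is the point of the split.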


Note

_This release is not backward compatible with previous versions._
