GPyTorch

Latest version: v1.11

Page 3 of 6

1.4.0

This release includes many major speed improvements, especially to Kronecker-factorized multi-output models.

Performance improvements
- Major speed improvements for Kronecker product multitask models (1355, 1430, 1440, 1469, 1477)
- Unwhitened VI speed improvements (1487)
- SGPR speed improvements (1493)
- Large scale exact GP speed improvements (1495)
- Random Fourier feature speed improvements (1446, 1493)

New Features
- Dirichlet Classification likelihood (1484) - based on Milios et al. (NeurIPS 2018)
- `MultivariateNormal` objects have a `base_sample_shape` attribute for low-rank/degenerate distributions (1502)
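For context on the new Dirichlet classification likelihood: Milios et al. (2018) recast classification as regression by moment-matching each label's Dirichlet concentration parameters to log-normal distributions. A minimal pure-Python sketch of that label transformation (illustrative only; the function name and defaults here are not GPyTorch's API):

```python
import math

def dirichlet_targets(labels, num_classes, alpha_eps=0.01):
    """Turn integer class labels into per-class (mean, variance) regression
    targets, following the moment-matching construction of Milios et al.
    (2018).  Illustrative sketch, not GPyTorch's internals."""
    means, variances = [], []
    for y in labels:
        mu_row, var_row = [], []
        for c in range(num_classes):
            alpha = alpha_eps + (1.0 if c == y else 0.0)
            sigma2 = math.log(1.0 / alpha + 1.0)   # matched log-normal variance
            mu = math.log(alpha) - sigma2 / 2.0    # matched log-normal mean
            mu_row.append(mu)
            var_row.append(sigma2)
        means.append(mu_row)
        variances.append(var_row)
    return means, variances
```

Roughly speaking, one GP output is then fit per class against these targets with heteroskedastic noise, so the observed class gets a tight target and the others get diffuse ones.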

New documentation
- Tutorial for designing your own kernels (1421)

Debugging utilities
- Better naming conventions for `AdditiveKernel` and `ProductKernel` (1488)
- `gpytorch.settings.verbose_linalg` context manager for seeing what linalg routines are run (1489)
- Unit test improvements (1430, 1437)

Bug Fixes
- `inverse_transform` is applied to the initial values of constraints (1482)
- `psd_safe_cholesky` obeys `cholesky_jitter` settings (1476)
- Fix scaling issue with priors on variational models (1485)

Breaking changes
- `MultitaskGaussianLikelihoodKronecker` (deprecated) is fully incorporated in `MultitaskGaussianLikelihood` (1471)

1.3.1

Fixes
- Spectral mixture kernels work with SKI (1392)
- Natural gradient descent is compatible with batch-mode GPs (1416)
- Fix prior mean in whitened SVGP (1427)
- `RBFKernelGrad` no longer performs in-place operations (1389)
- Fixes to `ConstantDiagLazyTensor` (1381, 1385)

Documentation
- Include example notebook for multitask Deep GPs (1410)
- Documentation updates (1408, 1434, 1385, 1393)

Performance
- `KroneckerProductLazyTensor` objects use root decompositions of their children (1394)
- SGPR now uses Woodbury formula and matrix determinant lemma (1356)
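For background on the SGPR change: the matrix determinant lemma, det(A + uvᵀ) = det(A)·(1 + vᵀA⁻¹u), lets a low-rank update to a cheap-to-invert matrix be handled without refactorizing the full matrix. A small self-contained check of the rank-1, 2x2 case (pure Python, for illustration only):

```python
def det2(A):
    """Determinant of a 2x2 matrix given as nested lists."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inv2(A):
    """Inverse of a 2x2 matrix."""
    d = det2(A)
    return [[A[1][1] / d, -A[0][1] / d],
            [-A[1][0] / d, A[0][0] / d]]

def det_rank1_update(A, u, v):
    """det(A + u v^T) via the matrix determinant lemma:
    det(A) * (1 + v^T A^{-1} u)."""
    Ainv = inv2(A)
    Ainv_u = [Ainv[0][0] * u[0] + Ainv[0][1] * u[1],
              Ainv[1][0] * u[0] + Ainv[1][1] * u[1]]
    return det2(A) * (1.0 + v[0] * Ainv_u[0] + v[1] * Ainv_u[1])

# Direct check: form A + u v^T explicitly and compare determinants.
A = [[4.0, 1.0], [1.0, 3.0]]
u, v = [1.0, 2.0], [0.5, -1.0]
M = [[A[i][j] + u[i] * v[j] for j in range(2)] for i in range(2)]
assert abs(det_rank1_update(A, u, v) - det2(M)) < 1e-9
```

The Woodbury identity plays the analogous role for the inverse, which is why SGPR's marginal likelihood only needs factorizations of inducing-point-sized matrices.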

Other
- Delta distributions have an `arg_constraints` attribute (1422)
- Cholesky factorization now takes optional diagonal noise argument (1377)

1.3.0

This release primarily focuses on performance improvements and adds variational models based on contour integral quadrature.

Major Features

Variational models with contour integral quadrature
- Add an MVM-based approach to whitened variational inference (1372)
- This is based on the work in [Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization](https://arxiv.org/abs/2006.11267)

Minor Features

Performance improvements
- Kronecker product models compute a deterministic logdet (faster than the Lanczos-based logdet) (1332)
- Improve efficiency of `KroneckerProductLazyTensor` symeig method (1338)
- Improve SGPR efficiency (1356)
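The deterministic Kronecker logdet exploits the identity logdet(A ⊗ B) = m·logdet(A) + n·logdet(B) for A of size n×n and B of size m×m, which avoids iterative Lanczos estimation entirely. A tiny pure-Python check for the 2x2 case (illustrative only):

```python
import math

def logdet2(A):
    """Log-determinant of a 2x2 positive-definite matrix."""
    return math.log(A[0][0] * A[1][1] - A[0][1] * A[1][0])

def kron_logdet(A, B):
    """logdet(A kron B) for 2x2 A and B via the closed form
    logdet(A kron B) = m * logdet(A) + n * logdet(B), here n = m = 2.
    This identity is what lets Kronecker-structured models skip an
    iterative (Lanczos-style) logdet."""
    return 2 * logdet2(A) + 2 * logdet2(B)

# Check against the Kronecker product formed explicitly (diagonal case,
# where the 4x4 determinant is just the product of diagonal entries):
A, B = [[2.0, 0.0], [0.0, 3.0]], [[5.0, 0.0], [0.0, 7.0]]
direct = math.log((2 * 5) * (2 * 7) * (3 * 5) * (3 * 7))
assert abs(kron_logdet(A, B) - direct) < 1e-9
```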

Other improvements
- `SpectralMixtureKernel` accepts arbitrary batch shapes (1350)
- Variational models pass around arbitrary `**kwargs` to the `forward` method (1339)
- `gpytorch.settings` context managers keep track of their default value (1347)
- Kernel objects can be pickled (1336)

Bug Fixes
- Fix `requires_grad` checks in `gpytorch.inv_matmul` (1322)
- Fix reshaping bug for batch independent multi-output GPs (1368)
- `ZeroMean` accepts a `batch_shape` argument (1371)
- Various doc fixes/improvements (1327, 1343, 1315, 1373)

1.2.1

This release includes the following fixes:

- Fix caching issues with variational GPs (1274, 1311)
- Ensure that constraint bounds are properly cast to floating point types (1307)
- Fix bug with broadcasting multitask multivariate normal shapes (1312)
- Bypass KeOps for small/rectangular kernels (1319)
- Fix issues with `eigenvectors=False` in `LazyTensor.symeig` (1283)
- Fix issues with fixed-noise LazyTensor preconditioner (1299)
- Doc fixes (1275, 1301)

1.2.0

Major Features

New variational and approximate models
This release features a number of new capabilities for approximate GP models:

- Linear model of coregionalization for variational multitask GPs (1180)
- Deep Sigma Point Process models (1193)
- Mean-field decoupled (MFD) models from "Parametric Gaussian Process Regressors" (Jankowiak et al., 2020) (1179)
- Implement natural gradient descent (1258)
- Additional non-conjugate likelihoods (Beta, StudentT, Laplace) (1211)

New kernels
We have added a number of new specialty kernels:

- `gpytorch.kernels.GaussianSymmetrizedKLKernel` for performing regression with uncertain inputs (1186)
- `gpytorch.kernels.RFFKernel` (random Fourier features kernel) (1172, 1233)
- `gpytorch.kernels.SpectralDeltaKernel` (a parametric kernel for patterns/extrapolation) (1231)
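For intuition on `RFFKernel`: random Fourier features approximate a stationary kernel as a finite dot product of cosine features, k(x, y) ≈ φ(x)ᵀφ(y) with φⱼ(x) = √(2/D)·cos(wⱼx + bⱼ), where the wⱼ are drawn from the kernel's spectral density. A pure-Python sketch for a 1-D RBF kernel with unit lengthscale (illustrative only, not GPyTorch's implementation; the approximation error shrinks like 1/√D):

```python
import math, random

def rff_features(x, weights, biases):
    """Map a scalar input through D random cosine features:
    phi_j(x) = sqrt(2 / D) * cos(w_j * x + b_j)."""
    D = len(weights)
    return [math.sqrt(2.0 / D) * math.cos(w * x + b)
            for w, b in zip(weights, biases)]

def rff_kernel(x, y, weights, biases):
    """Approximate RBF kernel value as a dot product of random features."""
    fx = rff_features(x, weights, biases)
    fy = rff_features(y, weights, biases)
    return sum(a * b for a, b in zip(fx, fy))

random.seed(0)
D = 5000
weights = [random.gauss(0.0, 1.0) for _ in range(D)]      # RBF spectral samples
biases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(D)]

x, y = 0.3, 1.1
exact = math.exp(-0.5 * (x - y) ** 2)                     # RBF, unit lengthscale
approx = rff_kernel(x, y, weights, biases)
assert abs(exact - approx) < 0.1
```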

More scalable sampling
- Large-scale sampling with contour integral quadrature from Pleiss et al., 2020 (1194)

Minor features
- Ability to set amount of jitter added when performing Cholesky factorizations (1136)
- Improve scalability of `KroneckerProductLazyTensor` (1199, 1208)
- Improve speed of preconditioner (1224)
- Add `symeig` and `svd` methods to `LazyTensor`s (1105)
- Add `TriangularLazyTensor` for Cholesky methods (1102)
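On the configurable Cholesky jitter: the usual pattern (as in `psd_safe_cholesky`) is to retry the factorization with progressively larger diagonal jitter when the matrix is numerically non-PSD. A pure-Python 2x2 sketch of that retry loop (function names are illustrative, not GPyTorch's code):

```python
import math

def cholesky2(A):
    """Lower-triangular Cholesky factor of a 2x2 matrix.
    Raises ValueError if the matrix is not positive definite."""
    if A[0][0] <= 0:
        raise ValueError("not positive definite")
    l00 = math.sqrt(A[0][0])
    l10 = A[1][0] / l00
    d = A[1][1] - l10 * l10
    if d <= 0:
        raise ValueError("not positive definite")
    return [[l00, 0.0], [l10, math.sqrt(d)]]

def safe_cholesky(A, max_tries=5, initial_jitter=1e-8):
    """Retry Cholesky with increasing diagonal jitter, in the spirit of
    psd_safe_cholesky.  Illustrative sketch, not GPyTorch's code."""
    jitter = 0.0
    for attempt in range(max_tries):
        try:
            J = [[A[0][0] + jitter, A[0][1]],
                 [A[1][0], A[1][1] + jitter]]
            return cholesky2(J), jitter
        except ValueError:
            jitter = initial_jitter * (10 ** attempt)
    raise ValueError("matrix is not PSD even with jitter")

# A singular (rank-1) matrix: plain Cholesky fails, jitter succeeds.
A = [[1.0, 1.0], [1.0, 1.0]]
L, used = safe_cholesky(A)
assert used > 0.0
```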

Bug fixes
- Fix initialization code for `gpytorch.kernels.SpectralMixtureKernel` (1171)
- Fix bugs with LazyTensor addition (1174)
- Fix issue with loading smoothed box priors (1195)
- Throw warning when variances are not positive, check for valid correlation matrices (1237, 1241, 1245)
- Fix sampling issues with Pyro integration (1238)

1.1.1

Major features

- GPyTorch is compatible with PyTorch 1.5 (latest release)
- Several bugs with task-independent multitask models are fixed (1110)
- Task-dependent multitask models are more batch-mode compatible (1087, 1089, 1095)

Minor features

- `gpytorch.priors.MultivariateNormalPrior` has an expand method (1018)
- Better broadcasting for batched inducing point models (1047)
- `LazyTensor` repeating works with rectangular matrices (1068)
- `gpytorch.kernels.ScaleKernel` inherits the `active_dims` property from its base kernel (1072)
- Fully Bayesian models can be saved (1076)

Bug Fixes

- `gpytorch.kernels.PeriodicKernel` is batch-mode compatible (1012)
- Fix `gpytorch.priors.MultivariateNormalPrior` expand method (1018)
- Fix indexing issues with `LazyTensors` (1029)
- Fix constants with `gpytorch.mlls.GammaRobustVariationalELBO` (1038, 1053)
- Prevent doubly-computing derivatives of kernel inputs (1042)
- Fix initialization issues with `gpytorch.kernels.SpectralMixtureKernel` (1052)
- Fix stability of `gpytorch.variational.DeltaVariationalStrategy`

