BoTorch

Latest version: v0.11.0


0.3.1

Maintenance Release

New Features
* Constrained Multi-Objective tutorial (493)
* Multi-fidelity Knowledge Gradient tutorial (509)
* Support for batch qMC sampling (510)
* New `evaluate` method for `qKnowledgeGradient` (515)
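
The batch qMC sampling support above builds batches of quasi-Monte Carlo draws rather than i.i.d. uniform samples, so each batch covers the space evenly. BoTorch's actual implementation is tensor-based (built on `torch.quasirandom.SobolEngine`); the pure-Python sketch below, with hypothetical helpers `van_der_corput` and `batch_qmc`, only illustrates the low-discrepancy idea behind it.

```python
import math

def van_der_corput(n: int, base: int = 2) -> list[float]:
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    points = []
    for i in range(n):
        k, x, denom = i, 0.0, 1.0
        while k > 0:
            denom *= base
            k, digit = divmod(k, base)
            x += digit / denom  # radical-inverse: reflect digits about the point
        points.append(x)
    return points

def batch_qmc(batch_shape: tuple[int, ...], n: int) -> list[list[float]]:
    """Draw n low-discrepancy points per batch element, as disjoint slices
    of one longer sequence (a simplification of real batch qMC)."""
    total = math.prod(batch_shape)
    seq = van_der_corput(total * n)
    return [seq[k * n:(k + 1) * n] for k in range(total)]

print(van_der_corput(4))  # [0.0, 0.5, 0.25, 0.75]
```

Unlike pseudo-random draws, consecutive points deliberately fill the gaps left by earlier ones, which is what reduces variance in qMC acquisition estimates.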

Compatibility
* Require PyTorch >=1.6 (535)
* Require GPyTorch >=1.2 (535)
* Remove deprecated `botorch.gen` module (532)

Bug fixes
* Fix bad backward-indexing of task_feature in `MultiTaskGP` (485)
* Fix bounds in constrained Branin-Currin test function (491)
* Fix max_hv for C2DTLZ2 and make Hypervolume always return a float (494)
* Fix bug in `draw_sobol_samples` that did not use the proper effective dimension (505)
* Fix constraints for `q>1` in `qExpectedHypervolumeImprovement` (c80c4fdb0f83f0e4f12e4ec4090d0478b1a8b532)
* Only use feasible observations in partitioning for `qExpectedHypervolumeImprovement`
in `get_acquisition_function` (523)
* Improved GPU compatibility for `PairwiseGP` (537)
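
The feasibility fix in (523) amounts to filtering observations before the hypervolume partitioning is built. A minimal pure-Python sketch, assuming the common BoTorch convention that a point is feasible iff every constraint value is `<= 0` (the function name and list-based types are illustrative, not BoTorch's internal API):

```python
def filter_feasible(obj: list[list[float]],
                    con: list[list[float]]) -> list[list[float]]:
    """Keep only objective vectors whose constraint values are all <= 0."""
    return [y for y, c in zip(obj, con) if all(ci <= 0.0 for ci in c)]

objectives = [[1.0, 2.0], [3.0, 1.0], [2.0, 2.5]]
constraints = [[-0.5], [0.2], [-1.0]]  # second point violates the constraint
print(filter_feasible(objectives, constraints))  # [[1.0, 2.0], [2.0, 2.5]]
```

Partitioning only the feasible points prevents infeasible observations from inflating the dominated region used by `qExpectedHypervolumeImprovement`.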

Performance Improvements
* Reduce memory footprint in `qExpectedHypervolumeImprovement` (522)
* Add `(q)ExpectedHypervolumeImprovement` to the set of nonnegative functions
for better initialization (496)

Other changes
* Support batched `best_f` in `qExpectedImprovement` (487)
* Allow returning the full tree of solutions in `OneShotAcquisitionFunction` (488)
* Added `construct_inputs` class method to models to programmatically construct the
inputs to the constructor from a standardized `TrainingData` representation
(477, 482, 3621198d02195b723195b043e86738cd5c3b8e40)
* Acquisition function constructors now accept catch-all `**kwargs` options
(478, e5b69352954bb10df19a59efe9221a72932bfe6c)
* Use `psd_safe_cholesky` in `qMaxValueEntropy` for better numerical stability (518)
* Added `WeightedMCMultiOutputObjective` (81d91fd2e115774e561c8282b724457233b6d49f)
* Add ability to specify `outcomes` to all multi-output objectives (524)
* Return optimization output in `info_dict` for `fit_gpytorch_scipy` (534)
* Use `setuptools_scm` for versioning (539)

0.3.0

Multi-Objective Bayesian Optimization

New Features
* Multi-Objective Acquisition Functions (466)
* q-Expected Hypervolume Improvement
* q-ParEGO
* Analytic Expected Hypervolume Improvement with auto-differentiation
* Multi-Objective Utilities (466)
* Pareto Computation
* Hypervolume Calculation
* Box Decomposition algorithm
* Multi-Objective Test Functions (466)
* Suite of synthetic test functions for multi-objective, constrained optimization
* Multi-Objective Tutorial (468)
* Abstract ConstrainedBaseTestProblem (454)
* Add `optimize_acqf_list` method for sequentially and greedily optimizing one
candidate from each provided acquisition function (d10aec911b241b208c59c192beb9e4d572a092cd)
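
The two core utilities in (466), Pareto computation and hypervolume calculation, can be sketched in a few lines of pure Python for the two-objective maximization case. BoTorch's own versions (`is_non_dominated`, `Hypervolume`) are tensor-based and handle arbitrary dimension; this sketch just shows what they compute.

```python
def pareto_front(points: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Return the non-dominated points (maximization of both objectives)."""
    return [p for p in points
            if not any(q != p and q[0] >= p[0] and q[1] >= p[1] for q in points)]

def hypervolume_2d(front: list[tuple[float, float]],
                   ref: tuple[float, float]) -> float:
    """Area dominated by `front` and bounded below by the reference point."""
    pts = sorted(front, reverse=True)  # descending in the first objective
    hv = 0.0
    for i, (x, y) in enumerate(pts):
        x_next = pts[i + 1][0] if i + 1 < len(pts) else ref[0]
        hv += (x - x_next) * (y - ref[1])  # vertical strip for this point
    return hv

front = pareto_front([(1.0, 3.0), (2.0, 2.0), (0.5, 0.5), (3.0, 1.0)])
print(front)                             # (0.5, 0.5) is dominated and dropped
print(hypervolume_2d(front, (0.0, 0.0)))  # 6.0
```

The strip-sweep in `hypervolume_2d` relies on the input being a Pareto front, where the second objective increases as the first decreases.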

Bug fixes
* Fixed re-arranging mean in MultiTask MO models (450).

Other changes
* Move gpt_posterior_settings into models.utils (449)
* Allow specifications of batch dims to collapse in samplers (457)
* Remove outcome transform before model-fitting for sequential model fitting
in MO models (458)

0.2.5

Bugfix Release

Bug fixes
* Fixed issue with broken wheel build (444).

Other changes
* Changed code style to use absolute imports throughout (443).

0.2.4

Bugfix Release

Bug fixes
* There was a mysterious issue with the 0.2.3 wheel on pypi, where part of the
`botorch/optim/utils.py` file was not included, which resulted in an `ImportError` for
many central components of the code. Interestingly, the source dist (built with the
same command) did not have this issue.
* Preserve order in ChainedOutcomeTransform (440).

New Features
* Utilities for estimating the feasible volume under outcome constraints (437).

0.2.3

Pairwise GP for Preference Learning, Sampling Strategies

Compatibility
* Require PyTorch >=1.5 (423).
* Require GPyTorch >=1.1.1 (425).

New Features
* Add `PairwiseGP` for preference learning with pair-wise comparison data (388).
* Add `SamplingStrategy` abstraction for sampling-based generation strategies, including
`MaxPosteriorSampling` (i.e. Thompson Sampling) and `BoltzmannSampling` (218, 407).
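
The idea behind `MaxPosteriorSampling` (Thompson sampling) is simple: draw one sample from the model's posterior over a candidate set and pick the candidate where that sample is largest. A minimal sketch, where the independent Gaussian "posterior" stands in for a real BoTorch model's posterior (the function name and signature are illustrative):

```python
import random

def max_posterior_sampling(candidates, means, stds, rng):
    """Pick the candidate maximizing a single posterior draw."""
    draws = [rng.gauss(m, s) for m, s in zip(means, stds)]
    best = max(range(len(candidates)), key=lambda i: draws[i])
    return candidates[best]

rng = random.Random(0)
cands = ["x1", "x2", "x3"]
# With zero posterior uncertainty each draw equals its mean, so the
# candidate with the highest mean is always selected.
print(max_posterior_sampling(cands, [0.1, 0.9, 0.5], [0.0, 0.0, 0.0], rng))
# -> x2
```

With nonzero posterior variance the argmax is random, which is exactly what gives Thompson sampling its exploration behavior.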

Deprecations
* The existing `botorch.gen` module is moved to `botorch.generation.gen` and imports
from `botorch.gen` will raise a warning (an error in the next release) (218).

Bug fixes
* Fix & update a number of tutorials (394, 398, 393, 399, 403).
* Fix CUDA tests (404).
* Fix sobol maxdim limitation in `prune_baseline` (419).

Other changes
* Better stopping criteria for stochastic optimization (392).
* Improve numerical stability of `LinearTruncatedFidelityKernel` (409).
* Allow batched `best_f` in `qExpectedImprovement` and `qProbabilityOfImprovement`
(411).
* Introduce new logger framework (412).
* Faster indexing in some situations (414).
* More generic `BaseTestProblem` (9e604fe2188ac85294c143d249872415c4d95823).

0.2.2

Require PyTorch 1.4 and Python 3.7; new features for active learning and
multi-fidelity optimization, plus a number of bug fixes.

Compatibility
* Require PyTorch >=1.4 (379).
* Require Python >=3.7 (378).

New Features
* Add `qNegIntegratedPosteriorVariance` for Bayesian active learning (377).
* Add `FixedNoiseMultiFidelityGP`, analogous to `SingleTaskMultiFidelityGP` (386).
* Support `scalarize_posterior` for m>1 and q>1 posteriors (374).
* Support `subset_output` method on multi-fidelity models (372).
* Add utilities for sampling from simplex and hypersphere (369).
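
The standard constructions behind the simplex and hypersphere sampling utilities in (369) can be sketched in pure Python: spacings of sorted uniforms give a uniform point on the probability simplex, and a normalized Gaussian draw gives a uniform point on the unit sphere. BoTorch's `sample_simplex` and `sample_hypersphere` are the tensor-based equivalents; the signatures below are illustrative.

```python
import math
import random

def sample_simplex(d: int, rng: random.Random) -> list[float]:
    """Uniform point on the (d-1)-simplex via spacings of sorted uniforms."""
    cuts = sorted(rng.random() for _ in range(d - 1))
    edges = [0.0] + cuts + [1.0]
    return [b - a for a, b in zip(edges, edges[1:])]

def sample_hypersphere(d: int, rng: random.Random) -> list[float]:
    """Uniform point on the unit (d-1)-sphere via a normalized Gaussian draw."""
    g = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(x * x for x in g))
    return [x / norm for x in g]

rng = random.Random(0)
w = sample_simplex(3, rng)      # nonnegative weights summing to 1
u = sample_hypersphere(3, rng)  # unit-norm direction
```

Simplex draws like `w` are what random-scalarization methods such as q-ParEGO use to weight the objectives on each iteration.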

Bug fixes
* Fix `TestLoader` local test discovery (376).
* Fix batch-list conversion of `SingleTaskMultiFidelityGP` (370).
* Validate tensor args before checking input scaling for more
informative error messages (368).
* Fix flaky `qNoisyExpectedImprovement` test (362).
* Fix test function in closed-loop tutorial (360).
* Fix num_output attribute in BoTorch/Ax tutorial (355).

Other changes
* Require output dimension in `MultiTaskGP` (383).
* Update code of conduct (380).
* Remove deprecated `joint_optimize` and `sequential_optimize` (363).

