BoTorch

Latest version: v0.11.0


0.6.0

Compatibility
* Require PyTorch >=1.9 (1011).
* Require GPyTorch >=1.6 (1011).

New Features
* New `ApproximateGPyTorchModel` wrapper for various (variational) approximate GP models (1012).
* New `SingleTaskVariationalGP` stochastic variational Gaussian Process model (1012).
* Support for Multi-Output Risk Measures (906, 965).
* Introduce `ModelList` and `PosteriorList` (829).
* New Constraint Active Search tutorial (1010).
* Add additional multi-objective optimization test problems (958).

Other Changes
* Add `covar_module` as an optional input of `MultiTaskGP` models (941).
* Add `min_range` argument to `Normalize` transform to prevent division by zero (931).
* Add initialization heuristic for acquisition function optimization that samples around best points (987).
* Update initialization heuristic to perturb a subset of the dimensions of the best points if the dimension is > 20 (988).
* Modify `apply_constraints` utility to work with multi-output objectives (994).
* Short-cut `t_batch_mode_transform` decorator on non-tensor inputs (991).
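The `min_range` guard added to the `Normalize` transform above can be illustrated with a minimal pure-Python sketch (function name and signature are hypothetical, not the BoTorch API):

```python
def normalize(x, bounds, min_range=1e-8):
    """Min-max scale x to [0, 1], clamping the range to min_range so
    a degenerate (constant) dimension does not divide by zero."""
    lo, hi = bounds
    rng = max(hi - lo, min_range)
    return (x - lo) / rng

# A constant dimension (lo == hi) no longer produces inf/nan:
print(normalize(3.0, (3.0, 3.0)))  # 0.0 rather than a ZeroDivisionError
```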

Performance Improvements
* Use lazy covariance matrix in `BatchedMultiOutputGPyTorchModel.posterior` (976).
* Fast low-rank Cholesky updates for `qNoisyExpectedHypervolumeImprovement` (747, 995, 996).

Bug Fixes
* Update error handling to new PyTorch linear algebra messages (940).
* Avoid test failures on Ampere devices (944).
* Fixes to the `Griewank` test function (972).
* Handle empty base_sample_shape in `Posterior.rsample` (986).
* Handle `NotPSDError` and hitting `maxiter` in `fit_gpytorch_model` (1007).
* Use TransformedPosterior for subclasses of GPyTorchPosterior (983).
* Propagate `best_f` argument to `qProbabilityOfImprovement` in input constructors (f5a5f8b6dc20413e67c6234e31783ac340797a8d).

0.5.1

Compatibility
* Require GPyTorch >=1.5.1 (928).

New Features
* Add `HigherOrderGP` composite Bayesian Optimization tutorial notebook (864).
* Add Multi-Task Bayesian Optimization tutorial (867).
* Add new multi-objective test problems (876).
* Add `PenalizedMCObjective` and `L1PenaltyObjective` (913).
* Add a `ProximalAcquisitionFunction` for regularizing new candidates towards previously generated ones (919, 924).
* Add a `Power` outcome transform (925).
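The idea behind the `ProximalAcquisitionFunction` above is to multiply a base acquisition value by a smooth weight that decays with distance from the previously generated candidate. A minimal sketch of that weighting (all names here are hypothetical, not the BoTorch API):

```python
import math

def proximal_weight(x, x_last, beta):
    """Squared-exponential weight centered at the previous candidate."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, x_last))
    return math.exp(-0.5 * sq_dist / beta ** 2)

def proximal_acqf(base_acqf, x, x_last, beta=1.0):
    """Bias the raw acquisition value toward points near x_last."""
    return base_acqf(x) * proximal_weight(x, x_last, beta)

ei = lambda x: 1.0  # stand-in for any base acquisition value
far = proximal_acqf(ei, (5.0, 5.0), (0.0, 0.0), beta=1.0)
near = proximal_acqf(ei, (0.1, 0.0), (0.0, 0.0), beta=1.0)
# near > far: candidates close to the previous point are preferred
```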

Bug Fixes
* Batch mode fix for `HigherOrderGP` initialization (856).
* Improve `CategoricalKernel` precision (857).
* Fix an issue with `qMultiFidelityKnowledgeGradient.evaluate` (858).
* Fix an issue with transforms in `HigherOrderGP` (889).
* Fix initial candidate generation when parameter constraints are on different device (897).
* Fix bad in-place op in `_generate_unfixed_lin_constraints` (901).
* Fix an input transform bug in `fantasize` call (902).
* Fix outcome transform bug in `batched_to_model_list` (917).

Other Changes
* Make variance optional for `TransformedPosterior.mean` (855).
* Support transforms in `DeterministicModel` (869).
* Support `batch_shape` in `RandomFourierFeatures` (877).
* Add a `maximize` flag to `PosteriorMean` (881).
* Ignore categorical dimensions when validating training inputs in `MixedSingleTaskGP` (882).
* Refactor `HigherOrderGPPosterior` for memory efficiency (883).
* Support negative weights for minimization objectives in `get_chebyshev_scalarization` (884).
* Move `train_inputs` transforms to `model.train/eval` calls (894).
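The negative-weight support in `get_chebyshev_scalarization` above can be sketched with an augmented Chebyshev scalarization in pure Python; this is a simplified illustration of the idea (the library version normalizes objectives and is batched), with hypothetical names:

```python
def chebyshev_scalarization(weights, Y, alpha=0.05):
    """Augmented Chebyshev scalarization (sketch): the worst weighted
    objective plus a small augmentation term. A negative weight turns
    the corresponding objective into a minimization target."""
    weighted = [w * y for w, y in zip(weights, Y)]
    return min(weighted) + alpha * sum(weighted)

# Maximize objective 0, minimize objective 1:
s_good = chebyshev_scalarization([1.0, -1.0], [2.0, 1.0])
s_bad = chebyshev_scalarization([1.0, -1.0], [2.0, 3.0])
# s_good > s_bad: a lower second objective scores higher
```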

0.5.0

Compatibility
* Require PyTorch >=1.8.1 (832).
* Require GPyTorch >=1.5 (848).
* Changes to how input transforms are applied: `transform_inputs` is applied in `model.forward` if the model is in `train` mode, otherwise it is applied in the `posterior` call (819, 835).

New Features
* Improved multi-objective optimization capabilities:
  * `qNoisyExpectedHypervolumeImprovement` acquisition function that improves on `qExpectedHypervolumeImprovement` in terms of tolerating observation noise and speeding up computation for large `q`-batches (797, 822).
  * `qMultiObjectiveMaxValueEntropy` acquisition function (913aa0e510dde10568c2b4b911124cdd626f6905, 760).
  * Heuristic for reference point selection (830).
  * `FastNondominatedPartitioning` for Hypervolume computations (699).
  * `DominatedPartitioning` for partitioning the dominated space (726).
  * `BoxDecompositionList` for handling box decompositions of varying sizes (712).
  * Direct, batched dominated partitioning for the two-outcome case (739).
  * `get_default_partitioning_alpha` utility providing a heuristic for selecting the approximation level for partitioning algorithms (793).
  * New method for computing Pareto Frontiers with less memory overhead (842, 846).
* New `qLowerBoundMaxValueEntropy` acquisition function (a.k.a. GIBBON), a lightweight variant of Multi-fidelity Max-Value Entropy Search using a Determinantal Point Process approximation (724, 737, 749).
* Support for discrete and mixed input domains:
  * `CategoricalKernel` for categorical inputs (771).
  * `MixedSingleTaskGP` for mixed search spaces (containing both categorical and ordinal parameters) (772, 847).
  * `optimize_acqf_discrete` for optimizing acquisition functions over fully discrete domains (777).
  * Extend `optimize_acqf_mixed` to allow batch optimization (804).
* Support for robust / risk-aware optimization:
  * Risk measures for robust / risk-averse optimization (821).
  * `AppendFeatures` transform (820).
  * `InputPerturbation` input transform for risk-averse BO with implementation errors (827).
  * Tutorial notebook for Bayesian Optimization of risk measures (823).
  * Tutorial notebook for risk-averse Bayesian Optimization under input perturbations (828).
* More scalable multi-task modeling and sampling:
  * `KroneckerMultiTaskGP` model for efficient multi-task modeling for block-design settings (all tasks observed at all inputs) (637).
  * Support for transforms in Multi-Task GP models (681).
  * Posterior sampling based on Matheron's rule for Multi-Task GP models (841).
* Various changes to simplify and streamline integration with Ax:
  * Handle non-block designs in `TrainingData` (794).
  * Acquisition function input constructor registry (788, 802, 845).
* Random Fourier Feature (RFF) utilities for fast (approximate) GP function sampling (750).
* `DelaunayPolytopeSampler` for fast uniform sampling from (simple) polytopes (741).
* Add `evaluate` method to `ScalarizedObjective` (795).
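Several of the multi-objective utilities above build on Pareto non-domination. A quadratic-time pure-Python sketch of the core check, assuming maximization (the library's `is_non_dominated` is vectorized, batched, and much faster):

```python
def is_non_dominated(Y):
    """Return a mask of Pareto-optimal points, assuming maximization."""
    def dominates(a, b):
        # a dominates b if it is at least as good everywhere, better somewhere
        return all(ai >= bi for ai, bi in zip(a, b)) and any(
            ai > bi for ai, bi in zip(a, b)
        )
    return [not any(dominates(b, a) for b in Y if b is not a) for a in Y]

Y = [(1.0, 3.0), (2.0, 2.0), (0.5, 0.5), (3.0, 1.0)]
print(is_non_dominated(Y))  # [True, True, False, True]
```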

Bug Fixes
* Handle the case when all features are fixed in `optimize_acqf` (770).
* Pass `fixed_features` to initial candidate generation functions (806).
* Handle batch empty pareto frontier in `FastPartitioning` (740).
* Handle empty pareto set in `is_non_dominated` (743).
* Handle edge case of no or a single observation in `get_chebyshev_scalarization` (762).
* Fix an issue in `gen_candidates_torch` that caused problems with acquisition functions using fantasy models (766).
* Fix `HigherOrderGP` `dtype` bug (728).
* Normalize before clamping in `Warp` input warping transform (722).
* Fix bug in GP sampling (764).

Other Changes
* Modify input transforms to support one-to-many transforms (819, 835).
* Make initial conditions for acquisition function optimization honor parameter constraints (752).
* Perform optimization only over unfixed features if `fixed_features` is passed (839).
* Refactor Max Value Entropy Search Methods (734).
* Use Linear Algebra functions from the `torch.linalg` module (735).
* Use PyTorch's `Kumaraswamy` distribution (746).
* Improved capabilities and some bugfixes for batched models (723, 767).
* Pass `callback` argument to `scipy.optim.minimize` in `gen_candidates_scipy` (744).
* Modify behavior of `X_pending` in multi-objective acquisition functions (747).
* Allow multi-dimensional batch shapes in test functions (757).
* Utility for converting batched multi-output models into batched single-output models (759).
* Explicitly raise `NotPSDError` in `_scipy_objective_and_grad` (787).
* Make `raw_samples` optional if `batch_initial_conditions` is passed (801).
* Use powers of 2 in qMC docstrings & examples (812).
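The powers-of-2 recommendation above comes from the balance properties of base-2 digital sequences: a Sobol sequence stratifies dyadic intervals exactly only at sample sizes `2**m`. This can be seen with the base-2 radical-inverse (van der Corput) sequence, which is essentially the first Sobol dimension (a pure-Python sketch):

```python
def van_der_corput(n):
    """First n points of the base-2 radical-inverse sequence."""
    points = []
    for i in range(n):
        x, denom = 0.0, 1.0
        while i:
            denom *= 2.0
            x += (i % 2) / denom
            i //= 2
        points.append(x)
    return points

# With n = 2**3 points, each dyadic interval [k/8, (k+1)/8) holds
# exactly one point -- the balance that motivates powers of 2.
pts = van_der_corput(8)
print(sorted(int(p * 8) for p in pts))  # [0, 1, 2, 3, 4, 5, 6, 7]
```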

0.4.0

Compatibility
* Require PyTorch >=1.7.1 (714).
* Require GPyTorch >=1.4 (714).

New Features
* `HigherOrderGP` - High-Order Gaussian Process (HOGP) model for high-dimensional output regression (631, 646, 648, 680).
* `qMultiStepLookahead` acquisition function for general look-ahead optimization approaches (611, 659).
* `ScalarizedPosteriorMean` and `project_to_sample_points` for more advanced MFKG functionality (645).
* Large-scale Thompson sampling tutorial (654, 713).
* Tutorial for optimizing mixed continuous/discrete domains (application to multi-fidelity KG with discrete fidelities) (716).
* `GPDraw` utility for sampling from (exact) GP priors (655).
* Add `X` as optional arg to call signature of `MCAcquisitionObjective` (487).
* `OSY` synthetic test problem (679).
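The `GPDraw` utility above draws function samples from an exact GP prior. The underlying idea is a multivariate-normal draw through a Cholesky factor of the kernel matrix, sketched here in NumPy with hypothetical names (the real utility works with BoTorch models and supports incremental sampling):

```python
import numpy as np

def sample_gp_prior(xs, lengthscale=0.5, jitter=1e-8, seed=0):
    """Draw one function sample from a zero-mean GP prior with an
    RBF kernel (sketch of the idea behind `GPDraw`)."""
    xs = np.asarray(xs)
    sq_dists = (xs[:, None] - xs[None, :]) ** 2
    K = np.exp(-0.5 * sq_dists / lengthscale ** 2)
    K += jitter * np.eye(len(xs))  # numerical stabilization
    L = np.linalg.cholesky(K)
    rng = np.random.default_rng(seed)
    return L @ rng.standard_normal(len(xs))

f = sample_gp_prior(np.linspace(0.0, 1.0, 50))
print(f.shape)  # (50,)
```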

Bug Fixes
* Fix matrix multiplication in `scalarize_posterior` (638).
* Set `X_pending` in `get_acquisition_function` in `qEHVI` (662).
* Make contextual kernel device-aware (666).
* Do not use an `MCSampler` in `MaxPosteriorSampling` (701).
* Add ability to subset outcome transforms (711).

Performance Improvements
* Batchify box decomposition for 2d case (642).

Other Changes
* Use scipy distribution in MES quantile bisect (633).
* Use new closure definition for GPyTorch priors (634).
* Allow enabling of approximate root decomposition in `posterior` calls (652).
* Support for upcoming 21201-dimensional PyTorch `SobolEngine` (672, 674).
* Refactored various MOO utilities to allow future additions (656, 657, 658, 661).
* Support input_transform in PairwiseGP (632).
* Output shape checks for t_batch_mode_transform (577).
* Check for NaN in `gen_candidates_scipy` (688).
* Introduce `base_sample_shape` property to `Posterior` objects (718).

0.3.3

Contextual Bayesian Optimization, Input Warping, TuRBO, sampling from polytopes.

Compatibility
* Require PyTorch >=1.7 (614).
* Require GPyTorch >=1.3 (614).

New Features
* Models (LCE-A, LCE-M and SAC) for Contextual Bayesian Optimization (581).
  * Implements the core models from [High-Dimensional Contextual Policy Search with Unknown Context Rewards using Bayesian Optimization](https://proceedings.neurips.cc/paper/2020/hash/faff959d885ec0ecf70741a846c34d1d-Abstract.html). Q. Feng, B. Letham, H. Mao, E. Bakshy. NeurIPS 2020.
  * See Ax for usage of these models.
* Hit-and-run sampler for uniform sampling from a polytope (592).
* Input warping:
  * Core functionality (607).
  * Kumaraswamy Distribution (606).
  * Tutorial (8f34871652042219c57b799669a679aab5eed7e3).
* TuRBO-1 tutorial (598).
  * Implements the method from [Scalable Global Optimization via Local Bayesian Optimization](https://proceedings.neurips.cc/paper/2019/file/6c990b7aca7bc7058f5e98ea909e924b-Paper.pdf). D. Eriksson, M. Pearce, J. Gardner, R. D. Turner, M. Poloczek. NeurIPS 2019.
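The hit-and-run sampler added above walks through a polytope by picking a random direction, computing the feasible chord along it, and jumping to a uniform point on that chord. A minimal pure-Python sketch for polytopes of the form `A @ x <= b` (all names hypothetical; the library version handles equality constraints and is tensor-based):

```python
import random

def hit_and_run(A, b, x0, n_samples, seed=0):
    """Hit-and-run sketch for approximately uniform sampling
    from the polytope {x : A @ x <= b}, starting from interior x0."""
    rng = random.Random(seed)
    x = list(x0)
    samples = []
    for _ in range(n_samples):
        d = [rng.gauss(0.0, 1.0) for _ in x]  # random direction
        lo, hi = -1e9, 1e9
        for row, bi in zip(A, b):
            ad = sum(r * di for r, di in zip(row, d))
            ax = sum(r * xi for r, xi in zip(row, x))
            if abs(ad) < 1e-12:
                continue
            t = (bi - ax) / ad  # step length hitting this face
            if ad > 0:
                hi = min(hi, t)
            else:
                lo = max(lo, t)
        t = rng.uniform(lo, hi)  # uniform point on the feasible chord
        x = [xi + t * di for xi, di in zip(x, d)]
        samples.append(list(x))
    return samples

# Unit box [0, 1]^2 written as A @ x <= b:
A = [(1, 0), (-1, 0), (0, 1), (0, -1)]
b = [1, 0, 1, 0]
pts = hit_and_run(A, b, [0.5, 0.5], 100)
assert all(-1e-9 <= u <= 1 + 1e-9 and -1e-9 <= v <= 1 + 1e-9 for u, v in pts)
```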

Bug fixes
* Fix bounds of `HolderTable` synthetic function (596).
* Fix `device` issue in MOO tutorial (621).

Other changes
* Add `train_inputs` option to `qMaxValueEntropy` (593).
* Enable gpytorch settings to override BoTorch defaults for `fast_pred_var` and `debug` (595).
* Rename `set_train_data_transform` -> `preprocess_transform` (575).
* Modify `_expand_bounds()` shape checks to work with >2-dim bounds (604).
* Add `batch_shape` property to models (588).
* Modify `qMultiFidelityKnowledgeGradient.evaluate()` to work with `project`, `expand` and `cost_aware_utility` (594).
* Add list of papers using BoTorch to website docs (617).

0.3.2

Maintenance Release

New Features
* Add `PenalizedAcquisitionFunction` wrapper (585)
* Input transforms
  * Reversible input transform (550)
  * Rounding input transform (562)
  * Log input transform (563)
  * Differentiable approximate rounding for integers (561)
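Differentiable approximate rounding, as in the transform above, is commonly done by pushing the fractional part through a tempered sigmoid: as the temperature goes to zero the map approaches exact rounding, while larger temperatures keep gradients nonzero. A sketch with hypothetical names (not the BoTorch implementation):

```python
import math

def soft_round(x, tau=0.1):
    """Differentiable approximate rounding via a tempered sigmoid;
    as tau -> 0 this approaches exact rounding."""
    floor = math.floor(x)
    frac = x - floor
    return floor + 1.0 / (1.0 + math.exp(-(frac - 0.5) / tau))

print(round(soft_round(2.7, tau=0.05), 3))  # 2.982, close to round(2.7) = 3
```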

Bug fixes
* Fix sign error in UCB when `maximize=False` (a4bfacbfb2109d3b89107d171d2101e1995822bb)
* Fix batch_range sample shape logic (574)
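The UCB sign convention touched by the fix above can be sketched as follows: with `maximize=False` the posterior mean enters with a flipped sign so that the exploration term still increases the acquisition value (a simplified scalar sketch, not the BoTorch implementation):

```python
import math

def upper_confidence_bound(mean, sigma, beta=4.0, maximize=True):
    """UCB acquisition value (sketch). The exploration bonus
    sqrt(beta) * sigma is added in both directions of optimization."""
    delta = math.sqrt(beta) * sigma
    return mean + delta if maximize else -mean + delta

# Minimization: a lower predicted mean yields a higher acquisition value
low = upper_confidence_bound(mean=-1.0, sigma=0.5, maximize=False)
high = upper_confidence_bound(mean=1.0, sigma=0.5, maximize=False)
print(low > high)  # True
```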

Other changes
* Better support for two-stage sampling in preference learning (0cd13d0cb49b1ac8d0971e42f1f0e9dd6126fd9a)
* Remove noise term in `PairwiseGP` and add `ScaleKernel` by default (571)
* Rename `prior` to `task_covar_prior` in `MultiTaskGP` and `FixedNoiseMultiTaskGP` (8e42ea82856b165a7df9db2a9b6f43ebd7328fc4)
* Support only transforming inputs on training or evaluation (551)
* Add `equals` method for `InputTransform` (552)
