Nevergrad

Latest version: v1.0.2


0.5.0

Breaking changes

- The `copy()` method of a `Parameter` no longer changes the parameter's random state (it used to reset it to `None`) [1048](https://github.com/facebookresearch/nevergrad/pull/1048).
- `MultiobjectiveFunction` does not exist anymore [1034](https://github.com/facebookresearch/nevergrad/pull/1034).
- `Choice` and `TransitionChoice` have had part of their API changed for consistency. In particular, `indices` is now an
`ng.p.Array` (and not an `np.ndarray`) which contains the selected indices (or index) of the `Choice`. The sampling is
performed by specific "layers" applied to `Data` parameters [1065](https://github.com/facebookresearch/nevergrad/pull/1065).
- `Parameter.set_standardized_data` no longer takes a `deterministic` argument
[1068](https://github.com/facebookresearch/nevergrad/pull/1068). It is replaced by the more
general `with ng.p.helpers.deterministic_sampling(parameter)` context. One-shot algorithms are also updated to choose
the options of `Choice` parameters deterministically, since this is a simpler behavior to expect than sampling the
standardized space and, from there, sampling the option stochastically.
- `RandomSearch` now defaults to sampling values using `parameter.sample()` instead of a Gaussian
[1068](https://github.com/facebookresearch/nevergrad/pull/1068). The only difference arises with bounded
variables, since in this case `parameter.sample()` samples uniformly (unless otherwise specified).
The previous behavior can be obtained with `RandomSearchMaker(sampler="gaussian")`.
- `PSO` API has been slightly changed [1073](https://github.com/facebookresearch/nevergrad/pull/1073).
- The `descriptors` attribute of `Parameter` instances is deprecated, in favor of a combination of an analysis function
(`ng.p.helpers.analyze`) returning information about the parameter (e.g. whether it is continuous, deterministic, etc.)
and a new `function` attribute which can be used to provide information about the function (e.g. whether it is deterministic)
[1076](https://github.com/facebookresearch/nevergrad/pull/1076).
- Half the budget allotted to solving cheap constraints is now used by a sub-optimizer
[1047](https://github.com/facebookresearch/nevergrad/pull/1047). More changes to constraint management will land
in the near future.
- Experimental methods `Array.set_recombination` and `Array.set_mutation(custom=.)` are removed in favor of
layers changing `Array` behaviors [1086](https://github.com/facebookresearch/nevergrad/pull/1086).
Caution: this is still very experimental (and undocumented).
- Important bug fix on the shape of bounds when specified as a tuple or list instead of an `np.ndarray`
[1221](https://github.com/facebookresearch/nevergrad/pull/1221).

Important changes

- `master` branch has been renamed to `main`. See [1230](https://github.com/facebookresearch/nevergrad/pull/1230) for more context.
- `Parameter` classes are undergoing heavy changes, please open an issue if you encounter any problem.
The midterm aim is to allow for simpler constraint management.
- `Parameter` classes have undergone heavy changes to ease the handling of their tree structure
([1029](https://github.com/facebookresearch/nevergrad/pull/1029),
[1036](https://github.com/facebookresearch/nevergrad/pull/1036),
[1038](https://github.com/facebookresearch/nevergrad/pull/1038),
[1043](https://github.com/facebookresearch/nevergrad/pull/1043),
[1044](https://github.com/facebookresearch/nevergrad/pull/1044)).
- `Parameter` classes now have a layer structure [1045](https://github.com/facebookresearch/nevergrad/pull/1045)
which simplifies changing their behavior. In future PRs this system will take charge of bounds, other constraints,
sampling, etc.
- The layer structure allows disentangling bounds and log-distribution. This goal was reached with
[1053](https://github.com/facebookresearch/nevergrad/pull/1053) but may create some instabilities. In particular,
the representation (`__repr__`) of `Array` has changed, and their `bounds` attribute is no longer reliable for now.
This change will eventually lead to a new syntax for setting bounds and distributions, but it is not ready yet.
- `DE` initial sampling has been updated to take bounds into account [1058](https://github.com/facebookresearch/nevergrad/pull/1058).
- `Array` can now take `lower` and `upper` bounds as initialization arguments. If no `init` is provided and both
bounds are given, the array is initialized to their average. In this case, sampling is uniform between the bounds.
- Bayesian optimizers now properly use the bounds for bounded problems, which may improve performance
[1222](https://github.com/facebookresearch/nevergrad/pull/1222).


Other changes

- The new `nevergrad.errors` module gathers errors and warnings used throughout the package (WIP) [1031](https://github.com/facebookresearch/nevergrad/pull/1031).
- `EvolutionStrategy` now defaults to NSGA2 selection in the multiobjective case.
- A new experimental callback adds an early stopping mechanism
[1054](https://github.com/facebookresearch/nevergrad/pull/1054).
- `Choice`-like parameters now accept an integer as input instead of a list, as a shortcut for `range(num)`
[1106](https://github.com/facebookresearch/nevergrad/pull/1106).
- An interface with [Pymoo](https://pymoo.org/) optimizers has been added
[1197](https://github.com/facebookresearch/nevergrad/pull/1197).
- An interface with [BayesOptim](https://github.com/wangronin/Bayesian-Optimization) optimizers has been added
[1179](https://github.com/facebookresearch/nevergrad/pull/1179).
- Fix for abnormally slow iterations for large budgets using CMA in a portfolio
[1350](https://github.com/facebookresearch/nevergrad/pull/1350).
- A new `enable_pickling` option was added to optimizers. It is only necessary for some of them (among which `scipy`-based optimizers), and comes at the cost of additional memory usage
[1356](https://github.com/facebookresearch/nevergrad/pull/1356)
[1358](https://github.com/facebookresearch/nevergrad/pull/1358).

0.4.3

Important changes

- `tell` method can now receive a list/array of losses for multi-objective optimization [775](https://github.com/facebookresearch/nevergrad/pull/775). For now it is neither robust, nor scalable, nor stable, nor optimal so be careful when using it. More information in the [documentation](https://facebookresearch.github.io/nevergrad/optimization.html#multiobjective-minimization-with-nevergrad).
- The old way to perform multiobjective optimization, through the use of `MultiobjectiveFunction`, is now deprecated and will be removed after version 0.4.3 [1017](https://github.com/facebookresearch/nevergrad/pull/1017).
- By default, the optimizer now returns the best set of parameters as recommendation [951](https://github.com/facebookresearch/nevergrad/pull/951), considering that the function is deterministic. The previous behavior used an estimation of noise to provide the pessimistic best point, leading to unexpected behaviors [947](https://github.com/facebookresearch/nevergrad/pull/947). You can go back to this behavior by specifying `parametrization.descriptors.deterministic_function = False`.

Other

- `DE` and its variants have been updated to make full use of the multi-objective losses [789](https://github.com/facebookresearch/nevergrad/pull/789). Other optimizers convert multiobjective problems to a volume minimization, which is not always as efficient.
- As an **experimental** feature, we have added some preliminary support for constraint management through penalties.
From now on, the preferred option for penalties is to register a function returning a positive float when the constraint is satisfied.
While we will wait for more testing before documenting it, this may already cause instabilities and errors when adding cheap constraints.
Please open an issue if you encounter a problem.
- `tell` argument `value` is renamed to `loss` for clarification [774](https://github.com/facebookresearch/nevergrad/pull/774). This can be breaking when using named arguments!
- `ExperimentFunction` now automatically records the arguments used for instantiation, so that they can both be used to create a new copy and serve as descriptors if they are of type int/bool/float/str [914](https://github.com/facebookresearch/nevergrad/pull/914).
- from now on, code formatting needs to be [`black`](https://black.readthedocs.io/en/stable/) compliant. This is
simply performed by running `black nevergrad`. A continuous integration checks that PRs are compliant, and the
precommit hooks have been adapted. For PRs branching from an old master, you can run `black --line-length=110 nevergrad/<path_to_modified_file>` to make your code easier to merge.
- Pruning has been patched to make sure it is not activated too often upon convergence [1014](https://github.com/facebookresearch/nevergrad/pull/1014). The bug used to lead to significant slowdowns when nearing convergence.

0.4.2

- `recommend` now provides an evaluated candidate when possible. For non-deterministic parametrization like `Choice`, this means we won't resample, and we will actually recommend the best past evaluated candidate [668](https://github.com/facebookresearch/nevergrad/pull/668). Still, some optimizers (like `TBPSA`) may recommend a non-evaluated point.
- `Choice` and `TransitionChoice` can now take a `repetitions` parameter for sampling several times. It is equivalent to `Tuple(*[Choice(options) for _ in range(repetitions)])` but can be up to 30x faster for large numbers of repetitions [670](https://github.com/facebookresearch/nevergrad/pull/670) [696](https://github.com/facebookresearch/nevergrad/pull/696).
- The default bound method for `Array` is now `bouncing`, a variant of `clipping` that avoids over-sampling the bounds [684](https://github.com/facebookresearch/nevergrad/pull/684) and [691](https://github.com/facebookresearch/nevergrad/pull/691).

This version should be robust. Following versions may become more unstable as we will add more native multiobjective optimization as an **experimental** feature. We also are in the process of simplifying the naming pattern for the "NGO/Shiwa" type optimizers which may cause some changes in the future.

0.4.1

- `Archive` now stores the best corresponding candidate. This requires twice the memory compared to before the change. [594](https://github.com/facebookresearch/nevergrad/pull/594)
- `Parameter` now holds a `loss: Optional[float]` attribute which is set and used by optimizers after the `tell` method.
- Quasi-random samplers (`LHSSearch`, `HammersleySearch`, `HaltonSearch`, etc.) now sample in the full range of bounded variables when `full_range_sampling` is `True` [598](https://github.com/facebookresearch/nevergrad/pull/598). This required some ugly hacks; help is most welcome to find nicer solutions.
- `full_range_sampling` is activated by default if both bounds are provided in `Array.set_bounds`.
- Propagate parametrization system features (generation tracking, ...) to `OnePlusOne` based algorithms [599](https://github.com/facebookresearch/nevergrad/pull/599).
- Moved the `Selector` dataframe overlay so that basic requirements do not include `pandas` (only necessary for benchmarks) [609](https://github.com/facebookresearch/nevergrad/pull/609)
- Changed the version name pattern (removed the `v`) to unify with `pypi` versions. Expect more frequent intermediary versions to be pushed (deployment has now been made pseudo-automatic).
- Started implementing more ML-oriented testbeds [642](https://github.com/facebookresearch/nevergrad/pull/642)

0.4.0

Breaking and important changes

- Removed all deprecated code [499](https://github.com/facebookresearch/nevergrad/pull/499). That includes:
  - `instrumentation` as init parameter of an `Optimizer` (replaced by `parametrization`)
  - `instrumentation` as attribute of an `Optimizer` (replaced by `parametrization`)
  - `candidate_maker` (not needed anymore)
  - `optimize` methods of `Optimizer` (renamed to `minimize`)
  - the whole `instrumentation` subpackage (replaced by `parametrization`) and its legacy methods (`set_cheap_constraint_checker`, etc.)
- Removed `ParametrizedOptimizer` and `OptimizerFamily` in favor of `ConfiguredOptimizer` with simpler usage [518](https://github.com/facebookresearch/nevergrad/pull/518) [#521](https://github.com/facebookresearch/nevergrad/pull/521).
- Some variants of algorithms have been removed from the `ng.optimizers` namespace to simplify it. All such variants can be easily created
using the corresponding `ConfiguredOptimizer`. Also, adding `import nevergrad.optimization.experimentalvariants` will populate `ng.optimizers.registry`
with all variants, and they are all available for benchmarks [528](https://github.com/facebookresearch/nevergrad/pull/528).
- Renamed `a_min` and `a_max` in `Array`, `Scalar` and `Log` parameters for clarity.
Using old names will raise a deprecation warning for the time being.
- `archive` is pruned much more often (e.g. for `num_workers=1`, it is usually pruned to 100 elements when reaching 1000),
so you should not rely on it for storing all results; use a callback instead [571](https://github.com/facebookresearch/nevergrad/pull/571).
If this is a problem for you, let us know why and we'll find a solution!

Other changes

- Propagate parametrization system features (generation tracking, ...) to `TBPSA`, `PSO` and `EDA` based algorithms.
- Rewrote multiobjective core system [484](https://github.com/facebookresearch/nevergrad/pull/484).
- Activated Windows CI (still a bit flaky, with a few deactivated tests).
- Better callbacks in `ng.callbacks`, including exporting to [`hiplot`](https://github.com/facebookresearch/hiplot).
- Activated [documentation](https://facebookresearch.github.io/nevergrad/) on github pages.
- `Scalar` now takes optional `lower` and `upper` bounds at initialization, and `sigma` (and optionally `init`)
is automatically set to a sensible default [536](https://github.com/facebookresearch/nevergrad/pull/536).

0.3.2

Breaking changes (possibly for next version)

- The first argument of optimizers is renamed to `parametrization` instead of `instrumentation` for consistency [497](https://github.com/facebookresearch/nevergrad/pull/497). There is currently a deprecation warning, but this will be breaking in v0.4.0.
- Old `instrumentation` classes now raise deprecation warnings and will disappear in versions after 0.3.2.
Hence, prefer parameters from `ng.p` over `ng.var`, and avoid using `ng.Instrumentation` altogether if
you don't need it anymore (or import it through `ng.p.Instrumentation`).
- `CandidateMaker` (`optimizer.create_candidate`) raises `DeprecationWarning`s, since new candidates/parameters
can now be created straightforwardly (`parameter.spawn_child(new_value=new_value)`).
- The `Candidate` class is removed and completely replaced by `Parameter` [459](https://github.com/facebookresearch/nevergrad/pull/459).
This should not break existing code, since `Parameter` can be used straightforwardly as a `Candidate`.

Other changes

- The new parametrization is now as efficient as in v0.3.0 (see the CHANGELOG for v0.3.1 for context).
- Optimizers can now hold any parametrization, not just `Instrumentation`. This means, for instance, that when you
do `OptimizerClass(instrumentation=12, budget=100)`, the parametrization (and therefore the candidates) will be of class
`ng.p.Array` (and not `ng.p.Instrumentation`), and their `value` attribute will be the corresponding `np.ndarray` value.
You can still use `args` and `kwargs` if you want, but it's no longer needed!
- Added *experimental* evolution-strategy-like algorithms using new parametrization [471](https://github.com/facebookresearch/nevergrad/pull/471)
(the behavior and API of these optimizers will probably evolve in the near future).
- `DE` algorithms comply with the new parametrization system and can be set to use parameter's recombination.
- Fixed the handling of arrays as bounds in `Array` parameters.
