Keanu

Latest version: v0.0.26.dev1

0.0.20

Common
* Saving a network as a DOT file includes labels on constant vertices.

Python
* Improved performance of getting samples by using byte streams.
* Added Python docstrings for sampling.

0.0.19

* Added `get_vertex_by_label` to `BayesNet`
* Added an optional `label` parameter for all vertices in Python (e.g. `Gaussian(0., 1., label="gaussian")`). You must now label any vertices you sample from; otherwise Keanu throws an exception.
* Improved the structure of the Python sample dictionary
* Scalar networks
* If all sampled vertices are scalar then you are returned a dictionary keyed on vertex label, with the values containing a list of primitive values.
* An example of a network of size 2 with three samples: `{"gaussian": [0., 1., 2.], "exponential": [0.5, 1.5, 3.5]}`
* `samples["gaussian"] = [0., 1., 2.]`
* Tensor networks
* If any of the sampled vertices are tensors then you are returned a dictionary keyed on a tuple of vertex label and tensor index.
* This makes it much easier to extract all the samples for a given index and for creating a multi-index dataframe
* An example of a scalar exponential and a `2x2` tensor gaussian with three samples:

```
   exp       gaussian
   (0)       (0, 0)    (0, 1)    (1, 0)    (1, 1)
0  4.231334  5.017627  5.734130  3.904472  9.909033
1  4.231334  5.017627  5.734130  3.904472  9.909033
2  4.676046  4.308018  5.035550  6.240894  10.112683
```

* `samples[("gaussian", (0, 1))] = [5.734130, 5.734130, 5.035550]`
* As a result, whenever keying into a sample dictionary, you are guaranteed to receive a list of primitives.
* This greatly reduces the complexity of the `autocorrelation` and `traceplot` APIs, as they now simply expect a list of values.
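
The two dictionary shapes described above can be sketched with plain Python literals (the values are taken from the examples in these notes and are purely illustrative, not real Keanu output):

```python
# Scalar network: every sampled vertex is scalar, so the dictionary is
# keyed on the vertex label and each value is a flat list of primitives.
scalar_samples = {
    "gaussian": [0.0, 1.0, 2.0],
    "exponential": [0.5, 1.5, 3.5],
}

# Tensor network: at least one sampled vertex is a tensor, so keys are
# (label, tensor_index) tuples and values are still flat lists.
tensor_samples = {
    ("exp", (0,)): [4.231334, 4.231334, 4.676046],
    ("gaussian", (0, 0)): [5.017627, 5.017627, 4.308018],
    ("gaussian", (0, 1)): [5.734130, 5.734130, 5.035550],
}

# Either way, keying into the dictionary yields a list of primitives,
# which is what the autocorrelation and traceplot APIs expect.
trace = tensor_samples[("gaussian", (0, 1))]
```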
* Added Adam optimizer
* `GradientOptimizer` and `NonGradientOptimizer` now take an algorithm argument, which defaults to Conjugate Gradient and BOBYQA respectively.
* `GradientOptimizer` and `NonGradientOptimizer` return an `OptimizedResult` object instead of just the optimized fitness as a double.
* Reorganised the factory methods for building `PosteriorSamplingAlgorithm` objects. The following are available and give you access to either a default implementation or, when you need more control over the configuration, a Builder object:
* `Keanu.Sampling.MetropolisHastings`
* `Keanu.Sampling.NUTS`
* `Keanu.Sampling.SimulatedAnnealing`
* `Keanu.Sampling.MCMC` (to automatically choose between `MetropolisHastings` and `NUTS`)
* The `PosteriorSamplingAlgorithm` objects are now immutable: you cannot set their values after construction.
* When you choose a custom configuration of `MetropolisHastings` or `SimulatedAnnealing`, you can specify:
* the proposal distribution. Options are `PriorProposalDistribution` and `GaussianProposalDistribution`
* a proposal rejection strategy - options are `RollBackToCachedValuesOnRejection` and `RollbackAndCascadeOnRejection`
* (in Python, this is done for you)
* `Hamiltonian` Monte Carlo has been removed: use `NUTS` instead which is an auto-tuning implementation of Hamiltonian Monte Carlo.
* The `PosteriorSamplingAlgorithm` and `FitnessFunction` interfaces now require a `ProbabilisticModel` as their argument instead of a `BayesianNetwork`
* You can create a `ProbabilisticModel` from a `BayesianNetwork`:
* For `MetropolisHastings`: `new KeanuProbabilisticModel(bayesNet)`
* For `NUTS`: `new KeanuProbabilisticModelWithGradient(bayesNet)`
* The `ProposalDistribution` interface uses `Variable` instead of `Vertex`
* A `Variable` is an abstraction that does not assume any graph-like structure of your model.
* `KeanuRandom` has been moved to package `io.improbable.keanu`
* The `ProposalListener` interface has been changed: `onProposalApplied` is now called `onProposalCreated`.
* This is because the `Proposal` interface no longer has `apply` and `reject` methods.
* Reversed the ordering of namespaces for `VertexLabel` (e.g. from `new VertexLabel("keanu", "improbable", "io")` to `new VertexLabel("io", "improbable", "keanu")`)
* Added better Python operator overloading support.
- Includes automatically casting to the correct datatype where possible.
* Added Geometric Distribution vertex
* Fixed issue with certain vertices not taking a list properly in Kotlin
* Added `__version__` attribute to Python
* Added a permute vertex
* Added the MIRSaver/MIRLoader + the proposed MIR proto format
* Added the release notes text file to the repo
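
The operator-overloading and automatic-casting behaviour noted above can be illustrated with a toy class (a hypothetical stand-in, not Keanu's actual vertex implementation):

```python
class ConstantVertex:
    """Toy stand-in for a constant vertex holding a float value."""

    def __init__(self, value):
        self.value = float(value)  # cast plain numbers to the expected dtype

    def __add__(self, other):
        # Automatically wrap raw numbers so `vertex + 2` works.
        if not isinstance(other, ConstantVertex):
            other = ConstantVertex(other)
        return ConstantVertex(self.value + other.value)

    __radd__ = __add__  # so `2 + vertex` also works


v = ConstantVertex(1)
result = 2 + v  # the raw int 2 is cast to a ConstantVertex automatically
```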

0.0.18

- Vertices
-- `AssertVertex` for validating vertex values as part of the graph
-- `PrintVertex` for logging vertex values as they propagate through the graph
-- `IntegerIfVertex`
-- `IntegerCPTVertex`
-- There is now consistent naming of `Boolean` across the codebase (no more `Bool`).

- Debugging
-- `Triangular` and `Beta` now correctly perform sample bound checks
-- You can search vertex labels by namespace inside a `BayesianNetwork`

- Python
-- Type casting vertices
-- NUTS parameters are now settable (Tuning of step size, max tree height, target acceptance prob)
-- Save/Load APIs now available from Python
-- Metropolis Hastings "use cache on reject" now settable in Python
-- You should now be able to create more vertex types (in particular, those that take arguments which are not vertices)

- Miscellaneous Improvements
-- Upgraded Gradle

- Non-Backwards compatible API changes:
-- Any type/vertex/operation involving the word `Bool` has been renamed to use `Boolean`
-- The Python `sample()` API has changed. Previously the algorithm was specified as a string via the `algo` parameter to `sample()`; the function now takes an instance of a `PosteriorSamplingAlgorithm` object (either `MetropolisHastingsSampler`, `HamiltonianSampler` or `NUTSSampler`) as the `sampling_algorithm` parameter. See test_sampling.py for more details.

0.0.17

Save/Load
* You can now save and load models in JSON format (in addition to Protobuf and DOT)
* Tensor shape information is included when you save a model to Protobuf or JSON format
* You can save metadata (arbitrary strings) in Protobuf, JSON or DOT format

Metrics and Debugging:
* Graph metrics: vertex count, average vertex degree
* Calculate the autocorrelation of your samples
* Real-time plotting of sample values in Python
* Track proposal acceptance rate in Metropolis Hastings
* Java trace logging (SLF4J) of Metropolis Hastings and Simulated Annealing
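
Autocorrelation of samples, mentioned above, takes a flat list of sample values. A minimal pure-Python sketch of the standard estimator (not Keanu's actual implementation):

```python
def autocorrelation(samples, max_lag=None):
    """Normalised autocorrelation of a 1-D list of sample values."""
    n = len(samples)
    if max_lag is None:
        max_lag = n - 1
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples)
    acf = []
    for lag in range(max_lag + 1):
        cov = sum((samples[i] - mean) * (samples[i + lag] - mean)
                  for i in range(n - lag))
        acf.append(cov / var)
    return acf


# Lag-0 autocorrelation is always 1 by construction.
acf = autocorrelation([1.0, 2.0, 3.0, 4.0])
```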

Algorithms
* Hierarchical Linear Regression
* More bug fixes for the NUTS algorithm

Python API:
* Support for the Lambda Model vertex: you can now use any black-box method or process as a vertex in your Bayes Net

0.0.16

- Tensors
-- Allow creation of rank 0 and rank 1 tensors

- MCMC
-- Fix numerical instability in NUTS
-- Improvements to NUTS
-- Unified MCMC algorithm to choose the most appropriate algorithm for you based on your `BayesianNetwork`

- Python
-- Python annotations
-- Casting of parameters to their expected types
-- Streamable samples
-- Vertices and tensors are compatible with pandas `DataFrame`s and `Series`, and with numpy arrays

- Higher Level Modelling APIs
-- You can now `fit` your model via sampling
-- Higher-level modelling APIs now take data of shape `(n_samples, n_features)`

- Debugging Tools
-- Graph metrics on `BayesianNetwork`

- Save / Load
-- Save a graph and its values to Google Protobuf
-- Load a graph from a saved Google Protobuf file
-- Export a graph to a DOT file

0.0.15

- Higher level APIs
-- Linear, Logistic, Lasso and Ridge regression builders
-- Refactor the `ParticleFilter`

- Python
-- Operator overloading (also supports numpy arrays)
-- Gradient & Non-Gradient Optimizer
-- Now builds as part of the CI pipeline

- Docs
-- Docs are now live on GitHub Pages: https://improbable-research.github.io/keanu/

- Bugs
-- Fixed an issue where dropping zero samples would throw a divide-by-zero exception
