openTSNE

Latest version: v1.0.1


1.0.1

Changes

- setup.py maintenance (249)
- drop Python 3.6 support (249)
- correctly implement dof parameter in exact BH implementation (246)

1.0.0

Given the longtime stability of openTSNE, it is only fitting that we release a v1.0.0.

Changes

- Various documentation fixes involving initialization, momentum, and learning rate (243)
- Include Python 3.11 in the test and build matrix
- Uniform affinity kernel now supports `mean` and `max` modes (242)
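The two modes can be illustrated with a small standalone sketch. Here `mean` and `max` are interpreted as two ways of symmetrizing a directed kNN affinity matrix (averaging with the transpose vs. taking the element-wise maximum); this is a hypothetical rendition for illustration, not openTSNE's internal code, and the `symmetrize` helper name is made up:

```python
import numpy as np

def symmetrize(P, mode="mean"):
    """Hypothetical sketch of the two symmetrization modes.

    mode="mean" averages P with its transpose; mode="max" keeps the
    larger of the two directed affinities for each pair of points.
    """
    if mode == "mean":
        return (P + P.T) / 2
    elif mode == "max":
        return np.maximum(P, P.T)
    raise ValueError(f"unknown mode: {mode!r}")

# A tiny directed (asymmetric) affinity matrix over 3 points:
# each point assigns weight 0.5 to exactly one neighbor.
P = np.array([
    [0.0, 0.5, 0.0],
    [0.0, 0.0, 0.5],
    [0.5, 0.0, 0.0],
])

P_mean = symmetrize(P, mode="mean")  # each linked pair gets 0.25
P_max = symmetrize(P, mode="max")    # each linked pair keeps 0.5
```

With `mean`, one-sided neighbor relationships are diluted; with `max`, a pair stays as strongly connected as its stronger direction.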

0.7.1

Bug Fixes

- (urgent) Fix memory error on data with duplicated rows (236)

0.7.0

Changes
- By default, we now add jitter to non-random initialization schemes. This has almost no effect on the resulting visualizations, but helps avoid potential problems when points are initialized at identical positions (225)
- By default, the learning rate is now calculated as `N/exaggeration`. This speeds up convergence of the resulting embedding. Note that the learning rate during the EE phase will differ from the learning rate during the standard phase. Additionally, we set `momentum=0.8` in both phases. Before, it was 0.5 during EE and 0.8 during the standard phase. This, again, speeds up convergence. (220)
- Add `PrecomputedAffinities` to wrap square affinity matrices (217)
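The new learning-rate rule described above can be sketched as a small helper. This is a standalone illustration of the `N/exaggeration` formula, not openTSNE's actual code, and the function name is made up:

```python
def default_learning_rate(n_samples, exaggeration=None):
    """Sketch of the rule described above: learning rate = N / exaggeration.

    With no exaggeration (the standard phase), this reduces to N, so the
    early-exaggeration phase runs with a smaller learning rate than the
    standard phase -- which is why the two phases now differ.
    """
    if exaggeration is None:
        exaggeration = 1
    return n_samples / exaggeration

# With N=10,000 points and an early-exaggeration factor of 12:
eta_ee = default_learning_rate(10_000, exaggeration=12)  # ~833.3 during EE
eta_standard = default_learning_rate(10_000)             # 10000.0 afterwards
```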

Build changes
- Build `universal2` macOS wheels, enabling ARM support (226)

Bug Fixes
- Fix BH collapse for smaller data sets (235)
- Fix `updates` in optimizer not being stored correctly between optimization calls (229)
- Fix `inplace=True` optimization changing the initializations themselves in some rare use-cases (225)

As usual, a special thanks to dkobak for helping with practically all of these bugs/changes.

0.6.2

Changes
- By default, we now use the `MultiscaleMixture` affinity model, enabling us to pass in a list of perplexities instead of a single perplexity value. This is fully backwards compatible.
- Previously, perplexity values were adjusted according to the dataset size. E.g., passing `perplexity=100` with N=150 would set `TSNE.perplexity` to 50. Now this value is kept as-is, and the corrected value is stored in a new `effective_perplexity_` attribute (following the convention from scikit-learn).
- Fix bug where interpolation grid was being prepared even when using BH optimization during transform.
- Enable calling `.transform` with precomputed distances. In this case, the data matrix will be assumed to be a distance matrix.
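The perplexity correction above can be sketched as follows. The 100 → 50 example with N=150 suggests perplexity is clipped to roughly N/3; openTSNE's exact formula may differ slightly (e.g. it may use N − 1), and the helper name here is made up:

```python
def effective_perplexity(perplexity, n_samples):
    """Hypothetical sketch of the perplexity correction described above.

    Perplexity is clipped to roughly n_samples / 3 when the dataset is
    too small to support the requested value. Since 0.6.2,
    `TSNE.perplexity` keeps the requested value and the corrected one
    is exposed separately as `effective_perplexity_`.
    """
    return min(perplexity, n_samples / 3)

effective_perplexity(100, 150)  # 50.0, the example from the changelog
effective_perplexity(30, 150)   # 30, small enough to be left unchanged
```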

Build changes
- Build with `oldest-supported-numpy`
- Build linux wheels on `manylinux2014` instead of `manylinux2010`, following numpy's example
- Build macOS wheels on the `macOS-10.15` Azure VM instead of `macos-10.14`
- Fix a potential problem with clang-13, which, unlike earlier versions, actually performs optimizations involving infinities when the `-ffast-math` flag is set

0.6.0

Changes
- Remove `affinites` from `TSNE` construction and allow custom affinities and initialization in the `.fit` method. This improves the API when dealing with non-tabular data. This change is not backwards compatible.
- Add `metric="precomputed"`. This includes the addition of `openTSNE.nearest_neighbors.PrecomputedDistanceMatrix` and `openTSNE.nearest_neighbors.PrecomputedNeighbors`.
- Add `knn_index` parameter to `openTSNE.affinity` classes.
- Add (less-than-ideal) workaround for pickling Annoy objects.
- Extend the range of recommended FFTW boxes up to 1000.
- Remove deprecated `openTSNE.nearest_neighbors.BallTree`.
- Remove deprecated `openTSNE.callbacks.ErrorLogger`.
- Remove deprecated `TSNE.neighbors_method` property.
- Add and set as default `negative_gradient_method="auto"`.
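A precomputed distance matrix of the kind `metric="precomputed"` expects can be built with plain NumPy. The sketch below shows the required shape of the input (square, symmetric, zero diagonal); the openTSNE call at the end is a hedged usage sketch, so consult the openTSNE documentation for the exact calling convention:

```python
import numpy as np

# Build a full pairwise Euclidean distance matrix for a small dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

# A valid precomputed distance matrix is square, symmetric, and has a
# zero diagonal.
assert D.shape == (100, 100)

# Hypothetical usage sketch:
# from openTSNE import TSNE
# embedding = TSNE(metric="precomputed").fit(D)
```

Note that a full N×N matrix costs quadratic memory, so this is only practical for moderately sized datasets.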

