pytorch-metric-learning

Latest version: v2.5.0


1.7.1

Features and bug fixes

- Added SumReducer (see the usage sketch after this list)
- Fixed bug where labels were always required for DistributedLossWrapper
- Removed torchvision as a dependency
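
The new SumReducer follows the library's usual reducer interface: it sums the element-wise (e.g. per-pair) losses instead of averaging them. A minimal usage sketch, assuming the standard `reducer` keyword accepted by the losses; the tensor shapes and ContrastiveLoss choice are just for illustration:

```python
import torch
from pytorch_metric_learning import losses
from pytorch_metric_learning.reducers import SumReducer

# Any loss that accepts a reducer can use SumReducer; ContrastiveLoss is one example.
loss_func = losses.ContrastiveLoss(reducer=SumReducer())

embeddings = torch.randn(32, 128)    # illustrative batch of embeddings
labels = torch.randint(0, 4, (32,))  # illustrative integer labels

loss = loss_func(embeddings, labels)  # per-pair losses are summed rather than averaged
```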

1.7.0

Bug Fixes

Fixes an edge case in ArcFaceLoss. Thanks ElisonSherton!

Relevant links:

- https://github.com/KevinMusgrave/pytorch-metric-learning/issues/537
- https://github.com/KevinMusgrave/pytorch-metric-learning/pull/539
- https://github.com/deepinsight/insightface/issues/2126
- https://github.com/ronghuaiyang/arcface-pytorch/issues/48
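
For context, a minimal sketch of a typical ArcFaceLoss setup (the `num_classes`, `embedding_size`, `margin`, and `scale` values below are illustrative, not tied to the fix):

```python
import torch
from pytorch_metric_learning import losses

# ArcFaceLoss holds a learnable class-weight matrix, so it needs its own optimizer.
loss_func = losses.ArcFaceLoss(num_classes=10, embedding_size=128, margin=28.6, scale=64)
loss_optimizer = torch.optim.SGD(loss_func.parameters(), lr=0.01)

embeddings = torch.randn(32, 128)      # batch of embeddings from your trunk model
labels = torch.randint(0, 10, (32,))   # integer class labels

loss = loss_func(embeddings, labels)
loss.backward()
loss_optimizer.step()
```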

1.6.3

Bug Fixes

- Fixed bug where `DistributedMinerWrapper` would crash when `world_size == 1` (542)

1.6.2

1.6.1

Bug Fixes

Fixed a bug in `mean_average_precision` in `AccuracyCalculator`. Previously, the divisor for each sample was the number of correctly retrieved samples. In the new version, the divisor for each sample is `min(k, num_relevant)`.

For example, if class "A" has 11 samples, then `num_relevant` is 11 for every sample with the label "A".
- If `k = 5`, meaning that 5 nearest neighbors are retrieved for each sample, then the divisor will be 5.
- If `k = 100`, meaning that 100 nearest neighbors are retrieved for each sample, then the divisor will be 11.

The bug in previous versions did _not_ affect `mean_average_precision_at_r`.
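
To make the change concrete, here is a small illustrative computation of average precision for a single query using the corrected `min(k, num_relevant)` divisor. This is a sketch of the formula only, not the library's actual implementation:

```python
import numpy as np

def average_precision_at_k(is_relevant, num_relevant):
    """Illustrative only.

    is_relevant: boolean array over the k retrieved neighbors, in ranked order.
    num_relevant: total number of samples sharing the query's label.
    """
    k = len(is_relevant)
    hits = np.cumsum(is_relevant)                   # correct retrievals so far
    precision_at_i = hits / np.arange(1, k + 1)     # precision after each position
    summed = np.sum(precision_at_i * is_relevant)   # only positions that are hits count
    return summed / min(k, num_relevant)            # new divisor (old versions used hits[-1])

# Class "A" has 11 samples, k = 5 neighbors are retrieved, 3 of them are correct:
# the divisor is min(5, 11) = 5, not 3 as in the buggy version.
print(average_precision_at_k(np.array([True, False, True, True, False]), num_relevant=11))
```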

Other minor changes

Added additional shape checks to `AccuracyCalculator.get_accuracy`.
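
As a reminder of the expected shapes, here is a minimal sketch, assuming the 1.x calling convention of `get_accuracy` with explicit query and reference arguments; the sizes and label counts are made up:

```python
import torch
from pytorch_metric_learning.utils.accuracy_calculator import AccuracyCalculator

calculator = AccuracyCalculator(include=("mean_average_precision",), k=5)

query = torch.randn(100, 64)                      # (num_query, embedding_dim)
reference = torch.randn(500, 64)                  # (num_reference, embedding_dim)
query_labels = torch.randint(0, 10, (100,))       # 1-D, length num_query
reference_labels = torch.randint(0, 10, (500,))   # 1-D, length num_reference

accuracies = calculator.get_accuracy(
    query, reference, query_labels, reference_labels,
    embeddings_come_from_same_source=False,
)
print(accuracies)
```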

1.6.0

Features

`DistributedLossWrapper` and `DistributedMinerWrapper` now support `ref_emb` and `ref_labels`:

```python
from pytorch_metric_learning import losses
from pytorch_metric_learning.utils import distributed as pml_dist

# embeddings, labels, ref_emb, and ref_labels are assumed to be tensors defined elsewhere.
loss_func = losses.ContrastiveLoss()
loss_func = pml_dist.DistributedLossWrapper(loss_func)

# ref_emb and ref_labels are the reference embeddings and labels,
# which the distributed wrapper can now handle as well.
loss = loss_func(embeddings, labels, ref_emb=ref_emb, ref_labels=ref_labels)
```

Thanks NoTody for PR 503
