Sockeye

1.18.25

Changed
- Updated requirements to use the MKL versions of MXNet for faster CPU operation.
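
As a rough installation sketch (not taken from the Sockeye docs): assuming the MKL-enabled build is published on PyPI as `mxnet-mkl`, it can be installed before Sockeye so the default MXNet build is not pulled in; the exact version pin should follow Sockeye's own requirements file.

```bash
# Illustrative only: install an MKL-enabled MXNet build for faster CPU inference.
# The version pin should match Sockeye's requirements file for this release.
pip install mxnet-mkl

# Install Sockeye without dependencies so the MKL build is not replaced by the
# default mxnet package; any remaining dependencies (e.g. pyyaml) still need to
# be installed separately per Sockeye's requirements.
pip install sockeye --no-deps
```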

1.18.24

Added
- Dockerfiles and convenience scripts for running `fast_align` to generate lexical tables.
These tables can be used to create top-K lexicons for faster decoding via vocabulary selection ([documentation](https://github.com/awslabs/sockeye/tree/master/contrib/fast_align)).

Changed
- Updated default top-K lexicon size from 20 to 200.
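
A rough sketch of the intended workflow, assuming the `contrib/fast_align` scripts produce a lexical table named `lex.table` and that the `sockeye.lexicon` and `sockeye.translate` CLIs are invoked as below (all paths are placeholders; check each command's `--help` for the exact options in your version):

```bash
# Build a top-K lexicon (K=200 is the new default size) from a fast_align
# lexical table, using a trained model's vocabularies. Depending on the
# Sockeye version, the "create" subcommand may not be required.
python -m sockeye.lexicon create \
    --input lex.table \
    --model model_dir \
    -k 200 \
    --output lexicon.topk

# Decode with vocabulary selection restricted to the top-K lexicon.
python -m sockeye.translate \
    --models model_dir \
    --restrict-lexicon lexicon.topk \
    --input test.src \
    --output test.out
```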

1.18.23

Fixed
- Correctly create the convolutional embedding layers when the encoder is set to `transformer-with-conv-embed`. Previously,
no convolutional layers were added, so a standard Transformer model was trained instead.
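
For reference, a hedged sketch of a training invocation that selects the affected encoder (data and output paths are placeholders); models trained with this setting on earlier versions would have used a standard Transformer encoder instead:

```bash
# Training run that selects the convolutional-embedding Transformer encoder;
# all paths are placeholders for your own data and output directory.
python -m sockeye.train \
    --source train.src \
    --target train.trg \
    --validation-source dev.src \
    --validation-target dev.trg \
    --encoder transformer-with-conv-embed \
    --output model_dir
```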

1.18.22

Fixed
- Make sure the default bucket is large enough with word-based batching when the source is longer than the target. (Previously
there was an edge case where memory usage was sub-optimal with word-based batching and source sentences longer than target sentences.)

1.18.21

Fixed
- Constrained decoding was missing a crucial cast.
- Fixed the test cases that should have caught this.

1.18.20

Changed
- Transformer parametrization flags (model size, number of attention heads, feed-forward layer size) can now optionally be
defined separately for the encoder and decoder. For example, to use a different Transformer model size for the encoder,
pass `--transformer-model-size 1024:512`.
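
As a sketch, a training invocation that sizes the encoder and decoder differently might look like this (paths and values are placeholders; the other per-side flags shown are assumed to accept the same `encoder:decoder` form described above):

```bash
# Use a larger Transformer on the encoder side than on the decoder side.
# The colon-separated form is encoder_value:decoder_value; a single value
# still applies to both sides. Paths and sizes are illustrative only.
python -m sockeye.train \
    --source train.src \
    --target train.trg \
    --validation-source dev.src \
    --validation-target dev.trg \
    --encoder transformer \
    --decoder transformer \
    --transformer-model-size 1024:512 \
    --transformer-attention-heads 16:8 \
    --transformer-feed-forward-num-hidden 4096:2048 \
    --output model_dir
```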
