### Changed

- Optimization: `Decoder` class is now a complete HybridBlock (no `forward` method).
## [2.3.14]
### Changed

- Updated to [MXNet 1.8.0](https://github.com/apache/incubator-mxnet/tree/1.8.0).
- Removed dependency support for CUDA 9.2 (no longer supported by MXNet 1.8).
- Added dependency support for CUDA 11.0 and 11.2.
- Updated Python requirement to 3.7 and later (removed the backported `dataclasses` requirement).
## [2.3.13]
### Added

- Target factors are now also collected for nbest translations (and stored in the JSON output handler).
## [2.3.12]
### Added

- Added a `--config` option to the `prepare_data` CLI to allow setting command-line flags via a YAML config.
- Flags for the `prepare_data` CLI are now stored in the output folder under `args.yaml` (equivalent to the behavior of `sockeye_train`).
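The precedence implied above (explicit CLI flags win over config-file values) can be sketched with `argparse`. This is a minimal illustration, not Sockeye's actual implementation; the flag names below are hypothetical stand-ins, and the already-parsed dict stands in for the contents of the YAML file passed via `--config`.

```python
import argparse

def parse_with_config(arg_list, config_values):
    """Parse CLI flags, letting config-file values act as defaults.

    `config_values` stands in for a parsed YAML mapping from `--config`;
    flags given explicitly on the command line still take precedence.
    """
    parser = argparse.ArgumentParser()
    parser.add_argument("--source")              # hypothetical flag
    parser.add_argument("--num-words", type=int, default=50000)
    # Config values become parser defaults, so CLI flags override them.
    parser.set_defaults(**config_values)
    return parser.parse_args(arg_list)

# Config supplies num_words; the CLI supplies source.
args = parse_with_config(["--source", "train.src"], {"num_words": 30000})
```

Writing the final parsed namespace back out (as `args.yaml` in the output folder) then makes every run reproducible from its own output directory.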
## [2.3.11]
### Added

- Added option `prevent_unk` to avoid generating the `<unk>` token in beam search.
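A common way to implement this kind of option is to mask the `<unk>` vocabulary entry with negative infinity before the beam selects its candidates, so the token can never win. A toy sketch under that assumption (not Sockeye's actual code, which operates on MXNet score tensors):

```python
import math

def apply_prevent_unk(scores, unk_id):
    """Return a copy of per-token log-probabilities with <unk> masked out.

    Setting the <unk> entry to -inf guarantees beam search never picks it,
    regardless of how probable the model considered it.
    """
    masked = list(scores)
    masked[unk_id] = -math.inf
    return masked

# Toy vocabulary of 4 tokens; token id 1 is <unk>.
scores = [-1.2, -0.3, -2.5, -0.9]
masked = apply_prevent_unk(scores, unk_id=1)
best = max(range(len(masked)), key=lambda i: masked[i])  # index 3, not the <unk> at 1
```

Without masking, token 1 (`<unk>`) would have been selected as the most probable continuation.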
## [2.3.10]
### Changed
- Make sure that the top N best params files are retained, even if N > `--keep-last-params`. This ensures that model averaging is not crippled when keeping only a few params files during training, which can yield significant disk-space savings.
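The retention rule above amounts to keeping the union of the most recent checkpoints and the best-scoring ones. A minimal sketch of that selection, assuming a hypothetical helper with per-checkpoint metric values (Sockeye's real cleanup logic lives in its training code):

```python
def params_to_keep(checkpoints, metrics, keep_last, keep_best):
    """Select checkpoints to retain on disk.

    Keeps the last `keep_last` checkpoints (the --keep-last-params window)
    plus the `keep_best` checkpoints with the highest metric, so the top N
    best params files survive even when N > keep_last.
    """
    last = set(checkpoints[-keep_last:])
    best = set(sorted(checkpoints, key=lambda c: metrics[c], reverse=True)[:keep_best])
    return sorted(last | best)

# Checkpoints 1 and 3 score best; 4 and 5 are most recent. All four survive.
kept = params_to_keep(
    checkpoints=[1, 2, 3, 4, 5],
    metrics={1: 0.9, 2: 0.5, 3: 0.8, 4: 0.4, 5: 0.6},
    keep_last=2,
    keep_best=2,
)
```

Deleting everything outside this union is what frees disk space without breaking later model averaging over the best checkpoints.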