Transformer-embedder
1.4.5
Fixed a sentence-length bug.
1.4.4
Fixed a bug with double-`SEP` models.
1.4.3
The returned dictionary now includes a `word_mask` entry, sized to the original sentence length (before sub-tokenization).
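As a hedged sketch of what a word-sized mask over sub-tokens can look like (names and logic here are illustrative, not the library's actual internals): each original word may be split into several sub-tokens, and a mask marking the first sub-token of each word recovers the original sentence length.

```python
def build_word_mask(subtokens_per_word):
    """Return a 0/1 mask over the flattened sub-token sequence where 1
    marks the first sub-token of each original word. Illustrative only:
    the real `word_mask` may be laid out differently."""
    mask = []
    for count in subtokens_per_word:
        mask.append(1)                   # first sub-token of the word
        mask.extend([0] * (count - 1))   # continuation sub-tokens
    return mask

# e.g. ["transformer", "embeddings", "rock"] tokenized into 1, 4, and 1
# sub-tokens respectively:
mask = build_word_mask([1, 4, 1])
# the number of 1s equals the original sentence length in words
```

Selecting the sub-token embeddings at the `1` positions yields one vector per original word, which is the usual reason such a mask is returned.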
1.4.2
Bug fixes
1.4.1
Exposed the transformer's `hidden_size` parameter.
1.4
Added a wrapper around the `Tokenizer` output. Lots of bug fixes.
1.3.2
Updated `Tokenizer` parameters; fixed a batching bug.
1.3.1
Fixed an import bug.
1.3
Added a spaCy tokenizer; set `use_spacy=True` to enable it. A multilingual model is used by default; it can be changed at `Tokenizer` init via the `language` parameter.
1.1
Added a `split_on_space` flag to the tokenizer, updated README.md, fixed some bugs.
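A minimal sketch of what a `split_on_space`-style flag typically controls (assumed behavior, not the library's actual implementation): when enabled, the input is pre-split strictly on single spaces, preserving the caller's word boundaries, instead of relying on a default segmentation.

```python
def pretokenize(text, split_on_space=False):
    """Illustrative pre-tokenization step, not transformer-embedder's code.
    Strict space splitting keeps empty strings for repeated spaces, which
    preserves one-token-per-position alignment with the caller's input."""
    if split_on_space:
        return text.split(" ")   # strict: split on every single space
    return text.split()          # default: collapse any whitespace run

pretokenize("hello  world", split_on_space=True)
pretokenize("hello  world")
```

The difference only shows up on irregular whitespace, but it matters when downstream labels are aligned to whitespace-delimited positions.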
1.0
First release.