textgenrnn

Latest version: v2.0.0


1.5.0

Two major features:

Synthesis (beta)

Generate text using two (or more!) trained models simultaneously. See [this notebook](https://github.com/minimaxir/textgenrnn/blob/master/docs/textgenrnn-synthesize.ipynb) for a demo.

The results are messier than usual, so a lower `temperature` is recommended. Synthesis should work on both char-level and word-level models, or a mix of both. (However, I do not recommend mixing line-delimited and full-text models!)
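Conceptually, synthesizing from multiple models means combining their next-character distributions at each generation step. The sketch below is a hypothetical illustration of one way to do that (an equally weighted average of two distributions), not textgenrnn's actual implementation:

```python
def blend_distributions(dist_a, dist_b, weight=0.5):
    """Blend two next-character probability distributions (dicts of char -> prob).

    `weight` is the share given to the first model; the result is renormalized
    so it sums to 1 and can be sampled from like a single model's output.
    """
    chars = set(dist_a) | set(dist_b)
    blended = {c: weight * dist_a.get(c, 0.0) + (1 - weight) * dist_b.get(c, 0.0)
               for c in chars}
    total = sum(blended.values())
    return {c: p / total for c, p in blended.items()}

# Two toy "models" that disagree on the next character:
model_a = {"a": 0.9, "b": 0.1}
model_b = {"b": 0.5, "c": 0.5}
combined = blend_distributions(model_a, model_b)
```

Blending tends to flatten the combined distribution (each model dilutes the other's favorites), which is consistent with the messier output noted above.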

Please file issues if there are errors!

Generate Progress Bar

Thanks to `tqdm`, all `generate` functions show a progress bar! You can override this by passing `progress=False` to the function.

Additionally, the default generate temperature is now `[1.0, 0.5, 0.2, 0.2]`!
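Temperature rescales the model's output logits before sampling: values below 1.0 sharpen the distribution toward the most likely token, while values near 1.0 keep it diverse. A minimal, self-contained sketch of the standard technique (the logits here are made up for illustration; this is not textgenrnn's internal code):

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random):
    """Softmax the logits at the given temperature, then sample an index."""
    scaled = [l / temperature for l in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# Hypothetical next-token logits: index 0 is the model's favorite.
logits = [2.0, 1.0, 0.1]
```

At a very low temperature (e.g. 0.05) the favorite token is chosen essentially every time; at 1.0 the other tokens still get a meaningful share, which is why a list like `[1.0, 0.5, 0.2, 0.2]` samples a range of creativity levels.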

1.4.1

1.4

Features

* Interactive mode, which lets you control which text is added. (#52, thanks Juanets!)
* Allow backends other than TensorFlow. (#44, thanks torokati44!)
* Allow periodic weights saving. (#37, thanks IrekRybark!)
* Multi-GPU support (beta: see #62)

Fixes

* Handle `prefix` in word-level models correctly.

1.3.2

Emergency bug fix to address #57, which occurs in newer Keras versions.

1.3.1

* Added the ability to cycle temperatures during training (see [this notebook](https://github.com/minimaxir/textgenrnn/blob/master/docs/textgenrnn-temp-cycle.ipynb) for more information).
* Added utf-8 encoding for vocab export.
* Added alias for `train_on_texts(new_model=True)` to `train_new_model`.
* Fixed an issue where specifying `dropout` could cause issues.

1.3

* Added `encode_text_vectors` to encode text using the trained network.
* Added `similarity` to quickly calculate cosine similarity and return the most similar texts.

See [this notebook](https://github.com/minimaxir/textgenrnn/blob/master/docs/textgenrnn-encode-text.ipynb) for details.
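`similarity` is built on cosine similarity between the encoded text vectors. The metric itself, in a self-contained sketch (the helper and variable names here are illustrative, not textgenrnn's API):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(query_vec, vectors, texts, n=3):
    """Rank texts by cosine similarity of their encoded vectors to the query."""
    scored = zip(texts, (cosine_similarity(query_vec, v) for v in vectors))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:n]
```

Identical directions score 1.0 and orthogonal vectors score 0.0, so the top-ranked texts are those whose encodings point in nearly the same direction as the query's.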

