Llamacpp

Latest version: v0.1.14


0.1.14

* Fixes issue #19
* Updates llama.cpp submodule

0.1.13

* Adds support for "infinite text generation" using context swapping (similar to the `main` example in llama.cpp)
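The context-swapping idea borrowed from llama.cpp's `main` example is: when the context window fills up, keep the first `n_keep` tokens (typically the prompt) plus the most recent half of the remaining context, and continue generating from there. A minimal sketch of that policy on plain token lists (the function name and parameters are illustrative, not this package's API):

```python
def swap_context(tokens: list[int], n_ctx: int, n_keep: int) -> list[int]:
    """Sketch of llama.cpp-style context swapping.

    If the token list has filled the context window of size n_ctx,
    keep the first n_keep tokens and the most recent half of the
    remaining context, dropping the middle. Assumes n_ctx - n_keep
    is at least 2 so the recent slice is non-empty.
    """
    if len(tokens) < n_ctx:
        return tokens  # still room in the context window
    n_left = n_ctx - n_keep          # tokens beyond the kept prefix
    recent = tokens[-(n_left // 2):]  # most recent half of that span
    return tokens[:n_keep] + recent
```

After a swap, the discarded middle must be re-evaluated away: generation continues with roughly half the window free again, which is what makes "infinite" generation possible.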

0.1.12

* Makes unit tests more consistent and usable (still not running in workflows as the weights are too large)
* Updates llama.cpp submodule

0.1.11

* Breaking change, but model loading is now practically instantaneous thanks to memory-mapped I/O
* Requires re-generating the weight files using the new convert script (or use the migration script from llama.cpp)
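The reason memory-mapped loading is near-instant: instead of copying the whole weight file into RAM up front, the file is mapped into the address space and the OS faults pages in lazily as they are first touched. A self-contained illustration of the mechanism using Python's standard `mmap` module (the 1 MiB temp file is a stand-in for a weight file, not the package's actual loader):

```python
import mmap
import os
import tempfile

# Create a stand-in "weight file" (1 MiB of zeros).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * (1 << 20))
    path = f.name

# Mapping the file is cheap regardless of its size: no bulk copy
# happens here, only page-table setup.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # The OS loads a page only when it is actually touched:
    first_byte = mm[0]
    size = len(mm)
    mm.close()

os.unlink(path)
```

This is also why the old weight format had to be regenerated: the on-disk layout must match the in-memory layout exactly for the mapping to be usable directly.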

0.1.10

* Adds back `get_tokenizer()` and `add_bos()` that were broken in previous release

0.1.9

* Updates the bindings to work with the new llama.cpp API from https://github.com/ggerganov/llama.cpp/pull/370
* Adds two separate interfaces: `LlamaInference`, which is similar to the bindings in v0.1.8, and the lower-level `LlamaContext` (currently untested)
* The old bindings are still present in `PyLlama.cpp` but are currently not compiled and will be removed at a later date
