Adapter-transformers

Latest version: v3.2.1.post0


Based on transformers v4.5.1

All major new features & changes are described at https://docs.adapterhub.ml/v2_transition.
- all changes merged via #105

Additional changes & Fixes
- Support loading adapters with `load_best_model_at_end` in `Trainer` (calpt via #122)
- Add setter for `active_adapters` property (calpt via #132); see the sketch after this list
- New notebooks for NER, text generation & AdapterDrop (hSterz via #135)
- Enable trainer to load adapters from checkpoints (hSterz via #138)
- Update & clean up example scripts (hSterz via #154 & calpt via #141, #155)
- Add `unfreeze_adapters` param to `train_fusion()` (calpt via #156)
- Ensure eval/train mode is correct for AdapterFusion (calpt via #157)
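
For illustration, a minimal sketch of the new `active_adapters` setter, assuming a v2-era flex-head model; the checkpoint and adapter name below are placeholders, not taken from the release notes:

```python
from transformers import AutoModelWithHeads

# Placeholder checkpoint and adapter name.
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
model.add_adapter("example_task")

# New in this release: assignment via the property setter,
# equivalent to model.set_active_adapters("example_task").
model.active_adapters = "example_task"
```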

adapters1.1.1

Based on transformers v3.5.1

New
- New model with adapter support: DistilBERT (calpt via #67)
- Save label->id mapping of the task together with the adapter prediction head (hSterz via #75); see the sketch after this list
- Automatically set matching label->id mapping together with active prediction head (hSterz via #81)
- Upgraded underlying transformers version (calpt via #55, #72 and #85)
- Colab notebook tutorials showcasing all AdapterHub concepts (calpt via #89)
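
For illustration, a sketch of the label->id handling described above, assuming the v1-era `AdapterType` argument and an `id2label` parameter on `add_classification_head`; all names are placeholders:

```python
from transformers import AdapterType, BertModelWithHeads

model = BertModelWithHeads.from_pretrained("bert-base-uncased")
model.add_adapter("sentiment", AdapterType.text_task)

# The id2label mapping is saved together with the prediction head (#75) ...
model.add_classification_head(
    "sentiment",
    num_labels=2,
    id2label={0: "negative", 1: "positive"},
)

# ... and selecting the active head restores the matching mapping (#81).
model.active_head = "sentiment"
```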

Fixed
- Support for models with flexible heads in pipelines (calpt via #80)
- Adapt input to models with flexible heads to static prediction heads input (calpt via #90)


adapters1.0.1

Based on transformers v2.11.0

- our first release 🎉

0.1.0

**Blog post: https://adapterhub.ml/blog/2023/11/introducing-adapters/**

With the new _Adapters_ library, we fundamentally refactored the adapter-transformers library and added support for new models and adapter methods.

This version is compatible with Hugging Face Transformers version 4.35.2.

For a guide on how to migrate from adapter-transformers to _Adapters_, have a look at https://docs.adapterhub.ml/transitioning.md.
Changes are given compared to the latest [adapter-transformers v3.2.1](https://github.com/adapter-hub/adapters/releases/tag/adapters3.2.1).
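
As a rough sketch of the new entry point described in the transitioning guide (model and adapter names are placeholders, not a definitive migration recipe):

```python
from transformers import AutoModel
import adapters

# Instead of importing model classes from adapter-transformers,
# attach adapter support to an off-the-shelf transformers model.
model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)

model.add_adapter("demo")          # placeholder adapter name
model.set_active_adapters("demo")
```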

New Models & Adapter Methods
- Add LLaMA model integration (hSterz)
- Add X-MOD model integration (calpt via #581)
- Add Electra model integration (hSterz via #583, based on work of amitkumarj441 and pauli31 in #400)
- Add adapter output & parameter averaging (calpt); see the sketch after this list
- Add Prompt Tuning (lenglaender and calpt via #595)
- Add Composition Support to LoRA and (IA)³ (calpt via #598)
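
A minimal sketch of the new adapter methods, using placeholder adapter names and assuming the `average_adapter`, composition, and `PromptTuningConfig` APIs as documented for the _Adapters_ library:

```python
from transformers import AutoModelForSequenceClassification
import adapters
import adapters.composition as ac
from adapters import LoRAConfig, PromptTuningConfig

model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
adapters.init(model)

# LoRA adapters can now be used inside composition blocks (#598).
model.add_adapter("task_a", config=LoRAConfig())
model.add_adapter("task_b", config=LoRAConfig())
model.active_adapters = ac.Stack("task_a", "task_b")

# Parameter averaging: build a new adapter as a weighted average of others.
model.average_adapter("task_avg", ["task_a", "task_b"], weights=[0.5, 0.5])

# Prompt Tuning (#595): prepends trainable prompt tokens to the input.
model.add_adapter("prompt", config=PromptTuningConfig(prompt_length=10))
```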

Breaking Changes
- Renamed bottleneck adapter configs and config strings (example below). The new names can be found here: https://docs.adapterhub.ml/overview.html (calpt)
- Removed the XModelWithHeads classes (lenglaender) _(XModelWithHeads have been deprecated since adapter-transformers version 3.0.0)_
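
For example, a sketch assuming the renamed config strings from the linked overview (the former `pfeiffer` bottleneck config is now addressed as `seq_bn`, `houlsby` as `double_seq_bn`):

```python
from transformers import AutoModel
import adapters

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)

# Previously: model.add_adapter("ex", config="pfeiffer")
model.add_adapter("ex", config="seq_bn")  # new name for the Pfeiffer bottleneck config
```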

Changes Due to the Refactoring
- Refactored the implementation of all already supported models (calpt, lenglaender, hSterz, TimoImhof)
- Separated the model config (`PretrainedConfig`) from the adapters config (`ModelAdaptersConfig`) (calpt); see the sketch after this list
- Updated the whole documentation, Jupyter notebooks and example scripts (hSterz, lenglaender, TimoImhof, calpt)
- Introduced the `load_model` function to load models containing adapters. This replaces the Hugging Face `from_pretrained` function used in the `adapter-transformers` library (lenglaender)
- Shared more logic for adapter composition between different composition blocks (calpt via #591)
- Added backwards compatibility tests that check whether changes to the codebase, such as refactoring, impair the functionality of the library (TimoImhof via #596)
- Refactored the EncoderDecoderModel by introducing a new mixin (`ModelUsingSubmodelsAdaptersMixin`) for models that contain other models (lenglaender)
- Renamed the class `AdapterConfigBase` to `AdapterConfig` (hSterz via #603)
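
A sketch of the config separation, assuming the `adapters_config` attribute name used in the _Adapters_ docs; the checkpoint and adapter name are placeholders:

```python
from transformers import AutoModel
import adapters

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)
model.add_adapter("demo")  # placeholder name

# Model hyperparameters stay in the transformers config ...
print(type(model.config).__name__)           # e.g. BertConfig
# ... while adapter bookkeeping lives in a separate object.
print(type(model.adapters_config).__name__)  # ModelAdaptersConfig
```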

Fixes and Minor Improvements
- Fixed EncoderDecoderModel generate function (lenglaender)
- Fixed deletion of invertible adapters (TimoImhof)
- Automatically convert heads when loading with XAdapterModel (calpt via #594)
- Fixed training T5 adapter models with `Trainer` (calpt via #599)
- Ensured output embeddings are frozen during adapter training (calpt via #537)


