aleph-alpha-client

Latest version: v7.1.0


3.1.1

Bug Fixes

`PromptGranularity` for `ExplanationRequest`s is now an enum. This was previously just a type alias for a union of literals, but we felt a dedicated enum would be more natural.
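As a minimal sketch of the enum in use (the `Sentence` member and the `prompt_granularity` parameter are assumed here for illustration; check the documentation for the full list of granularities):

```python
from aleph_alpha_client import ExplanationRequest, Prompt, PromptGranularity

# Assumption for illustration: PromptGranularity exposes members such as
# Word, Sentence and Paragraph; Sentence is used here as an example.
request = ExplanationRequest(
    prompt=Prompt.from_text("An apple a day, "),
    target=" keeps the doctor away.",
    prompt_granularity=PromptGranularity.Sentence,
)
```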

3.1.0

Features

New `.explain()` method 🎉

Better understand the source of a completion, specifically how much each section of a prompt impacts the completion.

To get started, simply pass in a prompt you used with a model and the completion the model returned, and generate an explanation:

```python
import os

from aleph_alpha_client import Client, CompletionRequest, ExplanationRequest, Prompt

client = Client(token=os.environ["AA_TOKEN"])
prompt = Prompt.from_text("An apple a day, ")
model_name = "luminous-extended"

# create a completion request
request = CompletionRequest(prompt=prompt, maximum_tokens=32)
response = client.complete(request, model=model_name)

# generate an explanation
request = ExplanationRequest(prompt=prompt, target=response.completions[0].completion)
response = client.explain(request, model=model_name)
```

To see the results visually, you can also try this out in our [Playground](https://app.aleph-alpha.com/playground/explanation).

We also have more [documentation and examples](https://docs.aleph-alpha.com/docs/tasks/explain/) available for you to read.

AtMan (Attention Manipulation)

Under the hood, we are leveraging the method from our [AtMan paper](https://arxiv.org/abs/2301.08110) to generate these explanations. We've also exposed these controls anywhere you can submit a prompt!

So if you have other use cases for attention manipulation, you can pass these AtMan controls as part of your prompt items.

```python
from aleph_alpha_client import Image, ImageControl, Prompt, Text, TextControl

Prompt([
    Text("Hello, World!", controls=[TextControl(start=0, length=5, factor=0.5)]),
    Image.from_url(
        "https://cdn-images-1.medium.com/max/1200/1*HunNdlTmoPj8EKpl-jqvBA.png",
        controls=[ImageControl(top=0.25, left=0.25, height=0.5, width=0.5, factor=2.0)],
    ),
])
```

For more information, check out our [documentation and examples](https://docs.aleph-alpha.com/docs/explainability/attention-manipulation/).

3.0.0

Breaking Changes

- Removed deprecated `AlephAlphaClient` and `AlephAlphaModel`. Use `Client` or `AsyncClient` instead.
- Removed deprecated `ImagePrompt`. Import `Image` instead for image prompt items.
- New Q&A interface. We've improved the Q&A implementation, and most parameters are no longer needed (see the sketch after this list).
  - You only need to specify your documents, a query, and optionally the maximum number of answers you want to receive.
  - You no longer specify a model.
- Removed `model` parameter from the `summarize` method
- Removed `model_version` from `SummarizationResponse`

2.17.0

Features

- Allow specifying token overlap behavior in AtMan (see the sketch below) by benbrandt in https://github.com/Aleph-Alpha/aleph-alpha-client/pull/106
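
A minimal sketch of what this could look like (the `ControlTokenOverlap` import and the `token_overlap` parameter are assumptions based on this release note, not a verified signature):

```python
from aleph_alpha_client import ControlTokenOverlap, Prompt, Text, TextControl

# Assumption for illustration: token_overlap decides how a control whose
# character range only partially covers a token is applied to that token.
prompt = Prompt([
    Text(
        "Hello, World!",
        controls=[
            TextControl(
                start=0,
                length=5,
                factor=0.5,
                token_overlap=ControlTokenOverlap.Partial,
            )
        ],
    )
])
```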

Bug Fixes

- Better handle the case when a `Prompt` is supplied a string instead of a list by benbrandt in https://github.com/Aleph-Alpha/aleph-alpha-client/pull/107

Experimental

- New Explain interface for internal testing by ahartel and benbrandt in https://github.com/Aleph-Alpha/aleph-alpha-client/pull/97 https://github.com/Aleph-Alpha/aleph-alpha-client/pull/98 https://github.com/Aleph-Alpha/aleph-alpha-client/pull/99 https://github.com/Aleph-Alpha/aleph-alpha-client/pull/100 https://github.com/Aleph-Alpha/aleph-alpha-client/pull/101 https://github.com/Aleph-Alpha/aleph-alpha-client/pull/102 https://github.com/Aleph-Alpha/aleph-alpha-client/pull/103 https://github.com/Aleph-Alpha/aleph-alpha-client/pull/104

2.16.1

- `AsyncClient` now respects HTTP proxy environment variables, as `Client` always has (see the sketch after this list)
- Update examples links in Readme.md
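
A minimal sketch of the proxy behavior (the proxy URL is made up; the client is assumed to pick it up from the standard environment variables):

```python
import asyncio
import os

from aleph_alpha_client import AsyncClient, CompletionRequest, Prompt

# Assumption for illustration: a proxy configured via HTTPS_PROXY is now
# honored by AsyncClient, just as it is by Client.
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:3128"

async def main():
    async with AsyncClient(token=os.environ["AA_TOKEN"]) as client:
        request = CompletionRequest(prompt=Prompt.from_text("Hello"), maximum_tokens=16)
        response = await client.complete(request, model="luminous-base")
        print(response.completions[0].completion)

asyncio.run(main())
```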

2.16.0

- Add `Image.from_image_source` (see the sketch below)
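
A minimal sketch of how this might be used (assuming, as the name suggests, that it accepts a URL, a local file path, or raw bytes; the file name is made up):

```python
from aleph_alpha_client import Image

# Assumption for illustration: from_image_source dispatches on the kind of source.
from_url = Image.from_image_source("https://cdn-images-1.medium.com/max/1200/1*HunNdlTmoPj8EKpl-jqvBA.png")
from_path = Image.from_image_source("image.png")
with open("image.png", "rb") as f:
    from_bytes = Image.from_image_source(f.read())
```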
