anthropic-bedrock 0.8.0

- Summary: The official Python library for the anthropic-bedrock API
- Upload time: 2024-01-31 00:22:56
- Requires Python: >=3.7
- Homepage: https://github.com/anthropics/anthropic-bedrock-python
# Anthropic Bedrock Python API library

[![PyPI version](https://img.shields.io/pypi/v/anthropic-bedrock.svg)](https://pypi.org/project/anthropic-bedrock/)

The Anthropic Bedrock Python library provides convenient access to the [Anthropic Bedrock](https://docs.anthropic.com/claude/docs/claude-on-amazon-bedrock) REST API from any Python 3.7+
application. It includes type definitions for all request params and response fields,
and offers both synchronous and asynchronous clients powered by [httpx](https://github.com/encode/httpx).

For the non-Bedrock Anthropic API at api.anthropic.com, see [`anthropic-python`](https://github.com/anthropics/anthropic-sdk-python).

## Documentation

The REST API documentation can be found [on docs.anthropic.com](https://docs.anthropic.com/claude/reference/). The full API of this library can be found in [api.md](https://www.github.com/anthropics/anthropic-bedrock-python/blob/main/api.md).

## Installation

```sh
pip install anthropic-bedrock
```

## Usage

The full API of this library can be found in [api.md](https://www.github.com/anthropics/anthropic-bedrock-python/blob/main/api.md).

```python
import anthropic_bedrock
from anthropic_bedrock import AnthropicBedrock

client = AnthropicBedrock(
    # Authenticate by either providing the keys below or by using the default AWS credential providers,
    # such as ~/.aws/credentials or the "AWS_SECRET_ACCESS_KEY" and "AWS_ACCESS_KEY_ID" environment variables.
    aws_access_key="<access key>",
    aws_secret_key="<secret key>",
    # Temporary credentials can be used with aws_session_token.
    # Read more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html.
    aws_session_token="<session_token>",
    # aws_region changes the AWS region to which the request is made. By default, we read AWS_REGION,
    # and if that's not present, we default to us-east-1. Note that we do not read ~/.aws/config for the region.
    aws_region="us-east-2",
)

completion = client.completions.create(
    model="anthropic.claude-v2:1",
    max_tokens_to_sample=256,
    prompt=f"{anthropic_bedrock.HUMAN_PROMPT} how does a court case get to the Supreme Court? {anthropic_bedrock.AI_PROMPT}",
)
print(completion.completion)
```

> This library uses [botocore](https://github.com/boto/botocore) internally for authentication; you can read more about the default providers [here](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html).

## Async usage

Simply import `AsyncAnthropicBedrock` instead of `AnthropicBedrock` and use `await` with each API call:

```python
import asyncio

import anthropic_bedrock
from anthropic_bedrock import AsyncAnthropicBedrock

client = AsyncAnthropicBedrock()


async def main():
    completion = await client.completions.create(
        model="anthropic.claude-v2:1",
        max_tokens_to_sample=256,
        prompt=f"{anthropic_bedrock.HUMAN_PROMPT} how does a court case get to the Supreme Court? {anthropic_bedrock.AI_PROMPT}",
    )
    print(completion.completion)


asyncio.run(main())
```

Functionality between the synchronous and asynchronous clients is otherwise identical.

## Streaming Responses

We provide support for streaming responses using Server-Sent Events (SSE).

```python
from anthropic_bedrock import AnthropicBedrock, HUMAN_PROMPT, AI_PROMPT

client = AnthropicBedrock()

stream = client.completions.create(
    prompt=f"{HUMAN_PROMPT} Your prompt here{AI_PROMPT}",
    max_tokens_to_sample=300,
    model="anthropic.claude-v2:1",
    stream=True,
)
for completion in stream:
    print(completion.completion, end="", flush=True)
```

The async client uses the exact same interface.

```python
from anthropic_bedrock import AsyncAnthropicBedrock, HUMAN_PROMPT, AI_PROMPT

client = AsyncAnthropicBedrock()

stream = await client.completions.create(
    prompt=f"{HUMAN_PROMPT} Your prompt here{AI_PROMPT}",
    max_tokens_to_sample=300,
    model="anthropic.claude-v2:1",
    stream=True,
)
async for completion in stream:
    print(completion.completion, end="", flush=True)
```

## Using types

Nested request parameters are [TypedDicts](https://docs.python.org/3/library/typing.html#typing.TypedDict). Responses are [Pydantic models](https://docs.pydantic.dev), which provide helper methods for things like:

- Serializing back into JSON, `model.model_dump_json(indent=2, exclude_unset=True)`
- Converting to a dictionary, `model.model_dump(exclude_unset=True)`

Typed requests and responses provide autocomplete and documentation within your editor. If you would like to see type errors in VS Code to help catch bugs earlier, set `python.analysis.typeCheckingMode` to `basic`.
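As a minimal sketch of what typed request params buy you (the `CompletionParams` name here is hypothetical, for illustration only; the library ships its own generated type definitions):

```python
from typing import TypedDict

# Hypothetical TypedDict mirroring the shape of completion request params;
# the library's generated types work the same way in the editor.
class CompletionParams(TypedDict, total=False):
    model: str
    max_tokens_to_sample: int
    prompt: str

params: CompletionParams = {
    "model": "anthropic.claude-v2:1",
    "max_tokens_to_sample": 256,
    "prompt": "Hello",
}
# A typo such as "max_tokens", or a str where an int is expected, is flagged
# by the type checker before the request is ever sent.
```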

## Token counting

You can estimate billing for a given request with the `client.count_tokens()` method, e.g.:

```py
client = AnthropicBedrock()
client.count_tokens('Hello world!')  # 3
```

## Handling errors

When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `anthropic_bedrock.APIConnectionError` is raised.

When the API returns a non-success status code (that is, 4xx or 5xx
response), a subclass of `anthropic_bedrock.APIStatusError` is raised, containing `status_code` and `response` properties.

All errors inherit from `anthropic_bedrock.APIError`.

```python
import anthropic_bedrock
from anthropic_bedrock import AnthropicBedrock

client = AnthropicBedrock()

try:
    client.completions.create(
        prompt=f"{anthropic_bedrock.HUMAN_PROMPT} Your prompt here {anthropic_bedrock.AI_PROMPT}",
        max_tokens_to_sample=256,
        model="anthropic.claude-v2:1",
    )
except anthropic_bedrock.APIConnectionError as e:
    print("The server could not be reached")
    print(e.__cause__)  # an underlying Exception, likely raised within httpx.
except anthropic_bedrock.RateLimitError as e:
    print("A 429 status code was received; we should back off a bit.")
except anthropic_bedrock.APIStatusError as e:
    print("Another non-200-range status code was received")
    print(e.status_code)
    print(e.response)
```

Error codes are as follows:

| Status Code | Error Type                 |
| ----------- | -------------------------- |
| 400         | `BadRequestError`          |
| 401         | `AuthenticationError`      |
| 403         | `PermissionDeniedError`    |
| 404         | `NotFoundError`            |
| 422         | `UnprocessableEntityError` |
| 429         | `RateLimitError`           |
| >=500       | `InternalServerError`      |
| N/A         | `APIConnectionError`       |

### Retries

Certain errors are automatically retried 2 times by default, with a short exponential backoff.
Connection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict,
429 Rate Limit, and >=500 Internal errors are all retried by default.

You can use the `max_retries` option to configure or disable retry settings:

```python
from anthropic_bedrock import AnthropicBedrock, HUMAN_PROMPT, AI_PROMPT

# Configure the default for all requests:
client = AnthropicBedrock(
    # default is 2
    max_retries=0,
)

# Or, configure per-request:
client.with_options(max_retries=5).completions.create(
    prompt=f"{HUMAN_PROMPT} Can you help me effectively ask for a raise at work?{AI_PROMPT}",
    max_tokens_to_sample=300,
    model="anthropic.claude-v2:1",
)
```

### Timeouts

By default, requests time out after 10 minutes. You can configure this with a `timeout` option,
which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/#fine-tuning-the-configuration) object:

```python
import httpx
from anthropic_bedrock import AnthropicBedrock, HUMAN_PROMPT, AI_PROMPT

# Configure the default for all requests:
client = AnthropicBedrock(
    # default is 10 minutes
    timeout=20.0,
)

# More granular control:
client = AnthropicBedrock(
    timeout=httpx.Timeout(60.0, read=5.0, write=10.0, connect=2.0),
)

# Override per-request (timeout is in seconds):
client.with_options(timeout=5.0).completions.create(
    prompt=f"{HUMAN_PROMPT} Where can I get a good coffee in my neighbourhood?{AI_PROMPT}",
    max_tokens_to_sample=300,
    model="anthropic.claude-v2:1",
)
```

On timeout, an `APITimeoutError` is thrown.

Note that requests that time out are [retried twice by default](#retries).

## Advanced

### Logging

We use the standard library [`logging`](https://docs.python.org/3/library/logging.html) module.

You can enable logging by setting the environment variable `ANTHROPIC_BEDROCK_LOG` to `debug`.

```shell
$ export ANTHROPIC_BEDROCK_LOG=debug
```
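Because it is the standard `logging` module, you can also wire it up yourself. This sketch assumes the library logs under the `anthropic_bedrock` logger name (an assumption based on its package name):

```python
import logging

# Send log output to stderr; the "anthropic_bedrock" logger name below is
# an assumption based on the package name, not confirmed by the docs.
logging.basicConfig(level=logging.INFO)
logging.getLogger("anthropic_bedrock").setLevel(logging.DEBUG)
```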

### How to tell whether `None` means `null` or missing

In an API response, a field may be explicitly `null`, or missing entirely; in either case, its value is `None` in this library. You can differentiate the two cases with `.model_fields_set`:

```py
if response.my_field is None:
  if 'my_field' not in response.model_fields_set:
    print('Got json like {}, without a "my_field" key present at all.')
  else:
    print('Got json like {"my_field": null}.')
```

### Accessing raw response data (e.g. headers)

The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,

```py
from anthropic_bedrock import AnthropicBedrock, HUMAN_PROMPT, AI_PROMPT

client = AnthropicBedrock()

response = client.completions.with_raw_response.create(
    prompt=f"{HUMAN_PROMPT} Your prompt here{AI_PROMPT}",
    max_tokens_to_sample=300,
    model="anthropic.claude-v2:1",
)
print(response.headers.get('X-My-Header'))

completion = response.parse()  # get the object that `completions.create()` would have returned
print(completion.completion)
```

These methods return a [`LegacyAPIResponse`](https://github.com/anthropics/anthropic-bedrock-python/tree/main/src/anthropic_bedrock/_legacy_response.py) object. This is a legacy class, as we're changing it slightly in the next major version.

For the sync client, this will be mostly the same, except that `content` and `text` will be methods instead of properties. In the async client, all methods will be async.

A migration script will be provided, and the migration in general should be smooth.

#### `.with_streaming_response`

The above interface eagerly reads the full response body when you make the request, which may not always be what you want.

To stream the response body, use `.with_streaming_response` instead, which requires a context manager and only reads the response body once you call `.read()`, `.text()`, `.json()`, `.iter_bytes()`, `.iter_text()`, `.iter_lines()` or `.parse()`. In the async client, these are async methods.

As such, `.with_streaming_response` methods return a different [`APIResponse`](https://github.com/anthropics/anthropic-bedrock-python/tree/main/src/anthropic_bedrock/_response.py) object, and the async client returns an [`AsyncAPIResponse`](https://github.com/anthropics/anthropic-bedrock-python/tree/main/src/anthropic_bedrock/_response.py) object.

```python
with client.completions.with_streaming_response.create(
    max_tokens_to_sample=300,
    model="anthropic.claude-v2:1",
    prompt=f"{HUMAN_PROMPT} Where can I get a good coffee in my neighbourhood?{AI_PROMPT}",
) as response:
    print(response.headers.get("X-My-Header"))

    for line in response.iter_lines():
        print(line)
```

The context manager is required so that the response will reliably be closed.

### Configuring the HTTP client

You can directly override the [httpx client](https://www.python-httpx.org/api/#client) to customize it for your use case, including:

- Support for proxies
- Custom transports
- Additional [advanced](https://www.python-httpx.org/advanced/#client-instances) functionality

```python
import httpx
from anthropic_bedrock import AnthropicBedrock

client = AnthropicBedrock(
    # Or use the `ANTHROPIC_BEDROCK_BASE_URL` env var
    base_url="http://my.test.server.example.com:8083",
    http_client=httpx.Client(
        proxies="http://my.test.proxy.example.com",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
    aws_secret_key="<secret key>",
    aws_access_key="<access key>",
    aws_region="us-east-2",
)
```

### Managing HTTP resources

By default the library closes underlying HTTP connections whenever the client is [garbage collected](https://docs.python.org/3/reference/datamodel.html#object.__del__). You can manually close the client using the `.close()` method if desired, or with a context manager that closes when exiting.
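As a sketch of the explicit-close pattern, `contextlib.closing` gives any object with a `.close()` method deterministic cleanup. A stand-in client is used here so the example is self-contained; the real `AnthropicBedrock` client's own context-manager support may vary by version:

```python
from contextlib import closing

# Stand-in for an AnthropicBedrock client: anything with a .close() method works.
class FakeClient:
    def __init__(self) -> None:
        self.closed = False

    def close(self) -> None:
        self.closed = True

with closing(FakeClient()) as client:
    pass  # make requests here

# Connections are released at block exit, not whenever garbage collection runs.
```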

## Versioning

This package generally follows [SemVer](https://semver.org/spec/v2.0.0.html) conventions, though certain backwards-incompatible changes may be released as minor versions:

1. Changes that only affect static types, without breaking runtime behavior.
2. Changes to library internals which are technically public but not intended or documented for external use. _(Please open a GitHub issue to let us know if you are relying on such internals)_.
3. Changes that we do not expect to impact the vast majority of users in practice.

We take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.

We are keen for your feedback; please open an [issue](https://www.github.com/anthropics/anthropic-bedrock-python/issues) with questions, bugs, or suggestions.

## Requirements

Python 3.7 or higher.

            
