aiohttp-s3-client
================
[![PyPI - License](https://img.shields.io/pypi/l/aiohttp-s3-client)](https://pypi.org/project/aiohttp-s3-client) [![Wheel](https://img.shields.io/pypi/wheel/aiohttp-s3-client)](https://pypi.org/project/aiohttp-s3-client) [![Mypy](http://www.mypy-lang.org/static/mypy_badge.svg)](http://mypy-lang.org/) [![PyPI](https://img.shields.io/pypi/v/aiohttp-s3-client)](https://pypi.org/project/aiohttp-s3-client) [![PyPI](https://img.shields.io/pypi/pyversions/aiohttp-s3-client)](https://pypi.org/project/aiohttp-s3-client) [![Coverage Status](https://coveralls.io/repos/github/mosquito/aiohttp-s3-client/badge.svg?branch=master)](https://coveralls.io/github/mosquito/aiohttp-s3-client?branch=master) ![tox](https://github.com/mosquito/aiohttp-s3-client/workflows/tox/badge.svg?branch=master)
A simple module for putting and getting objects to and from Amazon S3 compatible endpoints.
## Installation
```bash
pip install aiohttp-s3-client
```
## Usage
```python
from http import HTTPStatus
from aiohttp import ClientSession
from aiohttp_s3_client import S3Client
async with ClientSession(raise_for_status=True) as session:
    client = S3Client(
        url="http://s3-url",
        session=session,
        access_key_id="key-id",
        secret_access_key="hackme",
        region="us-east-1"
    )

    # Upload str object to bucket "bucket" and key "str"
    async with client.put("bucket/str", "hello, world") as resp:
        assert resp.status == HTTPStatus.OK

    # Upload bytes object to bucket "bucket" and key "bytes"
    async with client.put("bucket/bytes", b"hello, world") as resp:
        assert resp.status == HTTPStatus.OK

    # Upload AsyncIterable to bucket "bucket" and key "iterable"
    async def gen():
        yield b'some bytes'

    async with client.put("bucket/iterable", gen()) as resp:
        assert resp.status == HTTPStatus.OK

    # Upload file to bucket "bucket" and key "file"
    async with client.put_file("bucket/file", "/path_to_file") as resp:
        assert resp.status == HTTPStatus.OK

    # Check object exists using bucket+key
    async with client.head("bucket/key") as resp:
        assert resp.status == HTTPStatus.OK

    # Get object by bucket+key
    async with client.get("bucket/key") as resp:
        data = await resp.read()

    # Make a presigned URL (valid for one hour)
    url = client.presign_url("GET", "bucket/key", expires=60 * 60)

    # Delete object using bucket+key
    async with client.delete("bucket/key") as resp:
        assert resp.status == HTTPStatus.NO_CONTENT

    # List objects by prefix
    async for result, prefixes in client.list_objects_v2("bucket/", prefix="prefix"):
        # result is a list of metadata objects, one per object stored
        # under the prefix; prefixes is a list of common prefixes
        do_work(result, prefixes)
```
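A presigned URL can be fetched by any HTTP client without further signing. Here is a minimal sketch, assuming the `presign_url()` call shown above (the bucket/key names are illustrative):

```python
import aiohttp
from aiohttp_s3_client import S3Client


async def fetch_presigned(client: S3Client) -> bytes:
    # URL is valid for one hour; anyone holding it can GET the object
    url = client.presign_url("GET", "bucket/key", expires=60 * 60)
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            return await resp.read()
```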
The bucket may be specified as a subdomain, in the object name, or as part of the URL path:
```python
import aiohttp
from aiohttp_s3_client import S3Client
client = S3Client(url="http://bucket.your-s3-host",
session=aiohttp.ClientSession())
async with client.put("key", gen()) as resp:
...
client = S3Client(url="http://your-s3-host",
session=aiohttp.ClientSession())
async with await client.put("bucket/key", gen()) as resp:
...
client = S3Client(url="http://your-s3-host/bucket",
session=aiohttp.ClientSession())
async with client.put("key", gen()) as resp:
...
```
Credentials may be passed as keyword arguments or embedded in the URL:
```python
import aiohttp
from aiohttp_s3_client import S3Client
client_credentials_as_kw = S3Client(
    url="http://your-s3-host",
    access_key_id="key_id",
    secret_access_key="access_key",
    session=aiohttp.ClientSession(),
)

client_credentials_in_url = S3Client(
    url="http://key_id:access_key@your-s3-host",
    session=aiohttp.ClientSession(),
)
```
## Credentials
By default, `S3Client` collects credentials from several sources, in order:
keyword arguments such as `access_key_id=` and `secret_access_key=`, then the
username and password from the passed `url` argument, then environment
variables, and finally the config file.
You can pass credentials explicitly using `aiohttp_s3_client.credentials`
module.
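Conceptually, the default lookup order above is roughly equivalent to merging the explicit sources documented below. This is a sketch for illustration only; the actual constructor logic may differ in its details:

```python
from aiohttp_s3_client.credentials import (
    ConfigCredentials, EnvironmentCredentials, StaticCredentials,
    URLCredentials, merge_credentials,
)

# earlier arguments win, mirroring the priority described above
credentials = merge_credentials(
    StaticCredentials(access_key_id="key-id", secret_access_key="hackme"),
    URLCredentials("http://key-id:hackme@your-s3-host"),
    EnvironmentCredentials(),
    ConfigCredentials(),
)
```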
### `aiohttp_s3_client.credentials.StaticCredentials`
```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import StaticCredentials
credentials = StaticCredentials(
    access_key_id='aaaa',
    secret_access_key='bbbb',
    region='us-east-1',
)
client = S3Client(
    url="http://your-s3-host",
    session=aiohttp.ClientSession(),
    credentials=credentials,
)
```
### `aiohttp_s3_client.credentials.URLCredentials`
```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import URLCredentials
url = "http://key@hack-me:your-s3-host"
credentials = URLCredentials(url, region="us-east-1")
client = S3Client(
url="http://your-s3-host",
session=aiohttp.ClientSession(),
credentials=credentials,
)
```
### `aiohttp_s3_client.credentials.EnvironmentCredentials`
```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import EnvironmentCredentials
credentials = EnvironmentCredentials(region="us-east-1")
client = S3Client(
    url="http://your-s3-host",
    session=aiohttp.ClientSession(),
    credentials=credentials,
)
```
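`EnvironmentCredentials` reads credentials from the process environment. A sketch follows, assuming the standard AWS variable names (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`); check the library source if your deployment uses different names:

```python
import os

from aiohttp_s3_client.credentials import EnvironmentCredentials

# hypothetical values for illustration; normally provided by the deployment
os.environ["AWS_ACCESS_KEY_ID"] = "key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "hackme"

credentials = EnvironmentCredentials(region="us-east-1")
```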
### `aiohttp_s3_client.credentials.ConfigCredentials`
Using user config file:
```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import ConfigCredentials
credentials = ConfigCredentials()  # uses ~/.aws/credentials by default
client = S3Client(
    url="http://your-s3-host",
    session=aiohttp.ClientSession(),
    credentials=credentials,
)
```
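The file is presumably in the standard AWS shared credentials format, e.g.:

```ini
[default]
aws_access_key_id = key-id
aws_secret_access_key = hackme
```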
Using a custom config location:
```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import ConfigCredentials
credentials = ConfigCredentials("~/.my-custom-aws-credentials")
client = S3Client(
    url="http://your-s3-host",
    session=aiohttp.ClientSession(),
    credentials=credentials,
)
```
### `aiohttp_s3_client.credentials.merge_credentials`
This function collects the passed credentials instances and returns a new one
containing all non-blank fields from them. Earlier arguments take priority.
```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import (
    ConfigCredentials, EnvironmentCredentials, merge_credentials
)

credentials = merge_credentials(
    EnvironmentCredentials(),
    ConfigCredentials(),
)
client = S3Client(
    url="http://your-s3-host",
    session=aiohttp.ClientSession(),
    credentials=credentials,
)
```
### `aiohttp_s3_client.credentials.MetadataCredentials`
Fetches credentials from the cloud instance metadata service:
```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import MetadataCredentials
credentials = MetadataCredentials()

# start refreshing credentials from the metadata server
await credentials.start()
client = S3Client(
    url="http://your-s3-host",
    session=aiohttp.ClientSession(),
    credentials=credentials,
)
await credentials.stop()
```
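Since `start()` begins a background refresh from the metadata server, it is good practice to guarantee `stop()` runs. A minimal sketch using only the `start()`/`stop()` calls shown above:

```python
import aiohttp
from aiohttp_s3_client import S3Client
from aiohttp_s3_client.credentials import MetadataCredentials


async def main():
    credentials = MetadataCredentials()
    await credentials.start()  # begin refreshing from the metadata server
    try:
        async with aiohttp.ClientSession() as session:
            client = S3Client(
                url="http://your-s3-host",
                session=session,
                credentials=credentials,
            )
            ...  # use the client
    finally:
        await credentials.stop()
```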
## Multipart upload
For uploading large files, [multipart uploading](https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html)
can be used. It lets you upload multiple parts of a file to S3 concurrently.
S3Client handles retries of part uploads and calculates a hash for each part
for integrity checks.
```python
import aiohttp
from aiohttp_s3_client import S3Client
client = S3Client(url="http://your-s3-host", session=aiohttp.ClientSession())
await client.put_file_multipart(
    "test/bigfile.csv",
    headers={
        "Content-Type": "text/csv",
    },
    workers_count=8,
)
```
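Here is a hedged end-to-end sketch: upload a large file with multipart, then confirm the object exists with a `HEAD` request, using only calls documented in this README (paths are illustrative):

```python
from http import HTTPStatus

import aiohttp
from aiohttp_s3_client import S3Client


async def upload_and_verify():
    async with aiohttp.ClientSession() as session:
        client = S3Client(url="http://your-s3-host", session=session)
        await client.put_file_multipart(
            "test/bigfile.csv",
            headers={"Content-Type": "text/csv"},
            workers_count=8,
        )
        # verify the upload landed
        async with client.head("test/bigfile.csv") as resp:
            assert resp.status == HTTPStatus.OK
```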
## Parallel download to file
S3 supports `GET` requests with a `Range` header, so objects can be downloaded
in parallel over multiple connections for a speedup.
S3Client handles retries of the partial requests and uses the `ETag` header to
make sure the file does not change during the download.
If your system supports the `pwrite` syscall (Linux, macOS, etc.), it is used
to write all parts to a single file simultaneously. Otherwise, each worker
writes to its own file, and the files are concatenated after the download.
```python
import aiohttp
from aiohttp_s3_client import S3Client
client = S3Client(url="http://your-s3-host", session=aiohttp.ClientSession())
await client.get_file_parallel(
    "dump/bigfile.csv",
    "/home/user/bigfile.csv",
    workers_count=8,
)
```
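For comparison, a single ranged `GET` looks like the sketch below. It assumes `client.get()` forwards extra headers to the underlying aiohttp request; the bucket/key and byte range are illustrative:

```python
async def first_kilobyte(client) -> bytes:
    # request only the first 1024 bytes of the object
    async with client.get(
        "dump/bigfile.csv", headers={"Range": "bytes=0-1023"},
    ) as resp:
        # a server honouring the range replies 206 Partial Content
        return await resp.read()
```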