A simple module for putting and getting objects from Amazon S3 compatible endpoints
aiohttp-s3-client
This module provides a simple asynchronous client for putting and getting
objects on Amazon S3 compatible endpoints.
Installation
pip install aiohttp-s3-client
Usage
from http import HTTPStatus

from aiohttp import ClientSession

from aiohttp_s3_client import S3Client


async with ClientSession(raise_for_status=True) as session:
    client = S3Client(
        url="http://s3-url",
        session=session,
        access_key_id="key-id",
        secret_access_key="hackme",
        region="us-east-1"
    )

    # Upload str object to bucket "bucket" and key "str"
    async with client.put("bucket/str", "hello, world") as resp:
        assert resp.status == HTTPStatus.OK

    # Upload bytes object to bucket "bucket" and key "bytes"
    resp = await client.put("bucket/bytes", b"hello, world")
    assert resp.status == HTTPStatus.OK

    # Upload AsyncIterable to bucket "bucket" and key "iterable"
    async def gen():
        yield b"some bytes"

    resp = await client.put("bucket/iterable", gen())
    assert resp.status == HTTPStatus.OK

    # Upload file to bucket "bucket" and key "file"
    resp = await client.put_file("bucket/file", "/path_to_file")
    assert resp.status == HTTPStatus.OK

    # Get object by bucket+key
    resp = await client.get("bucket/key")
    data = await resp.read()
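Because put() accepts any AsyncIterable, a large file can be streamed to S3 in chunks instead of being read into memory at once. A minimal sketch of such a generator (file_chunks is a hypothetical helper, not part of this library):

```python
async def file_chunks(path, chunk_size=65536):
    # Hypothetical helper: yields a file's contents chunk by chunk, so the
    # whole file never has to fit in memory. The resulting async generator
    # can be passed directly to client.put(...).
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            yield chunk
```

It would then be used like the AsyncIterable example above, e.g. `resp = await client.put("bucket/big-file", file_chunks("/path_to_file"))`.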
The bucket may be specified as a subdomain or as part of the object name:

client = S3Client(url="http://bucket.your-s3-host", ...)
resp = await client.put("key", gen())

client = S3Client(url="http://your-s3-host", ...)
resp = await client.put("bucket/key", gen())
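These two forms correspond to the standard S3 addressing styles (virtual-hosted vs. path style). For illustration only, here are the request URLs each style produces ("your-s3-host" is a placeholder endpoint):

```python
bucket, key = "bucket", "key"

# Virtual-hosted style: the bucket lives in the hostname,
# so only the key appears in the path.
virtual_hosted = f"http://{bucket}.your-s3-host/{key}"

# Path style: the bucket is the first segment of the path,
# which is why "bucket/key" works as an object name.
path_style = f"http://your-s3-host/{bucket}/{key}"
```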
Credentials may be passed as keyword arguments or embedded in the URL:

client = S3Client(url="http://your-s3-host", access_key_id="key_id",
                  secret_access_key="access_key", ...)

client = S3Client(url="http://key_id:access_key@your-s3-host", ...)
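The URL form uses the standard userinfo syntax (user:password@host) from RFC 3986; as a quick illustration, Python's urllib.parse recovers the two parts:

```python
from urllib.parse import urlparse

# Credentials embedded in the endpoint URL are ordinary userinfo
# components; urlparse exposes them as username and password.
parts = urlparse("http://key_id:access_key@your-s3-host")
print(parts.username)  # key_id
print(parts.password)  # access_key
print(parts.hostname)  # your-s3-host
```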