asyncpg-lostream


Name: asyncpg-lostream
Version: 0.1.2
Home page: https://github.com/Red-HAP/asyncpg-lostream
Summary: CRUD on PostgreSQL large objects using async drivers and asyncio. Data are read and written in chunks.
Upload time: 2022-12-06 05:18:45
Maintainer: HAP
Author: Red Hat, Inc.
Requires Python: >=3.9
License: Apache-2.0
Keywords: asyncio, asyncpg, sqlalchemy
Requirements: anyio, asyncio, faker, pre-commit, asyncpg, psycopg2-binary, sqlalchemy, pytest, pytest-asyncio, twine, build, wheel, configparser, packaging
# asyncpg-lostream

CRUD on PostgreSQL large objects using async drivers and asyncio. It is tested against SQLAlchemy.

## Purpose

Synchronous drivers such as `psycopg2` support large objects via a file-like interface built on libpq. Async drivers such as `asyncpg` do not (at this time) support large objects at all. This module provides an interface to PostgreSQL large objects that is compatible with asyncio and async database drivers. It works by calling the server-side PostgreSQL functions that front the large object store.

This interface is designed to operate on data by reading and writing in chunks so as to stream the data to and from the large object store.
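The chunked pattern is the same one used for any file-like stream: read a bounded buffer, write it out, repeat until exhausted. A minimal stdlib-only sketch of the idea, independent of the database (the function name and `CHUNK_SIZE` value are illustrative, not part of this package's API):

```python
import io

CHUNK_SIZE = 64 * 1024  # illustrative; the real class chooses its own chunk size


def copy_in_chunks(src, dst, chunk_size=CHUNK_SIZE):
    """Stream src to dst without holding the whole payload in memory."""
    total = 0
    # read() returns b"" at end of stream, which ends the loop
    while chunk := src.read(chunk_size):
        dst.write(chunk)
        total += len(chunk)
    return total


src = io.BytesIO(b"x" * 200_000)
dst = io.BytesIO()
print(copy_in_chunks(src, dst))  # 200000
```

Because only one chunk is resident at a time, memory use is bounded by the chunk size rather than by the size of the large object.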

This codebase is not tied to or affiliated with asyncpg. It does use SQLAlchemy's AsyncConnection class in its typing.

## PostgreSQL Large Objects

A large object is a special store in PostgreSQL that behaves much like a file: the large object itself is the record. The stored data type is `bytea` (`bytes` in Python). Reads and writes against an allocated large object go through server-side read and write functions. (There are more functions, but they are out of scope for this project.) The large object id has data type `oid`. PostgreSQL backs this with two tables: `pg_largeobject`, which holds the data itself, and `pg_largeobject_metadata`, which links to the owner oid. The two tables are joined by the `oid` reference.

When associating a large object with a table record, add a column of type `oid` to hold the `oid` value allocated when the large object is created.

See the PostgreSQL documentation [here](https://www.postgresql.org/docs/current/largeobjects.html).
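The "supporting stored functions" mentioned above are PostgreSQL's built-in server-side large-object functions; any driver that can execute SQL can drive them. A hedged sketch of the statements involved (the function names are real PostgreSQL built-ins; the bind-parameter style `:oid`, `:fd`, etc. is illustrative):

```python
# PostgreSQL's built-in server-side large-object functions, expressed as
# parameterized SQL. Bind-parameter names (:oid, :fd, ...) are illustrative.
LO_CREATE = "SELECT lo_create(0)"          # allocate a new large object, returns its oid
LO_OPEN = "SELECT lo_open(:oid, :mode)"    # open by oid, returns a descriptor (fd)
LO_READ = "SELECT loread(:fd, :length)"    # read up to :length bytes from the fd
LO_WRITE = "SELECT lowrite(:fd, :data)"    # write bytes at the fd's current position
LO_CLOSE = "SELECT lo_close(:fd)"          # close the descriptor
LO_UNLINK = "SELECT lo_unlink(:oid)"       # delete the large object entirely
```

The descriptor returned by `lo_open` is only valid for the duration of the enclosing transaction, which is why chunked reads and writes must happen on one connection inside one transaction.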

## Utilization

### Explicit Create

```python
from asyncpg_lostream.lostream import PGLargeObject

# It is the responsibility of the caller to resolve how an
# AsyncEngine is created and how an AsyncConnection is created.

# Create a large object
lob_oid = await PGLargeObject.create_large_object(async_connection)

# Open a large object for read and write
pgl = PGLargeObject(async_connection, lob_oid, mode="rw")

with open("my_data.txt", "r") as data:
    while buff := data.read(pgl.chunk_size):
        await pgl.write(buff.encode())

pgl.close()
```

### Context Manager Create

```python
from asyncpg_lostream.lostream import PGLargeObject

# It is the responsibility of the caller to resolve how an
# AsyncEngine is created and how an AsyncConnection is created.

with open("my_data.txt", "r") as data:
    async with PGLargeObject(async_connection, 0, "w") as pgl:
        while buff := data.read(pgl.chunk_size):
            await pgl.write(buff.encode())
```

### Context Manager Read

```python
from asyncpg_lostream.lostream import PGLargeObject

# It is the responsibility of the caller to resolve how an
# AsyncEngine is created and how an AsyncConnection is created.

async with PGLargeObject(async_connection, existing_lob_oid, "r") as pgl:
    async for buff in pgl:
        print(buff.decode())
```
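One caveat when decoding chunk by chunk as above: a chunk boundary can split a multibyte UTF-8 character, making a bare `buff.decode()` raise `UnicodeDecodeError`. The stdlib's incremental decoder buffers partial sequences across chunks; a minimal sketch, independent of the database:

```python
import codecs

decoder = codecs.getincrementaldecoder("utf-8")()

# Simulate chunks whose boundary splits the 2-byte character "é" (b"\xc3\xa9").
chunks = [b"caf\xc3", b"\xa9 au lait"]

text = "".join(decoder.decode(chunk) for chunk in chunks)
text += decoder.decode(b"", final=True)  # flush any buffered partial bytes
print(text)  # café au lait
```

The same pattern applies inside the `async for` loop: feed each `buff` to the incremental decoder instead of calling `decode()` directly.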

## Development

### Environment

1. Create a virtual environment
    ```bash
    python -m venv /path/to/venv
    ```
2. Activate the virtual environment
    ```bash
    source /path/to/venv/bin/activate
    ```
3. Ensure pip is up to date
    ```bash
    pip install --upgrade pip
    ```
4. Install packages from `requirements.txt`
    ```bash
    pip install -r requirements.txt
    ```
5. Install `pre-commit`
   ```bash
   pre-commit install
   ```

### Workflow

`make`, `docker-compose` and `docker` are required for development.

To list the make targets, use `make help`.

To start a local PostgreSQL 13 container: `make start_db`

To shut down the local PostgreSQL 13 container: `make stop_db`

After making changes, create your unit tests in the `asyncpg-lostream/tests` directory.

Test your changes with the command `make test`.

## Packaging

If you intend to make a release, change the version in the `setup.cfg` file. This value will be copied to the module's `__version__` file.

Build the package using `make build`. This will run the tests and then build the artifacts. These will be put into the `dist/` directory.

For instructions on uploading to PyPI, see the [Packaging Python Projects Documentation](https://packaging.python.org/en/latest/tutorials/packaging-projects/#uploading-the-distribution-archives).

            
