:Name: gzip-stream
:Version: 1.2.0
:Home page: https://github.com/leenr/gzip-stream
:Summary: Compress stream by GZIP on the fly.
:Upload time: 2021-12-08 15:47:19
:Author: leenr
:Maintainer: leenr
:Requires Python: ~=3.5
:Keywords: gzip, compression
===========
gzip-stream
===========

`gzip-stream` is a super-tiny library that helps you compress streams with
GZIP on the fly.

A `GZIPCompressedStream` instance acts like any other stream (in fact,
`GZIPCompressedStream` inherits `io.RawIOBase <https://docs.python.org/3/library/io.html#io.RawIOBase>`_),
but wraps another stream and compresses it on the fly.

.. code-block:: python

    from gzip_stream import GZIPCompressedStream
    from my_upload_lib import MyUploadClient

    upload_client = MyUploadClient()
    # open in binary mode: the wrapped stream must yield bytes
    with open('my_very_big_1tb_file.txt', 'rb') as file_to_upload:
        compressed_stream = GZIPCompressedStream(
            file_to_upload,
            compression_level=7
        )
        upload_client.upload_fileobj(compressed_stream)

`GZIPCompressedStream` does not read the entire source stream at once.
Instead, it reads the source in chunks until the compressed output is large
enough to satisfy the requested read size.
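
The chunked-compression technique can be sketched with only the standard
library. The helper below is illustrative (``gzip_chunks`` is not part of
gzip-stream's API); it compresses a stream piece by piece instead of loading
it whole:

.. code-block:: python

    import gzip
    import io
    import zlib

    def gzip_chunks(fileobj, chunk_size=64 * 1024, level=7):
        """Yield gzip-compressed chunks of ``fileobj``."""
        # wbits = 16 + MAX_WBITS tells zlib to emit a gzip header and trailer.
        compressor = zlib.compressobj(level, zlib.DEFLATED, 16 + zlib.MAX_WBITS)
        while True:
            chunk = fileobj.read(chunk_size)
            if not chunk:
                break
            compressed = compressor.compress(chunk)
            if compressed:  # the compressor may buffer input internally
                yield compressed
        yield compressor.flush()

    # Round-trip check against the standard gzip module.
    data = b"hello world " * 1000
    compressed = b"".join(gzip_chunks(io.BytesIO(data)))
    assert gzip.decompress(compressed) == data

Note that ``compressor.compress()`` may return an empty byte string while it
buffers input; the final ``flush()`` emits whatever remains plus the gzip
trailer.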

The `AsyncGZIPDecompressedStream` class can read asynchronously from another
source, decompressing zlib and gzip data on the fly:

.. code-block:: python

    # aiobotocore example

    import aiobotocore

    from gzip_stream import AsyncGZIPDecompressedStream
    from my_upload_lib import MyAsyncUploadClient

    AWS_ACCESS_KEY_ID = "KEY_ID"
    AWS_SECRET_ACCESS_KEY = "ACCESS_KEY"
    BUCKET = "AWESOME_BUCKET"

    upload_client = MyAsyncUploadClient()
    session = aiobotocore.get_session()
    async with session.create_client(
        service_name="s3",
        endpoint_url="s3_endpoint",
        aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
        aws_access_key_id=AWS_ACCESS_KEY_ID,
    ) as client:
        response = await client.get_object(Bucket=BUCKET, Key='my_very_big_1tb_file.txt.gz')
        async for decompressed_chunk in AsyncGZIPDecompressedStream(response["Body"]):
            await upload_client.upload_fileobj(decompressed_chunk)
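
For contrast, the same decompression can be sketched synchronously with only
the standard library (``decompress_chunks`` is an illustrative name, not part
of gzip-stream's API); ``wbits = 32 + MAX_WBITS`` lets zlib auto-detect
whether the input has a zlib or a gzip header:

.. code-block:: python

    import gzip
    import io
    import zlib

    def decompress_chunks(fileobj, chunk_size=64 * 1024):
        """Yield decompressed chunks from a zlib- or gzip-compressed stream."""
        # wbits = 32 + MAX_WBITS auto-detects zlib and gzip headers.
        decompressor = zlib.decompressobj(32 + zlib.MAX_WBITS)
        while True:
            chunk = fileobj.read(chunk_size)
            if not chunk:
                break
            yield decompressor.decompress(chunk)
        yield decompressor.flush()

    # Round-trip check: compress with the stdlib, decompress in chunks.
    data = b"payload " * 5000
    stream = io.BytesIO(gzip.compress(data))
    assert b"".join(decompress_chunks(stream)) == data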


The module works on Python ~= 3.5 (3.5 or any later 3.x release).

Installation
------------
.. code-block:: bash

    pip install gzip-stream


License
-------
Public Domain: `CC0 1.0 Universal <https://creativecommons.org/publicdomain/zero/1.0/>`_.



            
