:Name: s3streamer
:Version: 3.1.0
:Summary: Stream files to AWS S3 using multipart upload.
:Home page: https://gitlab.com/fer1035_python/modules/pypi-s3streamer
:Upload time: 2025-07-24 08:27:59
:Author: Ahmad Ferdaus Abd Razak
:Requires Python: <4.0,>=3.9
:License: GPL-2.0-only
:Keywords: s3, aws, api, multipart, upload
==============
**s3streamer**
==============

Overview
--------

Stream files to AWS S3 using multipart upload.

.. image:: https://gitlab.com/fer1035_python/modules/pypi-s3streamer/-/raw/main/S3Streamer.png
   :width: 400
   :alt: Flowchart

A frontend module to upload files to AWS S3 storage. It supports large files by splitting them into smaller chunks, uploading the chunks with multiprocessing, and recombining them into the original file in the specified S3 bucket. You can configure both the size of each chunk and how many chunks are sent in a single run; the defaults are listed in **Optional Arguments** below.

The solution provides a dashboard in `CloudWatch <https://console.aws.amazon.com/cloudwatch/home#dashboards/>`_ to monitor file operations. You may also need to deploy the API manually in `API Gateway <https://console.aws.amazon.com/apigateway/>`_ after the initial deployment and after any subsequent changes.

Prerequisites
-------------

- An AWS S3 bucket to receive uploads.
- An AWS Lambda function to perform backend tasks.
- The AWS `CloudFormation template <https://gitlab.com/fer1035_python/modules/pypi-s3streamer/-/tree/main/cloudformation/s3streamer.yaml>`_ to create these resources is available; alternatively, log in to your AWS account and click on this `quick link <https://console.aws.amazon.com/cloudformation/home?#/stacks/create/review?templateURL=https://warpedlenses-public.s3.ap-southeast-1.amazonaws.com/cloudformation/s3streamer.yaml>`_.
- The endpoint URL and API key will be created by `CloudFormation <https://console.aws.amazon.com/cloudformation/>`_. They can be found in the stack's **Outputs** section.

Required Arguments
------------------

- file_name: Full or relative local path to the file.

Optional Arguments
------------------

- path: Destination path in the S3 bucket (default: **""** for the root of the bucket)
- request_url: URL of the API endpoint (default: **None**)
- request_api_key: API key for the endpoint (default: **None**)
- parts: Number of multiprocessing parts to send simultaneously (default: **10**)
- part_size: Size of each part in MB (default: **100**)
- tmp_path: Location of local temporary directory to store temporary files created by the module (default: **"/tmp"**)
- purge: Whether to purge the specified file instead of uploading it (default: **False**)
- force: Whether to force the upload even if the file already exists in the S3 bucket (default: **False**)
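The ``parts`` and ``part_size`` arguments together determine how a file is split and how many upload rounds are needed. A minimal sketch of that arithmetic (``chunk_plan`` is a hypothetical helper for illustration, not part of the package):

.. code-block:: python

   # Sketch: how many chunks a file yields and how many upload rounds are
   # needed, given the documented defaults (part_size=100 MB, parts=10
   # chunks sent simultaneously).
   import math

   def chunk_plan(file_size_bytes: int, part_size_mb: int = 100, parts: int = 10):
       """Return (total_chunks, rounds) for a multipart upload."""
       part_bytes = part_size_mb * 1024 * 1024
       total_chunks = max(1, math.ceil(file_size_bytes / part_bytes))
       rounds = math.ceil(total_chunks / parts)
       return total_chunks, rounds

   # A 4.7 GB DVD image with the defaults: 45 chunks, sent in 5 rounds
   # of up to 10 parallel parts.
   print(chunk_plan(4_700_000_000))  # (45, 5)

Lowering ``part_size`` reduces the memory and temporary-disk footprint per chunk at the cost of more requests; raising ``parts`` increases parallelism.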

Usage
-----

Installation in BASH:

.. code-block:: BASH

   pip3 install s3streamer
   # or
   python3 -m pip install s3streamer

In Python3:

- To upload a file to S3:

   .. code-block:: PYTHON

      import s3streamer

      if __name__ == "__main__":
         response = s3streamer.stream(
            "myfile.iso",
            path="",
            request_url="https://s3streamer.api.example.com/upload",
            request_api_key="my-api-key",
            parts=5,
            part_size=30,
            tmp_path="/Users/me/Desktop",
            purge=False,
            force=False
         )
         print(response)

- To remove a file from S3:

   .. code-block:: PYTHON

      import s3streamer

      if __name__ == "__main__":
         response = s3streamer.stream(
            "myfile.iso",
            path="",
            request_url="https://s3streamer.api.example.com/upload",
            request_api_key="my-api-key",
            purge=True
         )
         print(response)

To simplify operations, the endpoint and API key can also be set as environment variables:

.. code-block:: BASH

   export S3STREAMER_ENDPOINT="https://s3streamer.api.example.com/upload"
   export S3STREAMER_API_KEY="my-api-key"
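The likely resolution order, explicit arguments first and environment variables as the fallback, can be sketched as follows (``resolve_endpoint`` is a hypothetical helper shown for illustration, not the package's actual code):

.. code-block:: python

   # Sketch of resolving the endpoint and API key: explicit keyword
   # arguments take precedence, then the S3STREAMER_* environment variables.
   import os

   def resolve_endpoint(request_url=None, request_api_key=None):
       url = request_url or os.environ.get("S3STREAMER_ENDPOINT")
       key = request_api_key or os.environ.get("S3STREAMER_API_KEY")
       if not url or not key:
           raise ValueError(
               "endpoint URL and API key must be given as arguments "
               "or environment variables"
           )
       return url, key

   os.environ["S3STREAMER_ENDPOINT"] = "https://s3streamer.api.example.com/upload"
   os.environ["S3STREAMER_API_KEY"] = "my-api-key"
   print(resolve_endpoint())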

By doing so, the upload command can be simplified to:

.. code-block:: PYTHON

   import s3streamer

   if __name__ == "__main__":
      response = s3streamer.stream("myfile.iso")
      print(response)

which uses the default values for all optional (keyword) arguments. Alternatively, use the included CLI tool (all arguments from the help section below apply):

.. code-block:: BASH

   streams3 -f myfile.iso

Help can be accessed by running:

.. code-block:: BASH

   streams3 --help

   usage: s3streamer [options]

   Stream files to AWS S3 using multipart upload.

   options:
      -h, --help            show this help message and exit
      -V, --version         show program's version number and exit
      -f, --local_file_path [LOCAL_FILE_PATH]
                              Local file path to upload or purge.
      -r, --remote_file_path [REMOTE_FILE_PATH]
                              Remote file path in S3 bucket. Default: empty string
                              for root folder.
      -u, --request_url [REQUEST_URL]
                              S3Streamer API endpoint URL. Default:
                              S3STREAMER_ENDPOINT environment variable.
      -k, --request_api_key [REQUEST_API_KEY]
                              S3Streamer API key. Default: S3STREAMER_API_KEY
                              environment variable.
      -p, --parts [PARTS]   Number of parts to upload in parallel. Default: 10.
      -s, --part_size [PART_SIZE]
                              Size of each part in MB. Default: 100.
      -t, --tmp_path [TMP_PATH]
                              Temporary path for chunked files. Default: /tmp.
      -d, --purge [PURGE]   Purge file from S3 storage. Default: False.
      -o, --force [FORCE]   Force upload even if file already exists. Default:
                              False.

If the upload is successful, the file will be available in the S3 bucket at the remote path you specified, e.g. at **installer/images/myfile.iso** when the remote path is **installer/images**.
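The remote object key is presumably composed from the remote path and the local file's base name. A minimal sketch of that composition (``remote_key`` is a hypothetical helper for illustration, not the package's actual code):

.. code-block:: python

   # Sketch: building the S3 object key from the remote path and the
   # base name of the local file, with forward slashes regardless of OS.
   import posixpath

   def remote_key(path: str, file_name: str) -> str:
       base = posixpath.basename(file_name.replace("\\", "/"))
       return posixpath.join(path, base) if path else base

   print(remote_key("installer/images", "/home/me/myfile.iso"))
   # installer/images/myfile.iso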
