enhanced-synapse-s3-provider


Name: enhanced-synapse-s3-provider
Version: 1.2.50
Home page: https://github.com/elyesbenamor/synapse-s3-storage-provider
Summary: Enhanced S3 storage provider for Synapse with improved cleanup functionality
Author: elyesbenamor
Upload time: 2025-01-24 06:56:05
Synapse S3 Storage Provider
===========================

This module can be used by Synapse as a storage provider, allowing it to fetch
and store media in Amazon S3 (or any S3-compatible service).


Usage
-----

The `s3_storage_provider.py` module must be on the `PYTHONPATH` when starting
Synapse.
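
If the package was installed from PyPI into Synapse's virtualenv, the module is
already importable and no extra step is needed. Otherwise, a minimal sketch
(the checkout path is hypothetical):

```
# Hypothetical checkout location; adjust to wherever the provider lives.
export PYTHONPATH="$PYTHONPATH:/opt/synapse-s3-storage-provider"
# Then start Synapse as usual, e.g.:
synctl start
```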

An example entry in the Synapse config:

```yaml
media_storage_providers:
- module: s3_storage_provider.S3StorageProviderBackend
  store_local: True
  store_remote: True
  store_synchronous: True
  config:
    bucket: <S3_BUCKET_NAME>
    # All of the below options are optional, for use with non-AWS S3-like
    # services, or to specify access tokens here instead of some external method.
    region_name: <S3_REGION_NAME>
    endpoint_url: <S3_LIKE_SERVICE_ENDPOINT_URL>
    access_key_id: <S3_ACCESS_KEY_ID>
    secret_access_key: <S3_SECRET_ACCESS_KEY>
    session_token: <S3_SESSION_TOKEN>

    # Server Side Encryption for Customer-provided keys
    #sse_customer_key: <S3_SSEC_KEY>
    # Your SSE-C algorithm is very likely AES256
    # Default is AES256.
    #sse_customer_algo: <S3_SSEC_ALGO>

    # The object storage class used when uploading files to the bucket.
    # Default is STANDARD.
    #storage_class: "STANDARD_IA"

    # Prefix for all media in bucket, can't be changed once media has been uploaded
    # Useful if sharing the bucket between Synapses
    # Blank if not provided
    #prefix: "prefix/to/files/in/bucket"

    # The maximum number of concurrent threads which will be used to connect
    # to S3. Each thread manages a single connection. Default is 40.
    #
    #threadpool_size: 20
```
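
As a concrete instance of the non-AWS case mentioned above, here is a sketch
pointing the provider at a self-hosted MinIO endpoint (the bucket name and
hostname are illustrative):

```yaml
media_storage_providers:
- module: s3_storage_provider.S3StorageProviderBackend
  store_local: True
  store_remote: True
  store_synchronous: True
  config:
    bucket: synapse-media                    # illustrative bucket name
    region_name: us-east-1                   # MinIO accepts any region name
    endpoint_url: https://minio.example.com  # hypothetical endpoint
    access_key_id: <S3_ACCESS_KEY_ID>
    secret_access_key: <S3_SECRET_ACCESS_KEY>
```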

This module uses `boto3`, so credentials can be supplied by any of the methods
described in the [boto3 configuration guide](https://boto3.readthedocs.io/en/latest/guide/configuration.html#guide-configuration).
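
For example, one of boto3's standard mechanisms is environment variables, which
must be visible to the Synapse process (values are placeholders):

```
# boto3 picks these standard AWS variables up automatically when no
# credentials are given in the provider config.
export AWS_ACCESS_KEY_ID=<S3_ACCESS_KEY_ID>
export AWS_SECRET_ACCESS_KEY=<S3_SECRET_ACCESS_KEY>
export AWS_DEFAULT_REGION=<S3_REGION_NAME>
```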

Regular cleanup job
-------------------

There is additionally a script at `scripts/s3_media_upload` which can be run
as a regular job to upload local media to S3 and then delete it from the local
disk. Used together with the storage provider configuration above, this lets
Synapse pull media from S3 on demand while uploads happen asynchronously.

Once the package is installed, the script should be run somewhat like the
following. We suggest running it inside `tmux` or `screen`, as these steps can
take a long time on larger servers.

`database.yaml` should contain the keys that would be passed to psycopg2 to
connect to your database. They can be found in the `database.args` section of
your `homeserver.yaml`.
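
A sketch of what `database.yaml` might look like, mirroring the `database.args`
keys from `homeserver.yaml` (all values are placeholders):

```yaml
# These keys are passed to psycopg2 to open the connection; copy them
# from the `database.args` section of homeserver.yaml.
user: synapse_user
password: <DATABASE_PASSWORD>
dbname: synapse
host: localhost
port: 5432
```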

More options are available in the command help.

```
> cd s3_media_upload
# cache.db will be created if absent. database.yaml is required to
# contain PG credentials
> ls
cache.db database.yaml
# Update cache from /path/to/media/store looking for files not used
# within 2 months
> s3_media_upload update /path/to/media/store 2m
Syncing files that haven't been accessed since: 2018-10-18 11:06:21.520602
Synced 0 new rows
100%|█████████████████████████████████████████████████████████████| 1074/1074 [00:33<00:00, 25.97files/s]
Updated 0 as deleted

> s3_media_upload upload /path/to/media/store matrix_s3_bucket_name --storage-class STANDARD_IA --delete
# prepare to wait a long time
```
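
To make this a regular job, one option is a nightly cron entry; the schedule,
working directory, and bucket name below are illustrative:

```
# Runs the update/upload cycle at 03:00; the cd ensures cache.db and
# database.yaml are found in the working directory.
0 3 * * * cd /home/synapse/s3_media_upload && s3_media_upload update /path/to/media/store 2m && s3_media_upload upload /path/to/media/store matrix_s3_bucket_name --storage-class STANDARD_IA --delete
```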

Packaging and release
---------------------

For maintainers:

1. Update the `__version__` in setup.py (a sketch follows this list). Commit. Push.
2. Create a release on GitHub for this version.
3. When published, a [GitHub action workflow](https://github.com/matrix-org/synapse-s3-storage-provider/actions/workflows/release.yml) will build the package and upload to [PyPI](https://pypi.org/project/synapse-s3-storage-provider/).
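
A sketch of step 1, assuming `__version__` is assigned as a double-quoted
string in setup.py (the version number is illustrative):

```
sed -i 's/__version__ = ".*"/__version__ = "1.2.51"/' setup.py
git commit -am "Bump version to 1.2.51"
git push
```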

            
