fs-s3fs-forked
==============

:Name: fs-s3fs-forked
:Version: 1.1.3
:Summary: Amazon S3 filesystem for PyFilesystem2, forked from https://github.com/PyFilesystem/s3fs
:Home page: https://gitlab.com/geovisio/infra/s3fs
:Author: Antoine Desbordes
:License: MIT
:Keywords: pyfilesystem, amazon, s3
:Upload time: 2023-06-19 07:58:46
Forked from https://github.com/PyFilesystem/s3fs/
==================================================

-  to be able to set the endpoint in the URL
-  to be able to skip directory creation/removal to improve
   performance

S3FS
====

S3FS is a `PyFilesystem <https://www.pyfilesystem.org/>`__ interface to
Amazon S3 cloud storage.

As a PyFilesystem concrete class,
`S3FS <http://fs-s3fs.readthedocs.io/en/latest/>`__ allows you to work
with S3 in the same way as any other supported filesystem.

Installing
----------

You can install S3FS with pip as follows:

::

   pip install fs-s3fs-forked

Opening an S3FS
---------------

Open an S3FS explicitly using the constructor:

.. code:: python

   from fs_s3fs import S3FS
   s3fs = S3FS('mybucket')

Or with a FS URL:

.. code:: python

   from fs import open_fs
   s3fs = open_fs('s3://mybucket')
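This fork's headline feature is the ability to specify the S3 endpoint directly in the FS URL, which is useful for S3-compatible stores such as MinIO. The README does not document the exact query-parameter name, so the snippet below only sketches how such a URL could be assembled with the standard library; the ``endpoint_url`` parameter name is an assumption.

```python
from urllib.parse import urlencode

# Hypothetical: build an FS URL pointing bucket "mybucket" at a local
# S3-compatible endpoint. The parameter name "endpoint_url" is an
# assumption, not confirmed by this README.
params = {"endpoint_url": "http://localhost:9000"}
fs_url = "s3://mybucket?" + urlencode(params)
print(fs_url)  # s3://mybucket?endpoint_url=http%3A%2F%2Flocalhost%3A9000
```

Note that ``urlencode`` percent-escapes the ``:`` and ``/`` characters in the endpoint, as required inside a query string.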

Downloading Files
-----------------

To *download* files from an S3 bucket, open a file on the S3 filesystem
for reading, then write the data to a file on the local filesystem.
Here’s an example that copies a file ``example.mov`` from S3 to your HD:

.. code:: python

   from fs.tools import copy_file_data
   with s3fs.open('example.mov', 'rb') as remote_file:
       with open('example.mov', 'wb') as local_file:
           copy_file_data(remote_file, local_file)
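Conceptually, ``copy_file_data`` just streams bytes in fixed-size chunks from one file object to another. As a rough stdlib-only sketch of that idea (demonstrated here with in-memory files rather than real S3 objects; the chunk size is arbitrary):

```python
import io

def copy_data_chunked(src, dst, chunk_size=64 * 1024):
    """Stream bytes from src to dst in fixed-size chunks."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)

# Stand-ins for the remote and local files in the example above.
remote = io.BytesIO(b"movie bytes" * 1000)
local = io.BytesIO()
copy_data_chunked(remote, local)
print(local.getvalue() == b"movie bytes" * 1000)  # True
```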

However, it is preferable to use the higher-level functionality in the
``fs.copy`` module. Here's an example:

.. code:: python

   from fs.copy import copy_file
   copy_file(s3fs, 'example.mov', './', 'example.mov')

Uploading Files
---------------

You can *upload* files in the same way. Simply copy a file from a source
filesystem to the S3 filesystem. See `Moving and
Copying <https://docs.pyfilesystem.org/en/latest/guide.html#moving-and-copying>`__
for more information.

ExtraArgs
---------

S3 objects have additional properties beyond those of a traditional
filesystem. These can be set using the ``upload_args`` and
``download_args`` properties, which are passed to the upload and
download methods, as appropriate, for the lifetime of the filesystem
instance.

For example, to set the ``cache-control`` header of all objects uploaded
to a bucket:

.. code:: python

   import fs, fs.mirror
   from fs_s3fs import S3FS

   s3fs = S3FS('example', upload_args={"CacheControl": "max-age=2592000", "ACL": "public-read"})
   fs.mirror.mirror('/path/to/mirror', s3fs)

See `the Boto3
docs <https://boto3.readthedocs.io/en/latest/reference/customizations/s3.html#boto3.s3.transfer.S3Transfer.ALLOWED_UPLOAD_ARGS>`__
for more information.

``acl`` and ``cache_control`` are exposed explicitly for convenience,
and can be used in URLs. It is important to URL-escape the
``cache_control`` value in a URL, as it may contain special characters.
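For example, a ``cache_control`` value of ``max-age=2592000,public`` must have its ``=`` and ``,`` percent-encoded before it is placed in the FS URL. The standard library's ``urllib.parse.quote`` handles this:

```python
from urllib.parse import quote

cache_control = "max-age=2592000,public"
# safe="" ensures "=" and "," are escaped too (quote leaves "/" alone by default).
escaped = quote(cache_control, safe="")
print(escaped)  # max-age%3D2592000%2Cpublic

fs_url = f"s3://example?acl=public-read&cache_control={escaped}"
print(fs_url)
```

The escaped value matches the one used in the FS URL example below.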

.. code:: python

   import fs, fs.mirror

   with fs.open_fs('s3://example?acl=public-read&cache_control=max-age%3D2592000%2Cpublic') as s3fs:
       fs.mirror.mirror('/path/to/mirror', s3fs)

S3 URLs
-------

You can get a public URL to a file on an S3 bucket as follows:

.. code:: python

   movie_url = s3fs.geturl('example.mov')
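The returned URL is derived from the bucket name and object key. As a rough sketch of what such a public URL typically looks like (the virtual-hosted-style, region-less ``s3.amazonaws.com`` host below is an assumption; real buckets often use regional hosts, and this is not necessarily the exact URL ``geturl`` produces):

```python
from urllib.parse import quote

def public_s3_url(bucket, key):
    # Virtual-hosted-style URL; assumes a public bucket and a
    # region-less host, which is a simplification.
    return f"https://{bucket}.s3.amazonaws.com/{quote(key)}"

print(public_s3_url("mybucket", "example.mov"))
# https://mybucket.s3.amazonaws.com/example.mov
```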

Documentation
-------------

-  `PyFilesystem Wiki <https://www.pyfilesystem.org>`__
-  `S3FS Reference <http://fs-s3fs.readthedocs.io/en/latest/>`__
-  `PyFilesystem
   Reference <https://docs.pyfilesystem.org/en/latest/reference/base.html>`__

Releasing
---------

-  Update the version number in ``_version.py``
-  Install build dependencies: ``pip install wheel twine``
-  Install pandoc
-  Run ``make release``

            
