fs-s3fs
=======

:Name: fs-s3fs
:Version: 1.1.1
:Summary: Amazon S3 filesystem for PyFilesystem2
:Home page: https://github.com/PyFilesystem/s3fs
:Author: Will McGugan
:License: MIT
:Keywords: pyfilesystem
:Uploaded: 2019-08-14 10:59:30
S3FS
====

S3FS is a `PyFilesystem <https://www.pyfilesystem.org/>`__ interface to
Amazon S3 cloud storage.

As a PyFilesystem concrete class,
`S3FS <http://fs-s3fs.readthedocs.io/en/latest/>`__ allows you to work
with S3 in the same way as any other supported filesystem.

Installing
----------

You can install S3FS with pip as follows:

::

    pip install fs-s3fs

Opening an S3FS
---------------

Open an S3FS by explicitly using the constructor:

.. code:: python

    from fs_s3fs import S3FS
    s3fs = S3FS('mybucket')

Or with an FS URL:

.. code:: python

      from fs import open_fs
      s3fs = open_fs('s3://mybucket')

Downloading Files
-----------------

To *download* files from an S3 bucket, open a file on the S3 filesystem
for reading, then write the data to a file on the local filesystem.
Here's an example that copies a file ``example.mov`` from S3 to your local drive:

.. code:: python

    from fs.tools import copy_file_data
    with s3fs.open('example.mov', 'rb') as remote_file:
        with open('example.mov', 'wb') as local_file:
            copy_file_data(remote_file, local_file)
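``copy_file_data`` streams the bytes across in fixed-size chunks, so the whole file is never held in memory at once. Conceptually it is equivalent to the following stdlib-only sketch (the ``io.BytesIO`` objects and the 64 KiB chunk size are illustrative stand-ins, not part of the S3FS API):

```python
import io

def copy_file_data_sketch(src_file, dst_file, chunk_size=64 * 1024):
    """Stream bytes from src_file to dst_file in fixed-size chunks."""
    while True:
        chunk = src_file.read(chunk_size)
        if not chunk:  # empty read signals end of file
            break
        dst_file.write(chunk)

# Stand-ins for the remote (S3) and local files:
remote_file = io.BytesIO(b"movie bytes " * 1000)
local_file = io.BytesIO()
copy_file_data_sketch(remote_file, local_file)
assert local_file.getvalue() == b"movie bytes " * 1000
```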

It is generally preferable, however, to use the higher-level functionality
in the ``fs.copy`` module. Here's an example:

.. code:: python

    from fs.copy import copy_file
    copy_file(s3fs, 'example.mov', './', 'example.mov')

Uploading Files
---------------

You can *upload* files in the same way. Simply copy a file from a source
filesystem to the S3 filesystem. See `Moving and
Copying <https://docs.pyfilesystem.org/en/latest/guide.html#moving-and-copying>`__
for more information.

ExtraArgs
---------

S3 objects have additional properties beyond those of a traditional
filesystem. These options can be set using the ``upload_args`` and
``download_args`` constructor arguments, which are passed to the upload
and download methods, as appropriate, for the lifetime of the filesystem
instance.

For example, to set the ``cache-control`` header of all objects uploaded
to a bucket:

.. code:: python

    import fs, fs.mirror
    s3fs = S3FS('example', upload_args={"CacheControl": "max-age=2592000", "ACL": "public-read"})
    fs.mirror.mirror('/path/to/mirror', s3fs)

See `the Boto3
docs <https://boto3.readthedocs.io/en/latest/reference/customizations/s3.html#boto3.s3.transfer.S3Transfer.ALLOWED_UPLOAD_ARGS>`__
for more information.

``acl`` and ``cache_control`` are exposed explicitly for convenience,
and can be used in FS URLs. It is important to URL-escape the
``cache_control`` value in a URL, as it may contain special characters.

.. code:: python

    import fs, fs.mirror
    with fs.open_fs('s3://example?acl=public-read&cache_control=max-age%3D2592000%2Cpublic') as s3fs:
        fs.mirror.mirror('/path/to/mirror', s3fs)
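The escaped ``cache_control`` value in the URL above can be produced with the standard library's ``urllib.parse.quote``; passing ``safe=""`` guarantees that ``=`` and ``,`` are both percent-encoded:

```python
from urllib.parse import quote

cache_control = "max-age=2592000,public"
escaped = quote(cache_control, safe="")
print(escaped)  # max-age%3D2592000%2Cpublic

# Build the FS URL with the escaped value:
fs_url = "s3://example?acl=public-read&cache_control=" + escaped
```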

S3 URLs
-------

You can get a public URL to a file in an S3 bucket as follows:

.. code:: python

    movie_url = s3fs.geturl('example.mov')

Documentation
-------------

-  `PyFilesystem Wiki <https://www.pyfilesystem.org>`__
-  `S3FS Reference <http://fs-s3fs.readthedocs.io/en/latest/>`__
-  `PyFilesystem
   Reference <https://docs.pyfilesystem.org/en/latest/reference/base.html>`__

            
