opt-s4

:Name: opt-s4
:Version: 1.0.1
:Home page: https://github.com/MichaelAquilina/S4
:Summary: Fast and cheap synchronisation of files using Amazon S3
:Upload time: 2023-04-10 02:12:20
:Author: origin author: Michael Aquilina, covered by Minghuan Ma
:Requires Python: >=3.5
:License: GPLv3
:Keywords: aws, s3, backup, sync
==============
S4 = S3 Syncer
==============

|CircleCI| |CodeCov| |Black| |PyPi| |GPLv3|

Fast and cheap synchronisation of files using `Amazon
S3 <https://aws.amazon.com/s3/>`__.

S4 stands for "Simple Storage Solution (S3) Syncer".

The intention of this project is to be an open source alternative to
typical proprietary sync solutions like Dropbox. Because S4 interacts
with S3 directly, you can expect *very* fast upload and download speeds
as well as *very* cheap costs (See `Amazon S3
Pricing <https://aws.amazon.com/s3/pricing/>`__ for an idea of how much
this would cost you). See `Why?`_ for further motivation for this project.

You can also take advantage of other cool features that S3 provides like
`versioning <http://docs.aws.amazon.com/AmazonS3/latest/dev/Versioning.html>`__.
Every time you sync a new version of a file, you will have the
ability to easily roll back to any previous version.

* Requirements_
* Installation_
* Setup_
* Synchronising_
* `Handling Conflicts`_
* `Other Subcommands`_
* `How S4 Works`_
* `Ignoring Files`_
* `Why?`_
* Contributing_

See it in action here:

|ASCIINEMA|

Requirements
------------

S4 requires Python 3.5+.

Installation
------------

The easiest way to install S4 is through pip:

::

    $ pip install s4

You will need ``libmagic`` installed.
It is installed by default on most Linux distributions, but on macOS you need to install it with Homebrew as follows:

::

    brew install libmagic

Setup
-----

Run ``s4 add`` to add a new local folder to sync along with its target S3 URI:

::

    $ s4 add
    local folder: /home/username/myfolder1
    s3 uri: s3://mybucket/folder1
    AWS Access Key ID: AKIAJD53D9GCGKCD
    AWS Secret Access Key:
    region name: eu-west-2
    Provide a name for this entry [myfolder1]:

Synchronising
-------------

Run ``s4 sync`` in the project directory to synchronise the local
folders you specified with the folders in the bucket.

::

    $ s4 sync
    Syncing myfolder1 [/home/username/myfolder1/ <=> s3://mybucket/folder1/]
    Creating foobar.jpg (/home/username/myfolder1/ => s3://mybucket/folder1/)
    Creating boarding-pass.pdf (/home/username/myfolder1/ => s3://mybucket/folder1/)
    Flushing Index to Storage

All files will be automatically synced between the source and target
destinations where possible.

You may synchronise a specific folder using the name you
provided during ``add``.

::

    $ s4 sync myfolder1


If you wish to synchronise your targets continuously, use the ``daemon`` command:

::

    $ s4 daemon myfolder1

NOTE: This command is only supported on machines that can run inotify. This typically means
Linux-based operating systems.


Handling Conflicts
------------------

In the case where S4 cannot decide on a reasonable action by itself, it
will ask you to intervene:

::

    Syncing /home/username/myfolder1/ with s3://mybucket/folder1/

    Conflict for "test.txt". Which version would you like to keep?
       (1) /home/username/myfolder1/test.txt updated at 2017-01-23 12:26:24 (CREATED)
       (2) s3://mybucket/folder1/test.txt updated at 2017-01-23 12:26:30 (CREATED)
       (d) View difference (requires diff command)
       (X) Skip this file

    Choice (default=skip):

If you do not wish to fix the issue, you can simply skip the file for
now.

Other Subcommands
-----------------

Some other subcommands you may find useful:

-  ``s4 targets`` - print existing targets
-  ``s4 edit`` - edit the settings of a target
-  ``s4 rm`` - remove a target
-  ``s4 ls`` - print tracked files and metadata of a target

Use the ``--help`` parameter on each subcommand to get more details.

How S4 Works
------------

S4 keeps track of changes between files with a ``.index`` file at
the root of each folder you are syncing. This contains the keys of each
file being synchronised along with the timestamps found locally and
remotely in JSON format.

This is compressed (currently using gzip) to save space and increase
performance when loading.

If you are curious, you can view the contents of an index file using the
``s4 ls`` subcommand or you can view the file directly using a command
like ``zcat``.
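
For example, to pretty-print the decompressed index (the path below is
illustrative; ``.index`` sits at the root of whichever folder you are syncing):

::

    $ zcat /home/username/myfolder1/.index | python -m json.tool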

    NOTE: Deleting this file will result in that folder being treated as if
    it was never synced before, so make sure you *do not* delete it unless
    you know what you are doing.

All information about your configuration (such as targets and keys) is
stored in a JSON-formatted file at ``~/.config/s4/sync.conf``.
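
As a rough sketch, the layout of ``sync.conf`` looks something like the
following. The field names here are illustrative, not a guaranteed schema;
inspect your own file for the exact keys S4 writes:

::

    {
        "targets": {
            "myfolder1": {
                "local_folder": "/home/username/myfolder1",
                "s3_uri": "s3://mybucket/folder1",
                "aws_access_key_id": "AKIAJD53D9GCGKCD",
                "region_name": "eu-west-2"
            }
        }
    }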

Ignoring Files
--------------

Create a ``.syncignore`` file in the root of the directory being synced
to list patterns of subdirectories and files you wish to ignore. The
``.syncignore`` file uses the exact same pattern that you would expect
in ``.gitignore``. Each line specifies a `GLOB
pattern <https://en.wikipedia.org/wiki/Glob_%28programming%29>`__ to
ignore during sync.
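
For illustration, glob semantics like the above can be sketched with Python's
standard ``fnmatch`` module. This is only a sketch of how glob patterns match
file keys, not necessarily the exact matching code S4 uses internally:

```python
from fnmatch import fnmatch

# Hypothetical contents of a .syncignore file, one pattern per line
patterns = ["*.pyc", "build/*", ".DS_Store"]

def is_ignored(key, patterns=patterns):
    """Return True if the file key matches any ignore pattern."""
    return any(fnmatch(key, pattern) for pattern in patterns)

print(is_ignored("module.pyc"))     # True
print(is_ignored("build/out.bin"))  # True
print(is_ignored("notes.txt"))      # False
```
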

Note that if you add a pattern which matches an item that was previously
synced, that item will be deleted from the target you are syncing with
next time you run S4.

Why?
----

There are a number of open source S3 backup tools out there, but none of them really satisfies the
requirements that this project tries to solve.

Here is a list of open source solutions that I have tried in the past.

* ``s3cmd``: Provides a sync function that works very well for backing up, but stops working correctly
  as soon as there is a second machine you want to sync to S3.

* ``owncloud/nextcloud``: Requires you to set up a server to perform your syncing. In terms of costs on AWS,
  this quickly becomes expensive compared with just using S3. The speed of your uploads and downloads is also
  heavily bottlenecked by the network and hardware performance of your EC2 instance.

* ``seafile``: Suffers from the same problem as owncloud/nextcloud.

* ``duplicity``: A great backup tool, but does not provide a sync solution of any kind.

Contributing
------------

Pull requests are welcome! Make sure you pass all the tests; CircleCI will tell you if you don't ;)

When opening a pull request, please make sure it is from a separate branch in your fork.

.. |CircleCI| image:: https://circleci.com/gh/MichaelAquilina/S4.svg?style=svg
   :target: https://circleci.com/gh/MichaelAquilina/S4

.. |PyPi| image:: https://badge.fury.io/py/s4.svg
   :target: https://badge.fury.io/py/s4

.. |CodeCov| image:: https://codecov.io/gh/MichaelAquilina/s4/branch/master/graph/badge.svg
   :target: https://codecov.io/gh/MichaelAquilina/s4

.. |GPLv3| image:: https://img.shields.io/badge/License-GPL%20v3-blue.svg
   :target: https://www.gnu.org/licenses/gpl-3.0

.. |ASCIINEMA| image:: https://asciinema.org/a/0GiXLN7YT4ki8qouedF0w8Wbk.png
   :target: https://asciinema.org/a/0GiXLN7YT4ki8qouedF0w8Wbk?autoplay=1

.. |Black| image:: https://img.shields.io/badge/code%20style-black-000000.svg
   :target: https://github.com/ambv/black

            
