Name | backup-helper |
Version | 0.2.2 |
home_page | None |
Summary | Helper tool for creating plain-file cold-storage archives including checksum files |
upload_time | 2024-05-09 01:40:46 |
maintainer | None |
docs_url | None |
author | None |
requires_python | >=3.8 |
license | None |
keywords | script, verify, backup, archival, bit-rot |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# BackupHelper
A tool for simplifying the process of archiving multiple directories
onto several different drives. For each directory a checksum file
will be created, which will be verified after the transfer.
You can stage multiple sources and add targets to them.
Once you're done you can start the transfer, which runs all copy
operations in parallel while ensuring that no disk involved in a
transfer is busy with another BackupHelper operation at the same time.
## Quick start
Add a directory as a source for copying/archiving:
```
python -m backup_helper stage ~/Documents --alias docs
Staged: /home/m/Documents
 with alias: docs
```
By default the BackupHelper state will be saved in the file
`backup_status.json` in the current working directory.
Alternatively a custom path can be used by passing
`--status-file /path/to/status.json` to __each__ command.
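For example, to keep the state file somewhere else (where exactly the
flag goes relative to the subcommand may differ; `python -m backup_helper --help`
is authoritative):
```
# flag position shown here is an assumption; see --help
python -m backup_helper stage ~/Documents --alias docs --status-file /path/to/status.json
python -m backup_helper start --status-file /path/to/status.json
```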
Add targets to that source. As `source`, you can use either the
normalized absolute path or, if present, the alias (here: _"docs"_):
```
$ python -m backup_helper add-target docs /media/storage1/docs_2024 --alias storage1
Added target /media/storage1/docs_2024
 with alias: storage1
$ python -m backup_helper add-target docs /media/storage2/docs_2024 --alias storage2
Added target /media/storage2/docs_2024
 with alias: storage2
```
Now you can use the `start` command to run the whole backup process
in sequence:
```
python -m backup_helper start
18:22:01 - INFO - Wrote /home/m/Documents/Documents_bh_2024-02-25T18-22-01.cshd
...
18:22:02 - INFO -

NO MISSING FILES!

NO FAILED CHECKSUMS!

SUMMARY:
  TOTAL FILES: 3
  MATCHES: 3
  FAILED CHECKSUMS: 0
  MISSING: 0
...
18:22:02 - INFO - /home/m/Documents/Documents_bh_2024-02-25T18-22-01.cshd: No missing files and all files matching their hashes
...
18:22:02 - INFO - Successfully completed the following 5 operation(s):
Hashed '/home/m/Documents':
  Hash file: /home/m/Documents/Documents_bh_2024-02-25T18-22-01.cshd
Transfer successful:
  From: /home/m/Documents
  To: /media/storage1/docs_2024
Transfer successful:
  From: /home/m/Documents
  To: /media/storage2/docs_2024
Verified transfer '/media/storage1/docs_2024':
  Checked: 3
  CRC Errors: 0
  Missing: 0
Verified transfer '/media/storage2/docs_2024':
  Checked: 3
  CRC Errors: 0
  Missing: 0
```
Each part of the backup process can be run on its own and on a
specific source/target combination only. For more information
see the [backup process section](#backup-process).
## Backup process
The backup process, which can be run automatically using the
`start` command, is split into the following subprocesses:
1) Hash all source directories. The checksum file will be added to
the directory. A log file of the checksum creation will be
written next to the status JSON file.
2) Transfer all sources to their targets. Only one read __or__ write
operation per disk is allowed at a time (see the sketch after this list).
3) Verify the transfer by comparing the hashes of the generated
checksum file with the hashes of the files on the target.
A log of the verification process will be written to the target.
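The per-disk scheduling in step 2 can be pictured as one lock per
physical device. The following is only an illustrative sketch, not
BackupHelper's actual implementation; `device_id` and
`run_when_disks_free` are hypothetical names:

```python
import os
import threading
from collections import defaultdict

# One lock per physical disk, keyed by the device id from stat().
_disk_locks = defaultdict(threading.Lock)

def device_id(path):
    """Identify the disk a path lives on (hypothetical helper)."""
    return os.stat(path).st_dev

def run_when_disks_free(paths, operation):
    """Run `operation` only while every involved disk is otherwise idle."""
    # Acquire locks in a stable (sorted) order so two concurrent
    # transfers that touch the same disks cannot deadlock.
    locks = [_disk_locks[d] for d in sorted({device_id(p) for p in paths})]
    for lock in locks:
        lock.acquire()
    try:
        operation()
    finally:
        for lock in reversed(locks):
            lock.release()
```

A transfer holds the locks of both its source and target disks, so a
second operation touching either disk waits until the first has finished.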
If a disk still has further transfer operations pending, the
verification step (3) is run last, so that:
1) The more expensive write operations are performed first.
2) The transferred files are less likely to still be in the OS cache
when they are hashed, which means verification actually reads the
data back from the physical disk.
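Conceptually, step 3 re-reads every file on the target and compares
its hash with the recorded one. Here is a minimal sketch of that idea,
assuming SHA-512 and a simple path-to-hash mapping (the real `.cshd`
format and hash algorithm may differ):

```python
import hashlib
from pathlib import Path
from typing import Dict, Tuple

def file_hash(path: Path) -> str:
    """Stream a file through SHA-512 in 1 MiB chunks."""
    h = hashlib.sha512()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_target(recorded: Dict[str, str], target: Path) -> Tuple[int, int]:
    """Check recorded relative-path -> hash entries against `target`.

    Returns (crc_errors, missing), mirroring the counters in the
    summary BackupHelper prints after a transfer.
    """
    crc_errors = missing = 0
    for rel_path, expected in recorded.items():
        candidate = target / rel_path
        if not candidate.is_file():
            missing += 1
        elif file_hash(candidate) != expected:
            crc_errors += 1
    return crc_errors, missing
```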
Each part of the backup process can be run on its own and/or on a
specific source/target combination only. Required previous steps
will be run automatically.
Using the `interactive` command it's possible to add sources/targets
while a transfer is running; otherwise all running operations would
need to complete before further commands can be executed.
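For example (the interactive prompt itself is not shown here):
```
python -m backup_helper interactive
```
From that prompt, the same staging commands can be issued while
operations continue in the background.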
## Commands
See `python -m backup_helper --help`
Raw data
{
"_id": null,
"home_page": null,
"name": "backup-helper",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": "script, verify, backup, archival, bit-rot",
"author": null,
"author_email": "omgitsmoe <60219950+omgitsmoe@users.noreply.github.com>",
"download_url": null,
"platform": null,
"description": "# BackupHelper\r\n\r\nA tool for simplifying the process of archiving multiple directories\r\nonto several different drives. For each directory a checksum file \r\nwill be created, which will be verified after the transfer.\r\n\r\nYou can stage multiple sources and add targets to them.\r\nOnce you're done you can start the transfer, which will run\r\nall copy operations at the same time, while making sure that\r\nall disks in a transfer aren't busy with another BackupHelper operation.\r\n\r\n## Quick start\r\n\r\nAdd a directory as a source for copying/archiving:\r\n```\r\npython -m backup_helper stage ~/Documents --alias docs\r\nStaged: /home/m/Documents\r\n with alias: docs\r\n```\r\n\r\nBy default the BackupHelper state will be saved in the file\r\n`backup_status.json` in the current working directory.\r\nAlternatively a custom path can be used by passing\r\n`--status-file /path/to/status.json` to __each__ command.\r\n\r\nAdd targets to that source. Either the normalized absolute path\r\ncan be used as `source` or the alias (here: _\"docs\"_) if present:\r\n\r\n```\r\n$ python -m backup_helper add-target docs /media/storage1/docs_2024 --alias storage1\r\nAdded target /media/storage1/docs_2024\r\n with alias: storage1\r\n$ python -m backup_helper add-target docs /media/storage1/docs_2024 --alias storage2\r\nAdded target /media/storage1/docs_2024\r\n with alias: storage2\r\n```\r\n\r\nNow you can use the command `start` run the whole backup process\r\nin sequence.\r\n\r\n```\r\npython -m backup_helper start\r\n18:22:01 - INFO - Wrote /home/m/Documents/Documents_bh_2024-02-25T18-22-01.cshd\r\n...\r\n18:22:02 - INFO - \r\n\r\nNO MISSING FILES!\r\n\r\nNO FAILED CHECKSUMS!\r\n\r\nSUMMARY:\r\n TOTAL FILES: 3\r\n MATCHES: 3\r\n FAILED CHECKSUMS: 0\r\n MISSING: 0\r\n\r\n...\r\n\r\n18:22:02 - INFO - /home/m/Documents/Documents_bh_2024-02-25T18-22-01.cshd: No missing files and all files matching their hashes\r\n\r\n...\r\n\r\n18:22:02 - INFO - Successfully completed the following 5 operation(s):\r\nHashed '/home/m/Documents':\r\n Hash file: /home/m/Documents/Documents_bh_2024-02-25T18-22-01.cshd\r\nTransfer successful:\r\n From: /home/m/Documents\r\n To: /media/storage1/docs_2024\r\nTransfer successful:\r\n From: /home/m/Documents\r\n To: /media/storage2/docs_2024\r\nVerified transfer '/media/storage1/docs_2024':\r\n Checked: 3\r\n CRC Errors: 0\r\n Missing: 0\r\nVerified transfer '/media/storage2/docs_2024':\r\n Checked: 3\r\n CRC Errors: 0\r\n Missing: 0\r\n```\r\n\r\nEach part of the backup process can be run on its own and on a\r\nspecific source/target combination only. For more information\r\nsee the [backup process section](#backup-process).\r\n\r\n## Backup process\r\n\r\nThe backup process, which can be run automatically using the\r\n`start` command is split into the subprocesses:\r\n\r\n1) Hash all source directories. The checksum file will be added to\r\n the directory. A log file of creating the checksum file will\r\n be written next to status JSON file.\r\n2) Transfer all sources to their targets. 
Only one read __or__ write\r\n operation per disk will be allowed at the same time.\r\n3) Verify the transfer by comparing the hashes of the generated\r\n checksum file with the hashes of the files on the target.\r\n A log of the verification process will be written to the target.\r\n\r\nThe verification process (3) will be run last if there are more\r\ntransfer operations on a disk, so:\r\n\r\n1) More expensive write operations are performed first.\r\n2) The transferred files are less likely to be in cache when hashing.\r\n\r\nEach part of the backup process can be run on its own and/or on a\r\nspecific source/target combination only. Required previous steps\r\nwill be run automatically.\r\n\r\nUsing the `interactive` command it's possible to add sources/targets\r\nwhile the transfer is running, otherwise all running operations would\r\nneed to be completed before executing further commands.\r\n\r\n## Commands\r\n\r\nSee `python -m backup_helper --help`\r\n",
"bugtrack_url": null,
"license": null,
"summary": "Helper tool for creating plain-file cold-storage archives including checksum files",
"version": "0.2.2",
"project_urls": {
"Bug Tracker": "https://github.com/omgitsmoe/backup_helper/issues",
"Homepage": "https://github.com/omgitsmoe/backup_helper"
},
"split_keywords": [
"script",
" verify",
" backup",
" archival",
" bit-rot"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "894c5ba40cb0f34a1ccde39986bea5548bb0cb86a7ad98c37fdfc4c693e78143",
"md5": "6b6b6a4bd66a7417d2095a06f577adb2",
"sha256": "d222be37df89d66dc43d3d0a5a5f3d3f90d3cbebfc9281831a26f38031d54db4"
},
"downloads": -1,
"filename": "backup_helper-0.2.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "6b6b6a4bd66a7417d2095a06f577adb2",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 22787,
"upload_time": "2024-05-09T01:40:46",
"upload_time_iso_8601": "2024-05-09T01:40:46.199487Z",
"url": "https://files.pythonhosted.org/packages/89/4c/5ba40cb0f34a1ccde39986bea5548bb0cb86a7ad98c37fdfc4c693e78143/backup_helper-0.2.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-05-09 01:40:46",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "omgitsmoe",
"github_project": "backup_helper",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "backup-helper"
}