| Field | Value |
| --- | --- |
| Name | scrapy-s3logstorage |
| Version | 0.1.0 |
| Summary | Upload scrapy logs to S3 |
| Author | Nicholas Mischke |
| Requires Python | >=3.7,<4.0 |
| Upload time | 2023-06-01 04:46:32 |
| docs_url | None |
| Requirements | No requirements were recorded. |
# Scrapy S3 Log Storage
## Description
A Scrapy extension to upload log files to S3.
If you're already exporting your feeds to S3, this extension reuses many of the same settings. After adding this extension to your `EXTENSIONS` setting, just set `LOG_FILE`, `S3_LOG_BUCKET`, and optionally `S3_LOG_ACL`, and you're good to go.
## Installation
You can install scrapy-s3logstorage using pip:
```
pip install scrapy-s3logstorage
```
## Configuration
This extension still requires that a local log file is written. Once Scrapy's engine has stopped, the extension uploads the log file to S3 and optionally deletes the local copy.
Enable the extension by adding it to your `settings.py`:
```
from environs import Env
env = Env()
env.read_env()
EXTENSIONS = {
    'scrapy_s3logstorage.extension.S3LogStorage': 0,
}
LOG_FILE = 'scrapy.log' # Must be a local file
S3_LOG_BUCKET = 'my-bucket' # Bucket name to store logs
S3_LOG_DELETE_LOCAL = True # Delete local log file after upload, defaults to False
# If the AWS CLI is configured and you're using the same credentials, the following settings are optional
AWS_ACCESS_KEY_ID = env("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = env("AWS_SECRET_ACCESS_KEY")
AWS_SESSION_TOKEN = env("AWS_SESSION_TOKEN") # if required
AWS_ENDPOINT_URL = None # or your endpoint URL
# S3_LOG_ACL takes priority over FEED_STORAGE_S3_ACL.
# If S3_LOG_ACL is not set, FEED_STORAGE_S3_ACL will be used.
# Setting one or both of these settings is optional.
S3_LOG_ACL = ''  # e.g. 'private' or 'public-read'
FEED_STORAGE_S3_ACL = ''  # e.g. 'private' or 'public-read'
```
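For reference, the outline below is a minimal, hypothetical sketch of how a signal-driven extension like this can work; it is not the package's actual source. The class name `S3LogUploadSketch` and its internals are illustrative, and it assumes `boto3` is installed. The key idea is to connect an upload handler to Scrapy's `engine_stopped` signal, so the log file is complete before it is sent to S3.
```
# Hypothetical sketch, not the extension's real implementation.
import os

import boto3
from scrapy import signals
from scrapy.exceptions import NotConfigured


class S3LogUploadSketch:
    def __init__(self, crawler):
        settings = crawler.settings
        self.log_file = settings.get("LOG_FILE")
        self.bucket = settings.get("S3_LOG_BUCKET")
        if not self.log_file or not self.bucket:
            raise NotConfigured("LOG_FILE and S3_LOG_BUCKET are required")
        self.delete_local = settings.getbool("S3_LOG_DELETE_LOCAL", False)
        self.acl = settings.get("S3_LOG_ACL") or settings.get("FEED_STORAGE_S3_ACL")
        # boto3 falls back to its default credential chain (e.g. AWS CLI config)
        # when these settings are None.
        self.client = boto3.client(
            "s3",
            aws_access_key_id=settings.get("AWS_ACCESS_KEY_ID"),
            aws_secret_access_key=settings.get("AWS_SECRET_ACCESS_KEY"),
            aws_session_token=settings.get("AWS_SESSION_TOKEN"),
            endpoint_url=settings.get("AWS_ENDPOINT_URL"),
        )
        # Upload only after the engine has fully stopped, so the log is complete.
        crawler.signals.connect(self.upload_log, signal=signals.engine_stopped)

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler)

    def upload_log(self):
        key = os.path.basename(self.log_file)
        extra = {"ACL": self.acl} if self.acl else {}
        self.client.upload_file(self.log_file, self.bucket, key, ExtraArgs=extra)
        if self.delete_local:
            os.remove(self.log_file)
```
With the settings above in place, running a crawl as usual (e.g. `scrapy crawl myspider`) writes `scrapy.log` locally and, once the crawl finishes, uploads it to `my-bucket`.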
Raw data
{
"_id": null,
"home_page": "",
"name": "scrapy-s3logstorage",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.7,<4.0",
"maintainer_email": "",
"keywords": "",
"author": "Nicholas Mischke",
"author_email": "nmischkework@proton.me",
"download_url": "https://files.pythonhosted.org/packages/dc/9a/a8466bab43c3242917a05f7f363839a44fb96a93c5c722ace471bd95e261/scrapy_s3logstorage-0.1.0.tar.gz",
"platform": null,
"description": "\n# Scrapy S3 Log Storage\n\n## Description\nA Scrapy extension to upload log files to S3.\n\nIf you're already exporting your feeds to S3 this extension uses many of the same settings. After adding this extension to your `EXTENSIONS` setting, just set the `LOG_FILE`, `S3_LOG_BUCKET` and an optional `S3_LOG_ACL`settings and you're good to go.\n\n## Installation\nYou can install scrapy-s3logstorage using pip:\n```\n pip install scrapy-s3logstorage\n```\n\n## Configuration\n\nThis extension still requires that a local log file is written. Once scrapy's engine has stopped, the extension will upload the log file to S3 and optionally delete the local file.\n\nEnable the extension by adding it to your `settings.py`:\n```\n from environs import Env\n\n env = Env() \n env.read_env() \n\n EXTENSIONS = {\n 'scrapy_s3logstorage.extension.S3LogStorage': 0,\n }\n\n LOG_FILE = 'scrapy.log' # Must be a local file\n S3_LOG_BUCKET = 'my-bucket' # Bucket name to store logs\n S3_LOG_DELETE_LOCAL = True # Delete local log file after upload, defaults to False\n\n # If AWS CLI is configured, and you're using the same credentials the following settings are optional\n AWS_ACCESS_KEY_ID = env(\"AWS_ACCESS_KEY_ID\")\n AWS_SECRET_ACCESS_KEY = env(\"AWS_SECRET_ACCESS_KEY\")\n AWS_SESSION_TOKEN = env(\"AWS_SESSION_TOKEN\") # if required\n AWS_ENDPOINT_URL = None # or your endpoint URL\n\n # S3_LOG_ACL takes priority over FEED_STORAGE_S3_ACL.\n # If S3_LOG_ACL is not set, FEED_STORAGE_S3_ACL will be used.\n # Setting one or both of these settings is optional.\n\n S3_LOG_ACL = '' # or other S3 ACL\n FEED_STORAGE_S3_ACL = '' # or other S3 ACL\n```",
"bugtrack_url": null,
"license": "",
"summary": "Upload scrapy logs to S3",
"version": "0.1.0",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "ccdcdd351b4d55fd96be8e34029bee5e5a341b25bc0b3cafa6b71274d275c911",
"md5": "4a24a6e4f8e8b26b8f45add720ce0ef5",
"sha256": "6e9ce3aac31d145f5f750f625fa062e5a49ff895349c4257bf7f4e4c8ec55713"
},
"downloads": -1,
"filename": "scrapy_s3logstorage-0.1.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "4a24a6e4f8e8b26b8f45add720ce0ef5",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.7,<4.0",
"size": 4119,
"upload_time": "2023-06-01T04:46:30",
"upload_time_iso_8601": "2023-06-01T04:46:30.619757Z",
"url": "https://files.pythonhosted.org/packages/cc/dc/dd351b4d55fd96be8e34029bee5e5a341b25bc0b3cafa6b71274d275c911/scrapy_s3logstorage-0.1.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "dc9aa8466bab43c3242917a05f7f363839a44fb96a93c5c722ace471bd95e261",
"md5": "30b00f87b467286d6e5b97715b2b8558",
"sha256": "e84294fff20193533e77262f5e87d36dcd0087727fedf2228f021b4a5d9a1d0e"
},
"downloads": -1,
"filename": "scrapy_s3logstorage-0.1.0.tar.gz",
"has_sig": false,
"md5_digest": "30b00f87b467286d6e5b97715b2b8558",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.7,<4.0",
"size": 3317,
"upload_time": "2023-06-01T04:46:32",
"upload_time_iso_8601": "2023-06-01T04:46:32.442895Z",
"url": "https://files.pythonhosted.org/packages/dc/9a/a8466bab43c3242917a05f7f363839a44fb96a93c5c722ace471bd95e261/scrapy_s3logstorage-0.1.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-06-01 04:46:32",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "scrapy-s3logstorage"
}
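The raw data above mirrors what PyPI's JSON API returns for this release. As a quick, hedged example (assuming the `requests` package is installed), the same record can be fetched programmatically and the sdist download verified against the sha256 digest listed above:
```
# Illustrative only: fetch release metadata from PyPI's JSON API and
# check the sdist against its published sha256 digest.
import hashlib

import requests

resp = requests.get("https://pypi.org/pypi/scrapy-s3logstorage/0.1.0/json", timeout=30)
resp.raise_for_status()
meta = resp.json()

for entry in meta["urls"]:
    if entry["packagetype"] == "sdist":
        data = requests.get(entry["url"], timeout=30).content
        digest = hashlib.sha256(data).hexdigest()
        assert digest == entry["digests"]["sha256"], "checksum mismatch"
        print(entry["filename"], "ok:", digest)
```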