# S3 Upload Script
Python script for uploading files to an Amazon S3 bucket. It supports uploading multiple files, specifying an upload directory, and filtering files based on file patterns.
The main focus of this project is usage in CI/CD pipelines, and it is published as a PyPI package at https://pypi.org/project/s3uploader-ci-cd/.
## Requirements
- Python 3.9 or higher
- `boto3` package for AWS S3 communication
- `python-dotenv` package for loading environment variables from a .env file
## Installation
Install the package and its dependencies using pip:
```powershell
pip install s3uploader-ci-cd
```
## Usage
* Set up your AWS credentials as environment variables (or place the same lines in a `.env` file, which `python-dotenv` can load):
```powershell
AWS_ACCESS_KEY_ID=your_access_key_id
AWS_SECRET_ACCESS_KEY=your_secret_access_key
```
Replace `your_access_key_id` and `your_secret_access_key` with your actual AWS access key and secret key.
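Because the package depends on `python-dotenv`, the same two variables can also live in a local `.env` file instead of the shell environment. Below is a minimal sketch of how that could be verified with `boto3`; the exact startup code in the package may differ:
```python
# Hypothetical check: load credentials from a .env file and confirm boto3 sees them.
# python-dotenv copies the values into os.environ, and boto3 reads
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment automatically.
import os

import boto3
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

s3 = boto3.client("s3", region_name=os.getenv("AWS_REGION", "eu-west-1"))
print([b["Name"] for b in s3.list_buckets()["Buckets"]])  # buckets visible to these credentials
```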
* Run the script with the required command-line arguments:
```powershell
python s3upload.py --bucket_name BUCKET_NAME --upload_prefix UPLOAD_PREFIX --source_dir SOURCE_DIR --include INCLUDE_PATTERN
```
Replace BUCKET_NAME with the name of your S3 bucket, UPLOAD_PREFIX with the desired prefix for the uploaded files, SOURCE_DIR with the relative path of the directory containing the files to upload, and INCLUDE_PATTERN with a comma-separated list of file patterns to include in the upload. You can also pass `--region` to override the default AWS region.
```powershell
python s3upload.py --bucket_name my-bucket --region my_region --upload_prefix my-prefix --source_dir my-files --include "*.txt,*.pdf"
```
* This command will upload all .txt and .pdf files from the my-files directory to the my-bucket S3 bucket with the my-prefix prefix.
## GitLab CI/CD pipeline
* .gitlab-ci.yml
```yaml
publish-to-s3:
  stage: publish
  tags:
    - docker-linux
  image: python:3.11
  before_script:
    # Set the environment variables for AWS credentials
    - export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
    - export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
    - pip install --upgrade s3uploader-ci-cd
  script:
    - python3 -m s3uploader --bucket_name bucket_name --source_dir src --include file.json --upload_prefix test/my_path
```
### Command-line Arguments
- `--bucket_name`: The name of the S3 bucket
- `--region`: Region (default: 'eu-west-1')
- `--upload_prefix`: The S3 object key prefix for the uploaded files
- `--upload_prefix_config_file`: The path to the output_path config file containing the upload prefix (default: 'output_path.txt')
- `--source_dir`: The relative path of the directory containing the files for upload (default: 'dist/')
- `--include`: A comma-separated string of file patterns to include in the upload (default: '*')
- `--exclude`: A comma-separated string of file patterns to exclude from the upload (default: '')
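A minimal `argparse`-based sketch of how these options and defaults could be wired up (the `parse_args` shipped in the package may differ in detail):
```python
# Illustrative sketch only: mirrors the documented options and defaults,
# not necessarily the exact parser shipped in the package.
import argparse


def comma_separated_string(string: str) -> list[str]:
    """Split a comma-separated string into a list of non-empty patterns."""
    return [part.strip() for part in string.split(",") if part.strip()]


def parse_args(sys_args: list[str]) -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Upload files to an AWS S3 bucket.")
    parser.add_argument("--bucket_name", required=True, help="Name of the S3 bucket")
    parser.add_argument("--region", default="eu-west-1", help="AWS region")
    parser.add_argument("--upload_prefix", default=None, help="S3 object key prefix for uploaded files")
    parser.add_argument("--upload_prefix_config_file", default="output_path.txt",
                        help="Config file containing the upload prefix")
    parser.add_argument("--source_dir", default="dist/", help="Relative path of the directory to upload")
    parser.add_argument("--include", type=comma_separated_string, default="*",
                        help="Comma-separated file patterns to include")
    parser.add_argument("--exclude", type=comma_separated_string, default="",
                        help="Comma-separated file patterns to exclude")
    return parser.parse_args(sys_args)
```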
### Functions
The script includes the following functions:
- `comma_separated_string(string: str)`: Converts a comma-separated string into a list of strings.
- `parse_args(sys_args)`: Parses command-line arguments for the script.
- `upload_file(bucket_name: str, file_path: str, key: str)`: Uploads a file to an AWS S3 bucket using the regular upload method.
- `get_files_to_upload(source_path: pathlib.Path, include_pattern: list[str])`: Retrieves a list of files in the source directory that match the include patterns.
- `upload_files_to_s3(bucket_name: str, files: list[pathlib.Path], upload_prefix: str, source_path: pathlib.Path)`: Uploads each file in the given list to an AWS S3 bucket.
- `construct_source_path_for_upload(source_dir: str)`: Constructs the absolute path for the source directory of files to be uploaded.
- `construct_upload_prefix(upload_prefix: str, output_path_config: pathlib.Path)`: Constructs the final upload prefix for the files in the AWS S3 bucket.
- `main(bucket_name: str, upload_prefix: str, upload_prefix_config_file: str, source_dir: str, include_pattern: str)`: Main function that uploads files to an AWS S3 bucket.
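To make the flow concrete, here is a simplified sketch of the file-collection and upload steps based on the signatures above (exclude handling, the prefix config file, and error handling are omitted, and the shipped code may differ):
```python
# Simplified sketch of the collect-and-upload flow, based on the documented signatures.
import pathlib

import boto3


def get_files_to_upload(source_path: pathlib.Path, include_pattern: list[str]) -> list[pathlib.Path]:
    """Collect all files under source_path matching any of the include patterns."""
    files: list[pathlib.Path] = []
    for pattern in include_pattern:
        files.extend(p for p in source_path.rglob(pattern) if p.is_file())
    return sorted(set(files))


def upload_file(bucket_name: str, file_path: str, key: str) -> None:
    """Upload a single file to the S3 bucket under the given object key."""
    s3 = boto3.client("s3")
    s3.upload_file(file_path, bucket_name, key)


def upload_files_to_s3(bucket_name: str, files: list[pathlib.Path],
                       upload_prefix: str, source_path: pathlib.Path) -> None:
    """Upload each file, preserving its path relative to source_path under the prefix."""
    for file in files:
        relative = file.relative_to(source_path).as_posix()
        key = f"{upload_prefix}/{relative}" if upload_prefix else relative
        print(f"Uploading {file} -> s3://{bucket_name}/{key}")
        upload_file(bucket_name, str(file), key)
```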
## Development
Documentation about the development setup for this project is available in [CONTRIBUTING](CONTRIBUTING.md).