dataflows-aws

Name: dataflows-aws
Version: 0.2.4
Home page: https://github.com/frictionlessdata/dataflows-aws
Summary: A utility library for working with Table Schema in Python
Upload time: 2023-08-24 10:15:11
Author: Open Knowledge Foundation
License: MIT
Keywords: frictionless data, open data, json schema, table schema, data package, tabular data package
Requirements: no requirements were recorded

# dataflows-aws

[![Travis](https://travis-ci.org/frictionlessdata/dataflows-aws.svg?branch=master)](https://travis-ci.org/frictionlessdata/dataflows-aws)
[![Coveralls](http://img.shields.io/coveralls/frictionlessdata/dataflows-aws.svg?branch=master)](https://coveralls.io/r/frictionlessdata/dataflows-aws?branch=master)

DataFlows processors for working with AWS

## Features

- `dump_to_s3` processor
- `change_acl_on_s3` processor

## Contents

<!--TOC-->

  - [Getting Started](#getting-started)
    - [Installation](#installation)
    - [Examples](#examples)
  - [Documentation](#documentation)
    - [dump_to_s3](#dump_to_s3)
    - [change_acl_on_s3](#change_acl_on_s3)
  - [Contributing](#contributing)
  - [Changelog](#changelog)

<!--TOC-->

## Getting Started

### Installation

The package uses semantic versioning, which means that major versions could include breaking changes. It's recommended to specify a version range for the package in your `setup.py`/`requirements.txt` file, e.g. `dataflows-aws>=0.2,<0.3` (see the pinned install example below).

```bash
$ pip install dataflows-aws
```
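
For example, a pinned install for the current minor series (the range shown is illustrative):

```bash
$ pip install 'dataflows-aws>=0.2,<0.3'
```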

### Examples

These processors have to be used as part of a data flow. For example:

```python
import os

from dataflows import Flow, load
from dataflows_aws import dump_to_s3

bucket = 'my-bucket'  # hypothetical; the bucket must already exist

flow = Flow(
    load('data/data.csv'),
    dump_to_s3(
        bucket=bucket,
        acl='private',
        path='my/datapackage',
        endpoint_url=os.environ['S3_ENDPOINT_URL'],
    ),
)
flow.process()
```

## Documentation

### dump_to_s3

Saves the DataPackage to AWS S3.

#### Parameters

- `bucket` - Name of the bucket where the DataPackage will be stored (it should already exist!)
- `acl` - ACL to apply to the uploaded files. Default is `'public-read'` (see the [boto3 docs](http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.put_object) for more info).
- `path` - Path (key/prefix) to the DataPackage. May contain format-string placeholders filled in from `datapackage.json`, e.g. `my/example/path/{owner}/{name}/{version}` (see the sketch below)
- `content_type` - Content type to use when storing files in S3. Defaults to `text/plain` (the usual S3 default is `binary/octet-stream`, but we prefer `text/plain`).
- `endpoint_url` - API endpoint that allows using S3-compatible services (e.g. `https://ams3.digitaloceanspaces.com`)
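
A minimal sketch of the `path` template in action, assuming a hypothetical bucket name and package metadata; `update_package` is the stock DataFlows processor used here to set the `name` and `version` properties that the template refers to:

```python
from dataflows import Flow, load, update_package
from dataflows_aws import dump_to_s3

Flow(
    load('data/data.csv'),
    # 'name' and 'version' end up in datapackage.json,
    # making them available to the path template below
    update_package(name='my-dataset', version='1.0.0'),
    dump_to_s3(
        bucket='my-bucket',                # hypothetical; must already exist
        acl='public-read',                 # the documented default
        path='datasets/{name}/{version}',  # rendered from datapackage.json
        content_type='text/csv',
    ),
).process()
```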

### change_acl_on_s3

Changes the ACL of the objects in a given bucket under a given path (key prefix); see the sketch after the parameter list.

#### Parameters

- `bucket` - Name of the bucket where the objects are stored
- `acl` - One of `'private'|'public-read'|'public-read-write'|'authenticated-read'|'aws-exec-read'|'bucket-owner-read'|'bucket-owner-full-control'`
- `path` - Path (key/prefix) to the DataPackage.
- `endpoint_url` - API endpoint that allows using S3-compatible services (e.g. `https://ams3.digitaloceanspaces.com`)
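
A minimal sketch of changing the ACL of previously dumped objects, assuming a hypothetical bucket name and prefix:

```python
import os

from dataflows import Flow, load
from dataflows_aws import change_acl_on_s3

Flow(
    load('data/data.csv'),
    change_acl_on_s3(
        bucket='my-bucket',     # hypothetical bucket name
        acl='public-read',
        path='my/datapackage',  # key prefix whose objects get the new ACL
        endpoint_url=os.environ['S3_ENDPOINT_URL'],  # optional, for S3-compatible services
    ),
).process()
```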

## Contributing

The project follows the [Open Knowledge International coding standards](https://github.com/okfn/coding-standards).

The recommended way to get started is to create and activate a project virtual environment.
To install the package and development dependencies into your active environment:

```bash
$ make install
```

To run tests with linting and coverage:

```bash
$ make test
```

For linting, `pylama` (configured in `pylama.ini`) is used. At this stage it's already
installed into your environment and can be used separately, with more fine-grained control,
as described in its documentation: https://pylama.readthedocs.io/en/latest/.

For example, to sort results by error type:

```bash
$ pylama --sort <path>
```

For testing, `tox` (configured in `tox.ini`) is used.
It's already installed into your environment and can be used separately, with more fine-grained control, as described in its documentation: https://testrun.org/tox/latest/.

For example, to check a subset of tests against the Python 3.7 environment with increased verbosity.
All positional arguments and options after `--` will be passed to `py.test`:

```bash
$ tox -e py37 -- -v tests/<path>
```

Under the hood, `tox` uses the `pytest` (configured in `pytest.ini`), `coverage`,
and `mock` packages. These packages are available only in tox environments.

## Changelog

Only breaking and the most important changes are described here. The full changelog and documentation for all released versions can be found in the nicely formatted [commit history](https://github.com/frictionlessdata/dataflows-aws/commits/master).

#### v0.x

- Initial implementation of the processors
