airbyte-source-amazon-sqs

* Name: airbyte-source-amazon-sqs
* Version: 0.1.1
* Summary: Source implementation for Amazon SQS.
* Author: Airbyte
* Upload time: 2024-01-30 13:19:41
# Amazon SQS Source

This is the repository for the Amazon SQS source connector, written in Python.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/amazon-sqs).


**To iterate on this connector, make sure to complete the prerequisites below.**


From this connector directory, create a virtual environment:
```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
```
source .venv/bin/activate
pip install -r requirements.txt
```
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` to declare your dependencies. `requirements.txt` is
used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo, and it delegates to `setup.py`.
If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py`, install using `pip install -r requirements.txt`, and everything
should work as you expect.
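As an illustration of the mechanism above, a connector's `requirements.txt` in this setup is often just an editable install of the connector itself (the contents here are a hypothetical sketch; check the actual file in the repo):

```
# requirements.txt: an editable install of this connector; pip will invoke
# setup.py and resolve the dependencies declared there
-e .
```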

**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/amazon-sqs)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_amazon_sqs/spec.json` file.
Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.
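For orientation, a config for an SQS source might look roughly like the following. This is a hypothetical sketch only: the field names are assumptions, and the authoritative schema is `source_amazon_sqs/spec.json`.

```
{
  "queue_url": "https://sqs.eu-west-1.amazonaws.com/123456789012/example-queue",
  "region": "eu-west-1",
  "access_key": "<your AWS access key>",
  "secret_key": "<your AWS secret key>"
}
```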

**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `source amazon-sqs test creds`
and place them into `secrets/config.json`.

With the credentials in place, run any of the connector commands locally:
```
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```



To build the connector's Docker image, use either of the following.

**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**
```bash
airbyte-ci connectors --name=source-amazon-sqs build
```

An image will be built with the tag `airbyte/source-amazon-sqs:dev`.

**Via `docker build`:**
```bash
docker build -t airbyte/source-amazon-sqs:dev .
```

Then run any of the connector commands as follows:
```
docker run --rm airbyte/source-amazon-sqs:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-sqs:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-sqs:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-amazon-sqs:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
```bash
airbyte-ci connectors --name=source-amazon-sqs test
```

Customize the `acceptance-test-config.yml` file to configure the tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
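As a hedged sketch of such a fixture (the function name and resource are illustrative, not the actual Airbyte harness contract):

```python
# integration_tests/acceptance.py -- hypothetical sketch of a session-scoped
# fixture that creates resources before the acceptance tests run and destroys
# them afterwards.
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Create any resources the tests need here (e.g. a test queue) -- illustrative.
    created = {"queue": "acceptance-test-queue"}
    yield created
    # Teardown: destroy the resources created above.
    created.clear()
```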

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:
* dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
* dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-amazon-sqs test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/amazon-sqs.md`).
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

            
