[![PyPI pyversions](https://img.shields.io/pypi/pyversions/pytest-docker-compose-v2.svg)](https://pypi.python.org/pypi/pytest-docker-compose-v2/)
[![PyPI version](https://img.shields.io/pypi/v/pytest-docker-compose-v2.svg)](https://pypi.python.org/pypi/pytest-docker-compose-v2/)
[![GitHub release](https://img.shields.io/github/release/radusuciu/pytest-docker-compose-v2.svg)](https://github.com/radusuciu/pytest-docker-compose-v2/releases/)

# pytest-docker-compose-v2

**NOTE**: This is a fork of the [`pytest-docker-compose`](https://github.com/pytest-docker-compose/pytest-docker-compose) project, which at the time of writing hasn't been updated in two years. I depend on it, so I'm taking a crack at maintaining a `-v2` fork, but I'm happy to re-integrate any changes back into the original project. The changes so far are fairly basic: [a PR that introduces support for Docker Compose v2](https://github.com/pytest-docker-compose/pytest-docker-compose/pull/98/), some package updates, and a few superficial tweaks to match personal preferences. Although I am making an effort to run the existing test suite, I have not tested this fork extensively.

This package contains a `pytest` plugin for integrating Docker Compose into your automated integration tests.

Given a path to a `docker-compose.yml` file, it will automatically build the project at the start of the test run, bring the containers up before each test starts, and tear them down after each test ends.


## Dependencies

Make sure you have [Docker](https://docs.docker.com/get-docker/) installed.

This plugin is automatically tested against the following software:

- Python 3.9, 3.10, 3.11 and 3.12
- pytest 7

**NOTE**: This plugin is **not** compatible with Python 2.


## Installation

Install the plugin using pip:

```shell
pip install pytest-docker-compose-v2
```

## Usage

For performance reasons, the plugin is not enabled by default, so you must activate it manually in the tests that use it:

```python
pytest_plugins = ["docker_compose"]
```


See [Installing and Using Plugins](https://docs.pytest.org/en/latest/plugins.html#requiring-loading-plugins-in-a-test-module-or-conftest-file) for more information.

To interact with Docker containers in your tests, use the following fixtures. These fixtures tell Docker Compose to start all the services, and can then fetch the associated containers for use in a test:

### `function_scoped_container_getter`

An object that fetches the Docker `compose.container.Container` objects running during the test. Containers are fetched with `function_scoped_container_getter.get('service_name')`. Each of these containers has an extra attribute called `network_info` added to it, which holds a list of `pytest_docker_compose.NetworkInfo` objects.

This information can be used to configure API clients and other objects that will connect to services exposed by the Docker containers in your tests.

`NetworkInfo` is a container with the following fields:

- `container_port`: The port (and usually also protocol name) exposed
    internally to the container.  You can use this value to find the correct
    port for your test, when the container exposes multiple ports.
- `hostname`: The hostname (usually "localhost") to use when connecting to
    the service from the host.
- `host_port`: The port number to use when connecting to the service from
    the host.
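
For instance, a fixture can use `network_info` to pick the correct host port when a service exposes more than one port. This is only a sketch: the service name `my_api_service` and the `8080` container port are hypothetical, and the exact `container_port` format may vary.

```python
# A minimal sketch: "my_api_service" and the 8080 port are hypothetical values.
def get_api_url(container_getter):
    container = container_getter.get("my_api_service")
    # network_info holds one NetworkInfo entry per exposed port
    info = next(
        ni for ni in container.network_info
        if ni.container_port.startswith("8080")
    )
    return f"http://{info.hostname}:{info.host_port}/"
```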

### `docker_project`

The `compose.project.Project` object that the containers are built from.
This fixture is generally only used internally by the plugin.

### Wider scoped fixtures

To use the following fixtures, please read [Use wider scoped fixtures](#use-wider-scoped-fixtures).

- `class_scoped_container_getter`: Similar to `function_scoped_container_getter`, but with class scope.
- `module_scoped_container_getter`: Similar to `function_scoped_container_getter`, but with module scope.
- `session_scoped_container_getter`: Similar to `function_scoped_container_getter`, but with session scope.
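
As a rough sketch (the `my_api_service` name is hypothetical), switching to `module_scoped_container_getter` only changes the fixture scope; the containers are then started once per module and shared by every test in it:

```python
import pytest

pytest_plugins = ["docker_compose"]

@pytest.fixture(scope="module")
def api_endpoint(module_scoped_container_getter):
    # Containers come up once for this module and stay up for all its tests.
    service = module_scoped_container_getter.get("my_api_service").network_info[0]
    return f"http://{service.hostname}:{service.host_port}/"

def test_first(api_endpoint):
    assert api_endpoint.startswith("http://")

def test_second(api_endpoint):
    # Same containers as test_first; see the caveats below about shared state.
    assert api_endpoint
```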

### Waiting for Services to Come Online

The fixtures called `[scope]_scoped_container_getter` will wait until every container is up before handing control over to the test.

However, just because a container is up does not mean that the services running on it are ready to accept incoming requests yet!

If your tests need to wait for a particular condition (for example, to wait for an HTTP health check endpoint to send back a 200 response), make sure that your fixtures account for this.

Here's an example of a fixture called `wait_for_api` that waits for an HTTP service to come online before a test called `test_read_and_write` can run.

```python
import pytest
import requests
from urllib.parse import urljoin
from urllib3.util.retry import Retry
from requests.adapters import HTTPAdapter

pytest_plugins = ["docker_compose"]

# Invoking this fixture: 'function_scoped_container_getter' starts all services
@pytest.fixture(scope="function")
def wait_for_api(function_scoped_container_getter):
    """Wait for the api from my_api_service to become responsive"""
    request_session = requests.Session()
    retries = Retry(total=5,
                    backoff_factor=0.1,
                    status_forcelist=[500, 502, 503, 504])
    request_session.mount('http://', HTTPAdapter(max_retries=retries))

    service = function_scoped_container_getter.get("my_api_service").network_info[0]
    api_url = "http://%s:%s/" % (service.hostname, service.host_port)
    assert request_session.get(api_url)
    return request_session, api_url


def test_read_and_write(wait_for_api):
    """The Api is now verified good to go and tests can interact with it"""
    request_session, api_url = wait_for_api
    data_string = 'some_data'
    request_session.put('%sitems/2?data_string=%s' % (api_url, data_string))
    item = request_session.get(urljoin(api_url, 'items/2')).json()
    assert item['data'] == data_string
    request_session.delete(urljoin(api_url, 'items/2'))
```

## Use wider scoped fixtures

The `function_scoped_container_getter` fixture uses "function" scope, meaning that all of the containers are torn down after each individual test.

This is done so that every test gets to run in a "clean" environment. However, this can potentially make a test suite take a very long time to complete.

There are two ways to make containers persist beyond a single test. The preferred way is to use the fixtures that are explicitly bound to a wider scope. There are three additional fixtures for this purpose: `class_scoped_container_getter`, `module_scoped_container_getter` and `session_scoped_container_getter`. Be careful when using these! There are two main caveats to keep in mind:

1. Manage your scopes correctly: mixing 'module' scope and 'function' scope in a single file will throw an error! This is because the module-scoped fixture spins up the containers and the function-scoped fixture then tries to spin them up again, and Docker Compose does not allow you to spin up containers twice.
2. Clean up your environment after each test. Because the containers are not restarted, they can carry state over from previous tests. Design your tests so that they leave the containers in the same state they started in, or you may run into difficult-to-understand behaviour (see the sketch below).
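
As a sketch of caveat 2 (the service name and endpoint are hypothetical), an autouse fixture can undo a test's writes so the shared containers start every test in the same state:

```python
import pytest
import requests

@pytest.fixture(autouse=True)
def reset_service_state(module_scoped_container_getter):
    service = module_scoped_container_getter.get("my_api_service").network_info[0]
    api_url = f"http://{service.hostname}:{service.host_port}/"
    yield  # the test runs here
    # Hypothetical cleanup: delete the record the tests write to.
    requests.delete(api_url + "items/2")
```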

A second method to make containers persist beyond a single test is to supply the `--use-running-containers` flag to pytest like so:

```shell
pytest --use-running-containers
```

With this flag, `pytest-docker-compose` checks that all containers are running during project creation. If they are not running, a warning is given and they are spun up anyway. They are then used for all the tests and are **not torn down** afterwards.

This mode is best used in combination with the `--docker-compose-no-build` flag, since freshly built containers won't be used anyway:

```shell
pytest --docker-compose-no-build --use-running-containers
```

It is of course possible to add these options to `pytest.ini` or `pyproject.toml`.
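
For example, a `pytest.ini` that always runs against already-running containers could look like this:

```ini
[pytest]
addopts = --docker-compose-no-build --use-running-containers
```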

Note that in this mode the scoping of the fixtures matters less, since the containers persist across all tests. I only recommend this if your network takes excessively long to spin up or tear down; it should be a last resort, and you should probably look into speeding up your network instead.


## Running Integration Tests

Use `pytest` to run your tests as normal:

```shell
pytest
```

By default, this will look for a `docker-compose.yml` file in the current
working directory.  You can specify a different file via the
`--docker-compose` option:

```shell
pytest --docker-compose=/path/to/docker-compose.yml
```

Docker Compose allows specifying multiple compose files, as described [in the docs](https://docs.docker.com/compose/extends). To specify more than one compose file, separate them with a `,`:

```shell
pytest --docker-compose=/path/to/docker-compose.yml,/another/docker-compose.yml,/third/docker-compose.yml
```


### Tip

Alternatively, you can specify this option in your `pytest.ini` file:

```ini
[pytest]
addopts = --docker-compose=/path/to/docker-compose.yml
```
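
The equivalent in `pyproject.toml` goes under pytest's `[tool.pytest.ini_options]` table:

```toml
[tool.pytest.ini_options]
addopts = "--docker-compose=/path/to/docker-compose.yml"
```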


The option will be ignored for tests that do not use this plugin.

See [Configuration Options](https://docs.pytest.org/en/latest/customize.html#adding-default-options) for more information on using configuration
files to modify pytest behavior.

### Remove volumes after tests

There is another configuration option that deletes the containers' volumes after the test run:

```shell
pytest --docker-compose-remove-volumes
```

This option will be ignored if the plugin is not used. Again, this option can also be added to the `pytest.ini` file.

For more examples of how to use this plugin, look at its own test suite! It shows how to configure `pyproject.toml` and how to use the different fixtures to run Docker containers.



            
