pytest-postgresql

Name: pytest-postgresql
Version: 7.0.0
Summary: Postgresql fixtures and fixture factories for Pytest.
Upload time: 2025-02-23 15:01:14
Requires Python: >=3.9
Keywords: tests, pytest, fixture, postgresql
.. image:: https://raw.githubusercontent.com/dbfixtures/pytest-postgresql/master/logo.png
    :width: 100px
    :height: 100px

pytest-postgresql
=================

.. image:: https://img.shields.io/pypi/v/pytest-postgresql.svg
    :target: https://pypi.python.org/pypi/pytest-postgresql/
    :alt: Latest PyPI version

.. image:: https://img.shields.io/pypi/wheel/pytest-postgresql.svg
    :target: https://pypi.python.org/pypi/pytest-postgresql/
    :alt: Wheel Status

.. image:: https://img.shields.io/pypi/pyversions/pytest-postgresql.svg
    :target: https://pypi.python.org/pypi/pytest-postgresql/
    :alt: Supported Python Versions

.. image:: https://img.shields.io/pypi/l/pytest-postgresql.svg
    :target: https://pypi.python.org/pypi/pytest-postgresql/
    :alt: License

What is this?
=============

This is a pytest plugin that enables you to test code relying on a running PostgreSQL database.
It provides fixtures for both the PostgreSQL process and the client connection.

How to use
==========

.. warning::

    Tested on PostgreSQL versions >= 10. See tests for more details.

Install with:

.. code-block:: sh

    pip install pytest-postgresql

You will also need to install ``psycopg``. See `its installation instructions <https://www.psycopg.org/psycopg3/docs/basic/install.html>`_.
Note that this plugin requires ``psycopg`` version 3. It is possible to simultaneously install version 3
and version 2 for libraries that require the latter (see `those instructions <https://www.psycopg.org/docs/install.html>`_).

The plugin contains three fixtures:

* **postgresql** - a client fixture with function scope.
  After each test it closes all leftover connections and drops the test database
  from PostgreSQL, ensuring repeatability.
  This fixture returns an already connected psycopg connection.

* **postgresql_proc** - a session-scoped fixture that starts a PostgreSQL instance
  on its first use and stops it at the end of the test session.
* **postgresql_noproc** - a no-process fixture that connects to an already
  running postgresql instance,
  for example on dockerized test environments, or on CI providing postgresql services.
Simply include one of these fixtures in your test's fixture list.

You can also create additional postgresql client and process fixtures if you need to:


.. code-block:: python

    from pytest_postgresql import factories

    postgresql_my_proc = factories.postgresql_proc(
        port=None, unixsocketdir='/var/run')
    postgresql_my = factories.postgresql('postgresql_my_proc')

.. note::

    Each PostgreSQL process fixture can be configured differently from the others through the fixture factory arguments.

Sample test:

.. code-block:: python

    def test_example_postgres(postgresql):
        """Check main postgresql fixture."""
        cur = postgresql.cursor()
        cur.execute("CREATE TABLE test (id serial PRIMARY KEY, num integer, data varchar);")
        postgresql.commit()
        cur.close()

Pre-populating the database for tests
-------------------------------------

If you want the database fixture to be automatically pre-populated with your schema and data, there are two levels at which you can achieve it:

#. per test in a client fixture, by an intermediary fixture between client and your test (or other fixtures)
#. per session in a process fixture

The process fixture accepts a ``load`` parameter, which accepts these loaders:

* SQL file path - the file will be loaded and executed
* loading function - given either as a string import path or an actual callable.
  Loading functions receive **host**, **port**, **user**, **dbname** and **password** arguments and have to open
  the database connection themselves, or start a session in the ORM of your choice to perform actions with that ORM.
  This way you can trigger ORM-based data manipulations, or even run database migrations programmatically.
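
A minimal loading function could look like the sketch below. The ``load_stories`` name and the table are made up for illustration; the keyword arguments are the ones listed above, and ``psycopg`` is imported inside the function only because the loader runs solely when the fixture invokes it:

.. code-block:: python

    def load_stories(host, port, user, dbname, password):
        """Hypothetical loader: called with the arguments described above."""
        import psycopg  # version 3, which this plugin requires anyway

        # The loader has to open (and close) its own connection.
        with psycopg.connect(
            host=host, port=port, user=user, dbname=dbname, password=password
        ) as conn:
            conn.execute("CREATE TABLE stories (id serial PRIMARY KEY, name varchar);")
            conn.commit()

Such a function could then be passed as ``load=[load_stories]`` or by its import path, e.g. ``"your.module:load_stories"``.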

The process fixture pre-populates the database once per test session (when the process fixture starts),
loading the schema and data into a template database. The client fixture then creates the test database
from that template for each test, which significantly **speeds up the tests**.

.. code-block:: python

    from pathlib import Path
    postgresql_my_proc = factories.postgresql_proc(
        load=[Path("schemafile.sql"), Path("otherschema.sql"), "import.path.to.function", "import.path.to:otherfunction", load_this]
    )

An additional benefit is that test code can safely use a separate database connection, and can safely test its
behaviour with transactions and rollbacks, as the tests and the code work on separate database connections.

Defining pre-populate on command line:

.. code-block:: sh

    pytest --postgresql-populate-template=path.to.loading_function --postgresql-populate-template=path.to.other:loading_function --postgresql-populate-template=path/to/file.sql

Connecting to already existing postgresql database
--------------------------------------------------

Some projects use already running postgresql servers (e.g. on docker instances).
To connect to them, use the ``postgresql_noproc`` fixture.

.. code-block:: python

    postgresql_external = factories.postgresql('postgresql_noproc')

By default the ``postgresql_noproc`` fixture connects to the postgresql instance on port **5432**. Standard configuration options apply to it.

These are the configuration options that work on all levels with the ``postgresql_noproc`` fixture:

Configuration
=============

You can define your settings in three ways: fixture factory argument, command line option, and pytest.ini configuration option.
You can pick whichever you prefer, but remember that these settings are handled in the following order:

    * ``Fixture factory argument``
    * ``Command line option``
    * ``Configuration option in your pytest.ini file``
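
This "first defined wins" precedence can be pictured with a small sketch (conceptual only; this is not the plugin's actual resolution code, and the names are illustrative):

.. code-block:: python

    def resolve_setting(factory_arg, cli_option, ini_option, default):
        """Return the highest-precedence value that was actually set."""
        for value in (factory_arg, cli_option, ini_option):
            if value is not None:
                return value
        return default


    # A port passed to the fixture factory beats one given on the command line.
    port = resolve_setting(8888, 5433, None, "random")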


.. list-table:: Configuration options
   :header-rows: 1

   * - PostgreSQL option
     - Fixture factory argument
     - Command line option
     - pytest.ini option
     - Noop process fixture
     - Default
   * - Path to executable
     - executable
     - --postgresql-exec
     - postgresql_exec
     - -
     - /usr/lib/postgresql/13/bin/pg_ctl
   * - host
     - host
     - --postgresql-host
     - postgresql_host
     - yes
     - 127.0.0.1
   * - port
     - port
     - --postgresql-port
     - postgresql_port
     - yes (5432)
     - random
   * - Port search count
     -
     - --postgresql-port-search-count
     - postgresql_port_search_count
     - -
     - 5
   * - postgresql user
     - user
     - --postgresql-user
     - postgresql_user
     - yes
     - postgres
   * - password
     - password
     - --postgresql-password
     - postgresql_password
     - yes
     -
   * - Starting parameters (extra pg_ctl arguments)
     - startparams
     - --postgresql-startparams
     - postgresql_startparams
     - -
     - -w
   * - Postgres exe extra arguments (passed via pg_ctl's -o argument)
     - postgres_options
     - --postgresql-postgres-options
     - postgresql_postgres_options
     - -
     -
   * - Location for unixsockets
     - unixsocket
     - --postgresql-unixsocketdir
     - postgresql_unixsocketdir
     - -
     - $TMPDIR
   * - Database name which will be created by the fixtures
     - dbname
     - --postgresql-dbname
     - postgresql_dbname
     - yes, however with xdist an index is being added to name, resulting in test0, test1 for each worker.
     - test
   * - Default Schema either in sql files or import path to function that will load it (list of values for each)
     - load
     - --postgresql-load
     - postgresql_load
     - yes
     -
   * - PostgreSQL connection options
     - options
     - --postgresql-options
     - postgresql_options
     - yes
     -
   * - Drop test database on start.

       .. warning::

           Use carefully as it might lead to unexpected results within your test suite.
     -
     - --postgresql-drop-test-database
     -
     - false
     -




Example usage:

* pass it as an argument in your own fixture

    .. code-block:: python

        postgresql_proc = factories.postgresql_proc(
            port=8888)

* use ``--postgresql-port`` command line option when you run your tests

    .. code-block:: sh

        pytest tests --postgresql-port=8888


* specify your port as ``postgresql_port`` in your ``pytest.ini`` file.

    To do so, put a line like the following under the ``[pytest]`` section of your ``pytest.ini``:

    .. code-block:: ini

        [pytest]
        postgresql_port = 8888

Examples
========

Populating database for tests
-----------------------------

With SQLAlchemy
+++++++++++++++

This example shows how to populate the database and create an SQLAlchemy ORM session.

The sample below is a simplified session fixture from
`pyramid_fullauth <https://github.com/fizyk/pyramid_fullauth/>`_ tests:

.. code-block:: python

    import pytest
    import pyramid_basemodel
    import transaction
    from sqlalchemy import create_engine
    from sqlalchemy.orm import scoped_session, sessionmaker
    from sqlalchemy.pool import NullPool
    from zope.sqlalchemy import register


    @pytest.fixture
    def db_session(postgresql):
        """Session for SQLAlchemy."""
        from pyramid_fullauth.models import Base

        connection = f'postgresql+psycopg2://{postgresql.info.user}:@{postgresql.info.host}:{postgresql.info.port}/{postgresql.info.dbname}'

        engine = create_engine(connection, echo=False, poolclass=NullPool)
        pyramid_basemodel.Session = scoped_session(sessionmaker())
        register(pyramid_basemodel.Session)
        pyramid_basemodel.bind_engine(
            engine, pyramid_basemodel.Session, should_create=True, should_drop=True)

        yield pyramid_basemodel.Session

        transaction.commit()
        Base.metadata.drop_all(engine)


    @pytest.fixture
    def user(db_session):
        """Test user fixture."""
        from pyramid_fullauth.models import User
        from tests.tools import DEFAULT_USER

        new_user = User(**DEFAULT_USER)
        db_session.add(new_user)
        transaction.commit()
        return new_user


    def test_remove_last_admin(db_session, user):
        """
        Sample test checks internal login, but shows usage in tests with SQLAlchemy
        """
        user = db_session.merge(user)
        user.is_admin = True
        transaction.commit()
        user = db_session.merge(user)

        with pytest.raises(AttributeError):
            user.is_admin = False

.. note::

    See the original code at `pyramid_fullauth's conftest file <https://github.com/fizyk/pyramid_fullauth/blob/2950e7f4a397b313aaf306d6d1a763ab7d8abf2b/tests/conftest.py#L35>`_.
    Depending on your needs, the code in between can run alembic migrations (in the case of an SQLAlchemy stack) or any other setup code.

Maintaining database state outside of the fixtures
--------------------------------------------------

It is possible, and in fact done by other libraries, to maintain database state outside of the fixtures
with the use of ``pytest-postgresql``'s database-managing functionality.

For this, import ``DatabaseJanitor`` and use its ``init`` and ``drop`` methods:


.. code-block:: python

    import psycopg2
    import pytest
    from pytest_postgresql.janitor import DatabaseJanitor

    @pytest.fixture
    def database(postgresql_proc):
        # variable definition

        janitor = DatabaseJanitor(
            user=postgresql_proc.user,
            host=postgresql_proc.host,
            port=postgresql_proc.port,
            dbname="my_test_database",
            version=postgresql_proc.version,
            password="secret_password",
        )
        janitor.init()
        yield psycopg2.connect(
            dbname="my_test_database",
            user=postgresql_proc.user,
            password="secret_password",
            host=postgresql_proc.host,
            port=postgresql_proc.port,
        )
        janitor.drop()

or use it as a context manager:

.. code-block:: python

    import psycopg2
    import pytest
    from pytest_postgresql.janitor import DatabaseJanitor

    @pytest.fixture
    def database(postgresql_proc):
        # variable definition

        with DatabaseJanitor(
            user=postgresql_proc.user,
            host=postgresql_proc.host,
            port=postgresql_proc.port,
            dbname="my_test_database",
            version=postgresql_proc.version,
            password="secret_password",
        ):
            yield psycopg2.connect(
                dbname="my_test_database",
                user=postgresql_proc.user,
                password="secret_password",
                host=postgresql_proc.host,
                port=postgresql_proc.port,
            )

.. note::

    DatabaseJanitor manages the state of the database, but you'll have to create
    the connection to use in test code yourself.

    You can optionally pass in a recognized postgresql ISOLATION_LEVEL for
    additional control.

.. note::

    See DatabaseJanitor usage in `warehouse's test code <https://github.com/pypa/warehouse/blob/5d15bfe/tests/conftest.py#L127>`_.

Connecting to Postgresql (in a docker)
--------------------------------------

To connect to a postgresql running in docker and run tests against it, use the noproc fixtures.

.. code-block:: sh

    docker run --name some-postgres -e POSTGRES_PASSWORD=mysecretpassword -d postgres

This starts postgresql in a docker container; using a locally installed postgresql is not much different.

In your tests, make sure that all of them use the **postgresql_noproc** fixture, like this:

.. code-block:: python

    from pytest_postgresql import factories


    postgresql_in_docker = factories.postgresql_noproc()
    postgresql = factories.postgresql("postgresql_in_docker", dbname="test")


    def test_postgres_docker(postgresql):
        """Run test."""
        cur = postgresql.cursor()
        cur.execute("CREATE TABLE test (id serial PRIMARY KEY, num integer, data varchar);")
        postgresql.commit()
        cur.close()

And run tests:

.. code-block:: sh

    pytest --postgresql-host=172.17.0.2 --postgresql-password=mysecretpassword

Basic database state for all tests
----------------------------------

If you've got several tests that require common initialisation, you can define a ``load`` and pass it to
your custom postgresql process fixture:

.. code-block:: python

    import psycopg2
    from psycopg2.extensions import connection

    from pytest_postgresql import factories


    def load_database(**kwargs):
        db_connection: connection = psycopg2.connect(**kwargs)
        with db_connection.cursor() as cur:
            cur.execute("CREATE TABLE stories (id serial PRIMARY KEY, name varchar);")
            cur.execute(
                "INSERT INTO stories (name) VALUES"
                "('Silmarillion'), ('Star Wars'), ('The Expanse'), ('Battlestar Galactica')"
            )
            db_connection.commit()

    postgresql_proc = factories.postgresql_proc(
        load=[load_database],
    )

    postgresql = factories.postgresql(
        "postgresql_proc",
    )

The way this works is that the process fixture populates the template database,
which the client fixture then automatically uses to create a test database from scratch for each test.
Fast, clean, and with no dangling transactions that could be accidentally rolled back.

The same approach works with the noproc fixture when connecting to an already running postgresql instance,
whether it runs in docker, remotely, or locally.
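
For example, attaching a loader to the noproc fixture might look like the sketch below (``load_ratings`` is a hypothetical loading function with the signature described earlier; per the configuration table, the ``load`` option also applies to the noproc fixture):

.. code-block:: python

    from pytest_postgresql import factories


    def load_ratings(host, port, user, dbname, password):
        """Hypothetical loader; it must open its own connection."""
        import psycopg

        with psycopg.connect(
            host=host, port=port, user=user, dbname=dbname, password=password
        ) as conn:
            conn.execute("CREATE TABLE ratings (id serial PRIMARY KEY, score integer);")
            conn.commit()


    # Same load= argument, but against an externally managed server.
    postgresql_external_proc = factories.postgresql_noproc(load=[load_ratings])
    postgresql_external = factories.postgresql("postgresql_external_proc")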

Using SQLAlchemy to initialise basic database state
+++++++++++++++++++++++++++++++++++++++++++++++++++

How to use SQLAlchemy for common initialisation:

.. code-block:: python

    import pytest
    from sqlalchemy import create_engine
    from sqlalchemy.orm import scoped_session, sessionmaker

    from pytest_postgresql import factories

    # ``Base`` is your application's SQLAlchemy declarative base.


    def load_database(**kwargs):
        connection = f"postgresql+psycopg2://{kwargs['user']}:@{kwargs['host']}:{kwargs['port']}/{kwargs['dbname']}"
        engine = create_engine(connection)
        Base.metadata.create_all(engine)
        session = scoped_session(sessionmaker(bind=engine))
        # add things to session
        session.commit()

    postgresql_proc = factories.postgresql_proc(load=[load_database])

    postgresql = factories.postgresql("postgresql_proc")

    @pytest.fixture
    def dbsession(postgresql):
        connection = f'postgresql+psycopg2://{postgresql.info.user}:@{postgresql.info.host}:{postgresql.info.port}/{postgresql.info.dbname}'
        engine = create_engine(connection)

        session = scoped_session(sessionmaker(bind=engine))

        yield session
        # ``Base.metadata.drop_all(engine)`` specifically does not work here, and is not needed.
        # Without ``session.close()`` the tests still run, but a warning/error appears at the end.
        session.close()


Release
=======

Install pipenv and the ``--dev`` dependencies first, then run:

.. code-block:: sh

    pipenv run tbump [NEW_VERSION]

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "pytest-postgresql",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.9",
    "maintainer_email": null,
    "keywords": "tests, pytest, fixture, postgresql",
    "author": null,
    "author_email": "Grzegorz \u015aliwi\u0144ski <fizyk+pypi@fizyk.dev>",
    "download_url": "https://files.pythonhosted.org/packages/cf/87/aa071042c17b4d81def7ec2f90a929f881f2aa82ae0987f21a589aa20f84/pytest_postgresql-7.0.0.tar.gz",
    "platform": null,
    "description": ".. image:: https://raw.githubusercontent.com/dbfixtures/pytest-postgresql/master/logo.png\n    :width: 100px\n    :height: 100px\n\npytest-postgresql\n=================\n\n.. image:: https://img.shields.io/pypi/v/pytest-postgresql.svg\n    :target: https://pypi.python.org/pypi/pytest-postgresql/\n    :alt: Latest PyPI version\n\n.. image:: https://img.shields.io/pypi/wheel/pytest-postgresql.svg\n    :target: https://pypi.python.org/pypi/pytest-postgresql/\n    :alt: Wheel Status\n\n.. image:: https://img.shields.io/pypi/pyversions/pytest-postgresql.svg\n    :target: https://pypi.python.org/pypi/pytest-postgresql/\n    :alt: Supported Python Versions\n\n.. image:: https://img.shields.io/pypi/l/pytest-postgresql.svg\n    :target: https://pypi.python.org/pypi/pytest-postgresql/\n    :alt: License\n\nWhat is this?\n=============\n\nThis is a pytest plugin, that enables you to test your code that relies on a running PostgreSQL Database.\nIt allows you to specify fixtures for PostgreSQL process and client.\n\nHow to use\n==========\n\n.. warning::\n\n    Tested on PostgreSQL versions >= 10. See tests for more details.\n\nInstall with:\n\n.. code-block:: sh\n\n    pip install pytest-postgresql\n\nYou will also need to install ``psycopg``. See `its installation instructions <https://www.psycopg.org/psycopg3/docs/basic/install.html>`_.\nNote that this plugin requires ``psycopg`` version 3. 
It is possible to simultaneously install version 3\nand version 2 for libraries that require the latter (see `those instructions <https://www.psycopg.org/docs/install.html>`_).\n\nPlugin contains three fixtures:\n\n* **postgresql** - it's a client fixture that has functional scope.\n  After each test it ends all leftover connections, and drops test database\n  from PostgreSQL ensuring repeatability.\n  This fixture returns already connected psycopg connection.\n\n* **postgresql_proc** - session scoped fixture, that starts PostgreSQL instance\n  at it's first use and stops at the end of the tests.\n* **postgresql_noproc** - a noprocess fixture, that's connecting to already\n  running postgresql instance.\n  For example on dockerized test environments, or CI providing postgresql services\n\nSimply include one of these fixtures into your tests fixture list.\n\nYou can also create additional postgresql client and process fixtures if you'd need to:\n\n\n.. code-block:: python\n\n    from pytest_postgresql import factories\n\n    postgresql_my_proc = factories.postgresql_proc(\n        port=None, unixsocketdir='/var/run')\n    postgresql_my = factories.postgresql('postgresql_my_proc')\n\n.. note::\n\n    Each PostgreSQL process fixture can be configured in a different way than the others through the fixture factory arguments.\n\nSample test\n\n.. code-block:: python\n\n    def test_example_postgres(postgresql):\n        \"\"\"Check main postgresql fixture.\"\"\"\n        cur = postgresql.cursor()\n        cur.execute(\"CREATE TABLE test (id serial PRIMARY KEY, num integer, data varchar);\")\n        postgresql.commit()\n        cur.close()\n\nPre-populating the database for tests\n-------------------------------------\n\nIf you want the database fixture to be automatically pre-populated with your schema and data, there are two lewels you can achieve it:\n\n#. per test in a client fixture, by an intermediary fixture between client and your test (or other fixtures)\n#. 
per session in a process fixture\n\nThe process fixture accepts a load parameter, which accepts these loaders:\n\n* sql file path - which will load and execute sql files\n* loading functions - either by string import path, actual callable.\n  Loading functions will receive **host**, **port**, **user**, **dbname** and **password** arguments and will have to perform\n  connection to the database inside. Or start session in the ORM of your choice to perform actions with given ORM.\n  This way, you'd be able to trigger ORM based data manipulations, or even trigger database migrations programmatically.\n\nThe process fixture pre-populates the database once per test session (at the start of the process fixture),\nand loads the schema and data into the template database. Client fixture then creates test database out of the template database each test,\nwhich significantly **speeds up the tests**.\n\n.. code-block:: python\n\n    from pathlib import Path\n    postgresql_my_proc = factories.postgresql_proc(\n        load=[Path(\"schemafile.sql\"), Path(\"otherschema.sql\"), \"import.path.to.function\", \"import.path.to:otherfunction\", load_this]\n    )\n\nAdditional benefit, is that test code might safely use separate database connection, and can safely test it's behaviour with transactions and rollbacks,\nas tests and code will work on separate database connections.\n\nDefining pre-populate on command line:\n\n.. code-block:: sh\n\n    pytest --postgresql-populate-template=path.to.loading_function --postgresql-populate-template=path.to.other:loading_function --postgresql-populate-template=path/to/file.sql\n\nConnecting to already existing postgresql database\n--------------------------------------------------\n\nSome projects are using already running postgresql servers (ie on docker instances).\nIn order to connect to them, one would be using the ``postgresql_noproc`` fixture.\n\n.. 
code-block:: python\n\n    postgresql_external = factories.postgresql('postgresql_noproc')\n\nBy default the  ``postgresql_noproc`` fixture would connect to postgresql instance using **5432** port. Standard configuration options apply to it.\n\nThese are the configuration options that are working on all levels with the ``postgresql_noproc`` fixture:\n\nConfiguration\n=============\n\nYou can define your settings in three ways, it's fixture factory argument, command line option and pytest.ini configuration option.\nYou can pick which you prefer, but remember that these settings are handled in the following order:\n\n    * ``Fixture factory argument``\n    * ``Command line option``\n    * ``Configuration option in your pytest.ini file``\n\n\n.. list-table:: Configuration options\n   :header-rows: 1\n\n   * - PostgreSQL option\n     - Fixture factory argument\n     - Command line option\n     - pytest.ini option\n     - Noop process fixture\n     - Default\n   * - Path to executable\n     - executable\n     - --postgresql-exec\n     - postgresql_exec\n     - -\n     - /usr/lib/postgresql/13/bin/pg_ctl\n   * - host\n     - host\n     - --postgresql-host\n     - postgresql_host\n     - yes\n     - 127.0.0.1\n   * - port\n     - port\n     - --postgresql-port\n     - postgresql_port\n     - yes (5432)\n     - random\n   * - Port search count\n     -\n     - --postgresql-port-search-count\n     - postgresql_port_search_count\n     - -\n     - 5\n   * - postgresql user\n     - user\n     - --postgresql-user\n     - postgresql_user\n     - yes\n     - postgres\n   * - password\n     - password\n     - --postgresql-password\n     - postgresql_password\n     - yes\n     -\n   * - Starting parameters (extra pg_ctl arguments)\n     - startparams\n     - --postgresql-startparams\n     - postgresql_startparams\n     - -\n     - -w\n   * - Postgres exe extra arguments (passed via pg_ctl's -o argument)\n     - postgres_options\n     - --postgresql-postgres-options\n     - 
postgresql_postgres_options\n     - -\n     -\n   * - Location for unixsockets\n     - unixsocket\n     - --postgresql-unixsocketdir\n     - postgresql_unixsocketdir\n     - -\n     - $TMPDIR\n   * - Database name which will be created by the fixtures\n     - dbname\n     - --postgresql-dbname\n     - postgresql_dbname\n     - yes, however with xdist an index is being added to name, resulting in test0, test1 for each worker.\n     - test\n   * - Default Schema either in sql files or import path to function that will load it (list of values for each)\n     - load\n     - --postgresql-load\n     - postgresql_load\n     - yes\n     -\n   * - PostgreSQL connection options\n     - options\n     - --postgresql-options\n     - postgresql_options\n     - yes\n     -\n   * - Drop test database on start.\n\n       .. warning::\n\n           Use carefully as it might lead to unexpected results within your test suite.\n     -\n     - --postgresql-drop-test-database\n     -\n     - false\n     -\n\n\n\n\nExample usage:\n\n* pass it as an argument in your own fixture\n\n    .. code-block:: python\n\n        postgresql_proc = factories.postgresql_proc(\n            port=8888)\n\n* use ``--postgresql-port`` command line option when you run your tests\n\n    .. code-block:: sh\n\n        py.test tests --postgresql-port=8888\n\n\n* specify your port as ``postgresql_port`` in your ``pytest.ini`` file.\n\n    To do so, put a line like the following under the ``[pytest]`` section of your ``pytest.ini``:\n\n    .. code-block:: ini\n\n        [pytest]\n        postgresql_port = 8888\n\nExamples\n========\n\nPopulating database for tests\n-----------------------------\n\nWith SQLAlchemy\n+++++++++++++++\n\nThis example shows how to populate database and create an SQLAlchemy's ORM connection:\n\nSample below is simplified session fixture from\n`pyramid_fullauth <https://github.com/fizyk/pyramid_fullauth/>`_ tests:\n\n.. 
code-block:: python\n\n    from sqlalchemy import create_engine\n    from sqlalchemy.orm import scoped_session, sessionmaker\n    from sqlalchemy.pool import NullPool\n    from zope.sqlalchemy import register\n\n\n    @pytest.fixture\n    def db_session(postgresql):\n        \"\"\"Session for SQLAlchemy.\"\"\"\n        from pyramid_fullauth.models import Base\n\n        connection = f'postgresql+psycopg2://{postgresql.info.user}:@{postgresql.info.host}:{postgresql.info.port}/{postgresql.info.dbname}'\n\n        engine = create_engine(connection, echo=False, poolclass=NullPool)\n        pyramid_basemodel.Session = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))\n        pyramid_basemodel.bind_engine(\n            engine, pyramid_basemodel.Session, should_create=True, should_drop=True)\n\n        yield pyramid_basemodel.Session\n\n        transaction.commit()\n        Base.metadata.drop_all(engine)\n\n\n    @pytest.fixture\n    def user(db_session):\n        \"\"\"Test user fixture.\"\"\"\n        from pyramid_fullauth.models import User\n        from tests.tools import DEFAULT_USER\n\n        new_user = User(**DEFAULT_USER)\n        db_session.add(new_user)\n        transaction.commit()\n        return new_user\n\n\n    def test_remove_last_admin(db_session, user):\n        \"\"\"\n        Sample test checks internal login, but shows usage in tests with SQLAlchemy\n        \"\"\"\n        user = db_session.merge(user)\n        user.is_admin = True\n        transaction.commit()\n        user = db_session.merge(user)\n\n        with pytest.raises(AttributeError):\n            user.is_admin = False\n.. 
note::\n\n    See the original code at `pyramid_fullauth's conftest file <https://github.com/fizyk/pyramid_fullauth/blob/2950e7f4a397b313aaf306d6d1a763ab7d8abf2b/tests/conftest.py#L35>`_.\n    Depending on your needs, that in between code can fire alembic migrations in case of sqlalchemy stack or any other code\n\nMaintaining database state outside of the fixtures\n--------------------------------------------------\n\nIt is possible and appears it's used in other libraries for tests,\nto maintain database state with the use of the ``pytest-postgresql`` database\nmanaging functionality:\n\nFor this import DatabaseJanitor and use its init and drop methods:\n\n\n.. code-block:: python\n\n    import pytest\n    from pytest_postgresql.janitor import DatabaseJanitor\n\n    @pytest.fixture\n    def database(postgresql_proc):\n        # variable definition\n\n        janitor = DatabaseJanitor(\n            user=postgresql_proc.user,\n            host=postgresql_proc.host,\n            proc=postgresql_proc.port,\n            testdb=\"my_test_database\",\n            version=postgresql_proc.version,\n            password=\"secret_password\",\n        )\n        janitor.init()\n        yield psycopg2.connect(\n            dbname=\"my_test_database\",\n            user=postgresql_proc.user,\n            password=\"secret_password\",\n            host=postgresql_proc.host,\n            port=postgresql_proc.port,\n        )\n        janitor.drop()\n\nor use it as a context manager:\n\n.. 
code-block:: python\n\n    import pytest\n    from pytest_postgresql.janitor import DatabaseJanitor\n\n    @pytest.fixture\n    def database(postgresql_proc):\n        # variable definition\n\n        with DatabaseJanitor(\n            user=postgresql_proc.user,\n            host=postgresql_proc.host,\n            port=postgresql_proc.port,\n            dbname=\"my_test_database\",\n            version=postgresql_proc.version,\n            password=\"secret_password\",\n        ):\n            yield psycopg2.connect(\n                dbname=\"my_test_database\",\n                user=postgresql_proc.user,\n                password=\"secret_password\",\n                host=postgresql_proc.host,\n                port=postgresql_proc.port,\n            )\n\n.. note::\n\n    DatabaseJanitor manages the state of the database, but you'll have to create\n    connection to use in test code yourself.\n\n    You can optionally pass in a recognized postgresql ISOLATION_LEVEL for\n    additional control.\n\n.. note::\n\n    See DatabaseJanitor usage in python's warehouse test code https://github.com/pypa/warehouse/blob/5d15bfe/tests/conftest.py#L127\n\nConnecting to Postgresql (in a docker)\n--------------------------------------\n\nTo connect to a docker run postgresql and run test on it, use noproc fixtures.\n\n.. code-block:: sh\n\n    docker run --name some-postgres -e POSTGRES_PASSWORD=mysecretpassword -d postgres\n\nThis will start postgresql in a docker container, however using a postgresql installed locally is not much different.\n\nIn tests, make sure that all your tests are using **postgresql_noproc** fixture like that:\n\n.. 
.. code-block:: python

    from pytest_postgresql import factories


    postgresql_in_docker = factories.postgresql_noproc()
    postgresql = factories.postgresql("postgresql_in_docker", dbname="test")


    def test_postgres_docker(postgresql):
        """Run test."""
        cur = postgresql.cursor()
        cur.execute("CREATE TABLE test (id serial PRIMARY KEY, num integer, data varchar);")
        postgresql.commit()
        cur.close()

And run the tests:

.. code-block:: sh

    pytest --postgresql-host=172.17.0.2 --postgresql-password=mysecretpassword

Basic database state for all tests
----------------------------------

If you've got several tests that require common initialisation, you can define a ``load`` callable and pass it to
your custom postgresql process fixture:

.. code-block:: python

    import psycopg2
    from psycopg2.extensions import connection

    from pytest_postgresql import factories


    def load_database(**kwargs):
        db_connection: connection = psycopg2.connect(**kwargs)
        with db_connection.cursor() as cur:
            cur.execute("CREATE TABLE stories (id serial PRIMARY KEY, name varchar);")
            cur.execute(
                "INSERT INTO stories (name) VALUES"
                "('Silmarillion'), ('Star Wars'), ('The Expanse'), ('Battlestar Galactica')"
            )
            db_connection.commit()


    postgresql_proc = factories.postgresql_proc(
        load=[load_database],
    )

    postgresql = factories.postgresql(
        "postgresql_proc",
    )

The way this works is that the process fixture populates a template database, which the client
fixture then uses automatically to create a fresh test database for each test.
Fast, clean, and with no dangling transactions that could be accidentally rolled back.

The same approach works with the noproc fixture, connecting to an already running PostgreSQL
instance, whether it runs in a Docker container, remotely, or locally.

Using SQLAlchemy to initialise basic database state
+++++++++++++++++++++++++++++++++++++++++++++++++++

How to use SQLAlchemy for common initialisation:

.. code-block:: python

    import pytest
    from sqlalchemy import create_engine
    from sqlalchemy.orm import scoped_session, sessionmaker

    from pytest_postgresql import factories

    from myapp.models import Base  # your application's declarative base


    def load_database(**kwargs):
        connection = f"postgresql+psycopg2://{kwargs['user']}:@{kwargs['host']}:{kwargs['port']}/{kwargs['dbname']}"
        engine = create_engine(connection)
        Base.metadata.create_all(engine)
        session = scoped_session(sessionmaker(bind=engine))
        # add things to session
        session.commit()


    postgresql_proc = factories.postgresql_proc(load=[load_database])

    postgresql = factories.postgresql("postgresql_proc")


    @pytest.fixture
    def dbsession(postgresql):
        connection = f"postgresql+psycopg2://{postgresql.info.user}:@{postgresql.info.host}:{postgresql.info.port}/{postgresql.info.dbname}"
        engine = create_engine(connection)

        session = scoped_session(sessionmaker(bind=engine))

        yield session
        # 'Base.metadata.drop_all(engine)' specifically does not work here, and it is
        # not needed: each test gets a fresh database from the template. Leaving out
        # session.close() still lets the tests run, but produces a warning/error at
        # the end of the test run.
        session.close()


Release
=======

Install ``pipenv`` and the ``--dev`` dependencies first, then run:

.. code-block:: sh

    pipenv run tbump [NEW_VERSION]
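As an aside to the ``load`` examples above: a ``load`` callable receives the connection
parameters (``user``, ``host``, ``port``, ``dbname``) as keyword arguments, and the URL
construction can be checked without a live server. A minimal sketch, where ``make_url`` is a
hypothetical helper (not part of pytest-postgresql) mirroring the f-string used above:

.. code-block:: python

    def make_url(**kwargs):
        # Assemble an SQLAlchemy URL from the keyword arguments pytest-postgresql
        # passes to load callables; the password is empty for a freshly started
        # local process (trust authentication).
        return (
            f"postgresql+psycopg2://{kwargs['user']}:@"
            f"{kwargs['host']}:{kwargs['port']}/{kwargs['dbname']}"
        )


    print(make_url(user="postgres", host="localhost", port=5433, dbname="tests"))
    # postgresql+psycopg2://postgres:@localhost:5433/tests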
    "bugtrack_url": null,
    "license": null,
    "summary": "Postgresql fixtures and fixture factories for Pytest.",
    "version": "7.0.0",
    "project_urls": {
        "Bug Tracker": "https://github.com/dbfixtures/pytest-postgresql/issues",
        "Changelog": "https://github.com/dbfixtures/pytest-postgresql/blob/v7.0.0/CHANGES.rst",
        "Source": "https://github.com/dbfixtures/pytest-postgresql"
    },
    "split_keywords": [
        "tests",
        " pytest",
        " fixture",
        " postgresql"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "10c3e5099d7049e49a6beeaf6cac511ecd50013284200644285d761eae71d2ea",
                "md5": "d6e910ad1e7ebf413ac483e449ce4b16",
                "sha256": "aaebadbf060b85cca7755fdf5ed7aa2929edd0f842c9b7f56ffe1e58e0d3b749"
            },
            "downloads": -1,
            "filename": "pytest_postgresql-7.0.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "d6e910ad1e7ebf413ac483e449ce4b16",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.9",
            "size": 41266,
            "upload_time": "2025-02-23T15:01:12",
            "upload_time_iso_8601": "2025-02-23T15:01:12.237861Z",
            "url": "https://files.pythonhosted.org/packages/10/c3/e5099d7049e49a6beeaf6cac511ecd50013284200644285d761eae71d2ea/pytest_postgresql-7.0.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "cf87aa071042c17b4d81def7ec2f90a929f881f2aa82ae0987f21a589aa20f84",
                "md5": "3a6555a675c5b9854814aeb513255e13",
                "sha256": "cf0016cee5d9ac06f50cfc61bb0597d1fa90780d77c4453bc18e4930cae04aaa"
            },
            "downloads": -1,
            "filename": "pytest_postgresql-7.0.0.tar.gz",
            "has_sig": false,
            "md5_digest": "3a6555a675c5b9854814aeb513255e13",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.9",
            "size": 49877,
            "upload_time": "2025-02-23T15:01:14",
            "upload_time_iso_8601": "2025-02-23T15:01:14.678239Z",
            "url": "https://files.pythonhosted.org/packages/cf/87/aa071042c17b4d81def7ec2f90a929f881f2aa82ae0987f21a589aa20f84/pytest_postgresql-7.0.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-02-23 15:01:14",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "dbfixtures",
    "github_project": "pytest-postgresql",
    "travis_ci": false,
    "coveralls": true,
    "github_actions": true,
    "lcname": "pytest-postgresql"
}
        
Elapsed time: 7.90216s