pytest-performancetotal

- Version: 0.2.7
- Home page: https://github.com/tzurp/pytest_performancetotal
- Summary: A performance plugin for pytest
- Upload time: 2025-01-07 05:04:15
- Maintainer: Tzur Paldi
- Author: Tzur Paldi
- License: MIT
- Keywords: pytest, plugin, performance, playwright

# pytest-performancetotal

With this plugin for [pytest](https://github.com/pytest-dev/pytest), which complements the [playwright-pytest](https://github.com/microsoft/playwright-pytest) integration, you can seamlessly incorporate performance analysis into your test flows. It’s designed to work with UI interactions, API calls, or a combination of both, providing a straightforward method for measuring response times and pinpointing potential performance issues within your application. By leveraging this data, you can make strategic decisions to optimize and enhance your application’s performance. For insights into the original concept and additional details, refer to the [article](https://www.linkedin.com/pulse/elevating-your-playwright-tests-plugin-tzur-paldi-phd) on the Node.js version of this plugin.

## Installation

```no-highlight
$ pip install pytest-performancetotal
```

## Usage

To use pytest-performancetotal, simply add the **performancetotal** fixture to your test method. This makes the performance functionality available in the test; no further setup is required. Here's an example:

```python
import time

import pytest


@pytest.mark.parametrize("iteration", [1, 2, 3])
def test_features(performancetotal, iteration):
    # Time the "feature1" step
    performancetotal.sample_start("feature1")
    time.sleep(1)
    performancetotal.sample_end("feature1")

    # Time the "feature2" step
    performancetotal.sample_start("feature2")
    time.sleep(0.5)
    performancetotal.sample_end("feature2")
```

Because the test is parametrized, each step is sampled once per iteration, giving the final analysis multiple measurements to work with. You can also get the immediate time span for a single sample inside a test:

```python
feature1_timespan = performancetotal.get_sample_time("feature1")
```
Be aware that `get_sample_time` returns a single measurement with no statistical analysis.
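
For example, a single measurement can double as a quick budget check inside the test itself. This is a minimal sketch: the threshold is an arbitrary placeholder, and you should confirm the unit of the returned value in your own environment before relying on it:

```python
def test_feature1_budget(performancetotal):
    performancetotal.sample_start("feature1")
    # ... exercise the feature under test ...
    performancetotal.sample_end("feature1")

    # A single, unanalyzed measurement for this one sample
    feature1_timespan = performancetotal.get_sample_time("feature1")

    # MAX_ALLOWED is a hypothetical budget; pick a value and unit that
    # match what get_sample_time reports in your setup
    MAX_ALLOWED = 2
    assert feature1_timespan < MAX_ALLOWED
```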


To use type hints follow this example:

```python
from pytest_performancetotal.performance import Performance

def test_features(performancetotal: Performance, iteration):
    # ... your test code here
```

## Options

### performance-noappend

To disable appending new results to the existing results file and start fresh on every run, use:

```no-highlight
pytest --performance-noappend
```

> **⚠️ Caution:**
>
> This action will delete all your performance data permanently. Ensure that you have a backup before proceeding.

### performance-drop-failed-results

To drop results for failed tests, use:

```no-highlight
pytest --performance-drop-failed-results
```

### performance-recent-days

To set the number of days to consider for performance analysis, use:

`pytest --performance-recent-days=7`, or use a fraction of a day, for example: `pytest --performance-recent-days=0.5`
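
If you want any of these options applied on every run, pytest's standard `addopts` setting can carry them. A sketch, combining two of the flags above in plain `pytest.ini` syntax:

```no-highlight
[pytest]
addopts = --performance-recent-days=7 --performance-drop-failed-results
```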


### Configuring Logging in pytest.ini

This plugin uses the native Python logging module to provide detailed logs during its execution. To ensure you can see these logs during testing, proper configuration is needed. The following instructions will guide you on how to configure pytest to output log messages to the console. This setup is particularly useful for debugging and tracking the behavior of your code.

Steps to configure logging:

1. Create or update `pytest.ini`: if you do not already have a `pytest.ini` file, create one in the root directory of your project; if you do, open it for editing.
2. Add a configuration such as the following to `pytest.ini`:

```no-highlight
[pytest]
log_cli = true
log_cli_level = DEBUG
log_cli_format = %(asctime)s - %(name)s - %(levelname)s - %(message)s
log_cli_date_format = %Y-%m-%d %H:%M:%S
```

__log_cli__: Enables logging to the console.

__log_cli_level__: Sets the logging level. You can choose from DEBUG, INFO, WARNING, ERROR, or CRITICAL.

__log_cli_format__: Defines the format of the log messages.

__log_cli_date_format__: Specifies the date format used in log messages.
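
The same settings work from `pyproject.toml` if that is where you keep your pytest configuration; this is standard pytest behavior, not specific to this plugin:

```no-highlight
[tool.pytest.ini_options]
log_cli = true
log_cli_level = "INFO"
log_cli_format = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
log_cli_date_format = "%Y-%m-%d %H:%M:%S"
```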

## Getting the results

A new directory named `performance_results` is created inside your project's root folder. Once all the tests have completed, two files are created inside the `performance_results` directory: `results.json` and `results.csv`. The analyzed data includes average time, standard error of the mean (SEM), number of samples, minimum value, maximum value, earliest time, and latest time. The results table is also printed to the terminal log.
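
Because the results are plain JSON and CSV, they are easy to post-process. A minimal sketch that simply loads and prints the JSON output; the structure of the entries depends on the plugin's output, so inspect your own `results.json` for the exact keys:

```python
import json
from pathlib import Path

# Path follows the default location described above
results_path = Path("performance_results") / "results.json"

data = json.loads(results_path.read_text())
print(data)
```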

## Support

For any questions or suggestions contact me at: [tzur.paldi@outlook.com](mailto:tzur.paldi@outlook.com?subject=pytest-performancetotal%20Support)

            
