| Field | Value |
| --- | --- |
| Name | sdk-pytest-checkmate |
| Version | 0.1.0a4 |
| Summary | Pytest plugin for enriched HTML test reporting with steps, soft assertions, and data attachments |
| Upload time | 2025-08-19 21:23:08 |
| Requires Python | <3.14, >=3.10 |
| License | MIT (full text in the raw data below) |
| Keywords | plugin, pytest, report, soft-assertions, step, test, test-reporting, test-steps |
# sdk-pytest-checkmate
[PyPI package](https://badge.fury.io/py/sdk-pytest-checkmate) · [Project on PyPI](https://pypi.org/project/sdk-pytest-checkmate/) · [MIT License](https://opensource.org/licenses/MIT)
A pytest plugin for enriched HTML test reporting with support for test steps, soft assertions, and data attachments.
## Features
- 🔄 **Test Steps**: Record named steps with timing using context managers
- 🔍 **Soft Assertions**: Non-fatal assertions that collect failures without stopping tests
- 📊 **Data Attachments**: Attach arbitrary data objects to test timelines
- 📋 **Epic/Story Grouping**: Organize tests with `@pytest.mark.epic` and `@pytest.mark.story`
- 📈 **Interactive HTML Reports**: Rich reports with filtering, collapsible sections, and inline data
- ⚡ **Async Support**: Works with both sync and async test functions
## Installation
```bash
pip install sdk-pytest-checkmate
```
The plugin activates automatically once installed; no additional configuration is needed.
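One quick way to confirm the plugin is active is to look for its options in pytest's help output:

```bash
# The reporting options should appear once the plugin is installed
pytest --help | grep "report-html"
```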
## Quick Start
```python
from sdk_pytest_checkmate import step, soft_assert, add_data_report
import pytest

@pytest.mark.epic("User Management")
@pytest.mark.story("User Registration")
@pytest.mark.title("Complete user registration flow")
def test_user_registration():
    with step("Prepare test data"):
        user_data = {
            "username": "testuser",
            "email": "test@example.com",
            "password": "secure123"
        }
        add_data_report(user_data, "Registration Data")

    with step("Submit registration form"):
        response = submit_registration(user_data)
        soft_assert(response.status_code == 201, "Registration should return 201")
        soft_assert("id" in response.json(), "Response should contain user ID")

    with step("Verify user activation"):
        user = get_user_by_email(user_data["email"])
        soft_assert(user.is_active, "User should be activated")
        add_data_report(user.__dict__, "Created User")

    # Final critical assertion
    assert user.email == user_data["email"], "Email should match"
```
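The same pattern carries over to async tests. The sketch below assumes an async test runner such as `pytest-asyncio` is installed (the `asyncio` marker comes from that plugin, not this one), and `fetch_user` is a hypothetical helper:

```python
import pytest
from sdk_pytest_checkmate import step, soft_assert, add_data_report

@pytest.mark.asyncio  # provided by pytest-asyncio (an assumption, not part of this plugin)
@pytest.mark.epic("User Management")
@pytest.mark.story("User Lookup")
async def test_user_lookup():
    async with step("Fetch user"):
        user = await fetch_user("test@example.com")  # hypothetical async helper
        add_data_report({"email": "test@example.com"}, "Lookup Input")
    soft_assert(user is not None, "User should exist")
```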
## Generating Reports
Generate an HTML report with your test results:
```bash
# Basic HTML report
pytest --report-html=report.html
# Custom title and JSON export
pytest --report-html=results.html --report-title="My Test Suite" --report-json=results.json
# Run specific tests with reporting
pytest tests/test_integration.py --report-html=integration-report.html
```
## API Reference
### step(name: str)
Context manager for recording test steps with timing information.
```python
# Synchronous usage
with step("Login user"):
    login_response = authenticate(username, password)

# Asynchronous usage
async with step("Fetch user data"):
    user_data = await api_client.get_user(user_id)
```
**Parameters:**
- `name`: Human-readable step name that appears in the HTML report
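Because `step` takes a plain string, names can be built dynamically, e.g. one timed entry per input when a single test exercises several cases (a sketch; `process` is a hypothetical function under test):

```python
from sdk_pytest_checkmate import step, soft_assert

def test_batch_processing():
    for item in ["alpha", "beta", "gamma"]:
        # Each iteration records its own named, timed step in the report
        with step(f"Process item: {item}"):
            result = process(item)  # hypothetical function under test
            soft_assert(result is not None, f"{item} should produce a result")
```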
### soft_assert(condition: bool, message: str | None = None) -> bool
Record a non-fatal assertion that doesn't immediately fail the test.
```python
# Basic soft assertion
soft_assert(response.status_code == 200, "API should return 200")
# Multiple soft assertions - test continues even if some fail
soft_assert(user.name is not None, "User should have a name")
soft_assert(user.email.endswith("@company.com"), "Email should be company domain")
soft_assert(len(user.permissions) > 0, "User should have permissions")
# Test will be marked as failed if any soft assertions failed
```
**Parameters:**
- `condition`: Boolean expression to evaluate
- `message`: Optional descriptive message (defaults to "Soft assertion")
**Returns:** The boolean value of `condition`
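Since the return value mirrors `condition`, it can gate follow-up checks that only make sense when an earlier one held, while the earlier failure is still recorded (a sketch; `api_client` and `response` stand in for a real HTTP client and response, as in the examples above):

```python
from sdk_pytest_checkmate import soft_assert

def test_items_endpoint():
    response = api_client.get("/api/items")  # hypothetical client
    # Only parse the body if the status check passed; a failed status
    # check is still collected and shown in the report either way.
    if soft_assert(response.status_code == 200, "API should return 200"):
        soft_assert("items" in response.json(), "Body should contain items")
```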
### add_data_report(data: Any, label: str) -> DataRecord
Attach arbitrary data to the test timeline for inspection in HTML reports.
```python
# Attach configuration
config = {"endpoint": "/api/users", "timeout": 30}
add_data_report(config, "API Config")

# Attach response data
add_data_report({
    "status_code": response.status_code,
    "headers": dict(response.headers),
    "body": response.json()
}, "API Response")

# Attach custom objects
add_data_report(user_profile.__dict__, "User Profile")
```
**Parameters:**
- `data`: Any Python object (dict/list will be pretty-printed as JSON)
- `label`: Short label shown in the report UI
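For objects that aren't already dicts or lists, converting them first keeps the report readable; a dataclass, for example, can be flattened with `dataclasses.asdict` (a self-contained sketch with a hypothetical `Order` type):

```python
from dataclasses import dataclass, asdict
from sdk_pytest_checkmate import add_data_report

@dataclass
class Order:  # hypothetical domain object for illustration
    order_id: int
    total: float

def test_order_snapshot():
    order = Order(order_id=42, total=19.99)
    # asdict() yields a plain dict, so it pretty-prints as JSON in the report
    add_data_report(asdict(order), "Order Snapshot")
```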
## Markers
### @pytest.mark.title(name)
Set a human-friendly test title that appears in reports instead of the function name.
```python
@pytest.mark.title("User can successfully log in with valid credentials")
def test_login_valid_user():
    pass
```
### @pytest.mark.epic(name)
Group tests under an epic for better organization in reports.
```python
@pytest.mark.epic("User Authentication")
def test_login():
pass
@pytest.mark.epic("User Authentication")
def test_logout():
pass
```
### @pytest.mark.story(name)
Group tests under a story (nested under epics) for hierarchical organization.
```python
@pytest.mark.epic("E-commerce")
@pytest.mark.story("Shopping Cart")
def test_add_item_to_cart():
pass
@pytest.mark.epic("E-commerce")
@pytest.mark.story("Shopping Cart")
def test_remove_item_from_cart():
pass
```
## Command Line Options
- `--report-html[=PATH]`: Generate HTML report (default: `report.html`)
- `--report-title=TITLE`: Set custom title for HTML report (default: "Pytest report")
- `--report-json=PATH`: Export results as JSON file
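To produce a report on every run without retyping the flags, they can be placed in pytest's standard `addopts` setting (plain pytest configuration, nothing plugin-specific):

```ini
# pytest.ini
[pytest]
addopts = --report-html=report.html --report-title="Nightly Suite"
```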
## HTML Report Features
The generated HTML reports include:
- **Interactive Filtering**: Filter tests by status (PASSED, FAILED, SKIPPED, etc.)
- **Collapsible Sections**: Expand/collapse epic and story groups
- **Timeline View**: See steps, soft assertions, and data in chronological order
- **Data Inspection**: Click to expand attached data objects
- **Error Details**: Full tracebacks and soft assertion summaries
- **Performance Metrics**: Step timing and total test duration
## Requirements
- Python 3.10+
- pytest 8.4.1+
## License
MIT License - see [LICENSE](LICENSE) file for details.
## Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "sdk-pytest-checkmate",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<3.14,>=3.10",
    "maintainer_email": null,
    "keywords": "plugin, pytest, report, soft-assertions, step, test, test-reporting, test-steps",
    "author": null,
    "author_email": "o73k51i <o73k51i@proton.me>",
    "download_url": "https://files.pythonhosted.org/packages/02/00/e70e0fad205214aa8733f83ebf2a33b9af5271f5070f8d52d849326b398c/sdk_pytest_checkmate-0.1.0a4.tar.gz",
    "platform": null,
"description": "# sdk-pytest-checkmate\n\n[](https://badge.fury.io/py/sdk-pytest-checkmate)\n[](https://pypi.org/project/sdk-pytest-checkmate/)\n[](https://opensource.org/licenses/MIT)\n\nA pytest plugin for enriched HTML test reporting with support for test steps, soft assertions, and data attachments.\n\n## Features\n\n- \ud83d\udd04 **Test Steps**: Record named steps with timing using context managers\n- \ud83d\udd0d **Soft Assertions**: Non-fatal assertions that collect failures without stopping tests\n- \ud83d\udcca **Data Attachments**: Attach arbitrary data objects to test timelines\n- \ud83d\udccb **Epic/Story Grouping**: Organize tests with `@pytest.mark.epic` and `@pytest.mark.story`\n- \ud83d\udcc8 **Interactive HTML Reports**: Rich reports with filtering, collapsible sections, and inline data\n- \u26a1 **Async Support**: Works with both sync and async test functions\n\n## Installation\n\n```bash\npip install sdk-pytest-checkmate\n```\n\nThe plugin automatically activates when installed - no additional configuration needed.\n\n## Quick Start\n\n```python\nfrom sdk_pytest_checkmate import step, soft_assert, add_data_report\nimport pytest\n\n@pytest.mark.epic(\"User Management\")\n@pytest.mark.story(\"User Registration\")\n@pytest.mark.title(\"Complete user registration flow\")\ndef test_user_registration():\n with step(\"Prepare test data\"):\n user_data = {\n \"username\": \"testuser\",\n \"email\": \"test@example.com\",\n \"password\": \"secure123\"\n }\n add_data_report(user_data, \"Registration Data\")\n \n with step(\"Submit registration form\"):\n response = submit_registration(user_data)\n soft_assert(response.status_code == 201, \"Registration should return 201\")\n soft_assert(\"id\" in response.json(), \"Response should contain user ID\")\n \n with step(\"Verify user activation\"):\n user = get_user_by_email(user_data[\"email\"])\n soft_assert(user.is_active, \"User should be activated\")\n add_data_report(user.__dict__, \"Created User\")\n \n # Final critical assertion\n assert user.email == user_data[\"email\"], \"Email should match\"\n```\n\n## Generating Reports\n\nGenerate an HTML report with your test results:\n\n```bash\n# Basic HTML report\npytest --report-html=report.html\n\n# Custom title and JSON export\npytest --report-html=results.html --report-title=\"My Test Suite\" --report-json=results.json\n\n# Run specific tests with reporting\npytest tests/test_integration.py --report-html=integration-report.html\n```\n\n## API Reference\n\n### step(name: str)\n\nContext manager for recording test steps with timing information.\n\n```python\n# Synchronous usage\nwith step(\"Login user\"):\n login_response = authenticate(username, password)\n\n# Asynchronous usage \nasync with step(\"Fetch user data\"):\n user_data = await api_client.get_user(user_id)\n```\n\n**Parameters:**\n- `name`: Human-readable step name that appears in the HTML report\n\n### soft_assert(condition: bool, message: str = None) -> bool\n\nRecord a non-fatal assertion that doesn't immediately fail the test.\n\n```python\n# Basic soft assertion\nsoft_assert(response.status_code == 200, \"API should return 200\")\n\n# Multiple soft assertions - test continues even if some fail\nsoft_assert(user.name is not None, \"User should have a name\")\nsoft_assert(user.email.endswith(\"@company.com\"), \"Email should be company domain\")\nsoft_assert(len(user.permissions) > 0, \"User should have permissions\")\n\n# Test will be marked as failed if any soft assertions failed\n```\n\n**Parameters:**\n- 
`condition`: Boolean expression to evaluate\n- `message`: Optional descriptive message (defaults to \"Soft assertion\")\n\n**Returns:** The boolean value of `condition`\n\n### add_data_report(data: Any, label: str) -> DataRecord\n\nAttach arbitrary data to the test timeline for inspection in HTML reports.\n\n```python\n# Attach configuration\nconfig = {\"endpoint\": \"/api/users\", \"timeout\": 30}\nadd_data_report(config, \"API Config\")\n\n# Attach response data\nadd_data_report({\n \"status_code\": response.status_code,\n \"headers\": dict(response.headers),\n \"body\": response.json()\n}, \"API Response\")\n\n# Attach custom objects\nadd_data_report(user_profile.__dict__, \"User Profile\")\n```\n\n**Parameters:**\n- `data`: Any Python object (dict/list will be pretty-printed as JSON)\n- `label`: Short label shown in the report UI\n\n## Markers\n\n### @pytest.mark.title(name)\n\nSet a human-friendly test title that appears in reports instead of the function name.\n\n```python\n@pytest.mark.title(\"User can successfully log in with valid credentials\")\ndef test_login_valid_user():\n pass\n```\n\n### @pytest.mark.epic(name)\n\nGroup tests under an epic for better organization in reports.\n\n```python\n@pytest.mark.epic(\"User Authentication\")\ndef test_login():\n pass\n\n@pytest.mark.epic(\"User Authentication\") \ndef test_logout():\n pass\n```\n\n### @pytest.mark.story(name)\n\nGroup tests under a story (nested under epics) for hierarchical organization.\n\n```python\n@pytest.mark.epic(\"E-commerce\")\n@pytest.mark.story(\"Shopping Cart\")\ndef test_add_item_to_cart():\n pass\n\n@pytest.mark.epic(\"E-commerce\")\n@pytest.mark.story(\"Shopping Cart\")\ndef test_remove_item_from_cart():\n pass\n```\n\n## Command Line Options\n\n- `--report-html[=PATH]`: Generate HTML report (default: `report.html`)\n- `--report-title=TITLE`: Set custom title for HTML report (default: \"Pytest report\")\n- `--report-json=PATH`: Export results as JSON file\n\n## HTML Report Features\n\nThe generated HTML reports include:\n\n- **Interactive Filtering**: Filter tests by status (PASSED, FAILED, SKIPPED, etc.)\n- **Collapsible Sections**: Expand/collapse epic and story groups\n- **Timeline View**: See steps, soft assertions, and data in chronological order\n- **Data Inspection**: Click to expand attached data objects\n- **Error Details**: Full tracebacks and soft assertion summaries\n- **Performance Metrics**: Step timing and total test duration\n\n## Requirements\n\n- Python 3.10+\n- pytest 8.4.1+\n\n## License\n\nMIT License - see [LICENSE](LICENSE) file for details.\n",
"bugtrack_url": null,
"license": "MIT License Copyright (c) 2025 o73k51i Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.",
"summary": "Pytest plugin for enriched HTML test reporting with steps, soft assertions, and data attachments",
"version": "0.1.0a4",
"project_urls": {
"Bug Tracker": "https://github.com/o73k51i/sdk-pytest-checkmate/issues",
"Documentation": "https://github.com/o73k51i/sdk-pytest-checkmate#readme",
"Homepage": "https://github.com/o73k51i/sdk-pytest-checkmate",
"Repository": "https://github.com/o73k51i/sdk-pytest-checkmate.git"
},
"split_keywords": [
"plugin",
" pytest",
" report",
" soft-assertions",
" step",
" test",
" test-reporting",
" test-steps"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "5661f7fa92b68507b108be15f52feb0444006fc4d6066851095095cb9124c6ea",
"md5": "257532ecc9551e0adfc8f0e5eac5638c",
"sha256": "5ba41292dd6ed61bde2bf216b3e78b42ccc9b781bdb1169c0a3ba155273ca326"
},
"downloads": -1,
"filename": "sdk_pytest_checkmate-0.1.0a4-py3-none-any.whl",
"has_sig": false,
"md5_digest": "257532ecc9551e0adfc8f0e5eac5638c",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.14,>=3.10",
"size": 18810,
"upload_time": "2025-08-19T21:23:06",
"upload_time_iso_8601": "2025-08-19T21:23:06.959241Z",
"url": "https://files.pythonhosted.org/packages/56/61/f7fa92b68507b108be15f52feb0444006fc4d6066851095095cb9124c6ea/sdk_pytest_checkmate-0.1.0a4-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "0200e70e0fad205214aa8733f83ebf2a33b9af5271f5070f8d52d849326b398c",
"md5": "a53b2fb0fcc97a5c38c4b31889b46fc1",
"sha256": "757920a6a8d80b758ad8e383b25d9a54db77966fea48debf3c2ba2d1571beb5e"
},
"downloads": -1,
"filename": "sdk_pytest_checkmate-0.1.0a4.tar.gz",
"has_sig": false,
"md5_digest": "a53b2fb0fcc97a5c38c4b31889b46fc1",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.14,>=3.10",
"size": 32653,
"upload_time": "2025-08-19T21:23:08",
"upload_time_iso_8601": "2025-08-19T21:23:08.442022Z",
"url": "https://files.pythonhosted.org/packages/02/00/e70e0fad205214aa8733f83ebf2a33b9af5271f5070f8d52d849326b398c/sdk_pytest_checkmate-0.1.0a4.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-19 21:23:08",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "o73k51i",
"github_project": "sdk-pytest-checkmate",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "sdk-pytest-checkmate"
}