# Create a base client model for RESTful libraries
## Badges and quicklinks
### Open project for development in container
[![Open in Remote - Containers](https://img.shields.io/static/v1?label=Remote%20-%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/Leikaab/crudclient)
### Status of project
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/crudclient)
![PyPI - Version](https://img.shields.io/pypi/v/crudclient)
### Status of testing
[![Test DevContainer Build](https://github.com/Leikaab/crudclient/actions/workflows/test_devcontainer.yml/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/test_devcontainer.yml)
[![Run Tests](https://github.com/Leikaab/crudclient/actions/workflows/tests.yml/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/tests.yml)
[![Publish to PyPI](https://github.com/Leikaab/crudclient/actions/workflows/publish.yml/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/publish.yml)
[![Dependabot Updates](https://github.com/Leikaab/crudclient/actions/workflows/dependabot/dependabot-updates/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/dependabot/dependabot-updates)
## Project Overview
This project is a foundational framework designed to streamline the creation of API clients and CRUD (Create, Read, Update, Delete) classes. It is intended to be a reusable package that can be implemented in various projects, providing a consistent and DRY (Don't Repeat Yourself) approach to coding.
<details>
<summary>Project details</summary>
### Key Features
- **Authentication**: The framework provides a robust system for handling API authentication, simplifying the integration of secure and efficient authentication methods into your projects.
- **API Construction**: This package offers tools to easily define and structure your API interactions, allowing for dynamic and flexible API client creation that adapts to the specific needs of different projects.
- **CRUD Class Mixins**: The project includes reusable class mixins for building CRUD operations. These mixins promote code reusability and consistency across multiple projects, ensuring that common functionality is implemented efficiently and with minimal duplication.
This framework is designed to help developers focus on implementing the specific logic required for their APIs while relying on a solid, reusable foundation for the underlying infrastructure. It supports a modular approach, making it easier to manage and scale API client development across various projects.
</details>
## Usage
### Response models
<details>
<summary>Setting up a custom response model</summary>
```python
from pydantic import BaseModel
from crudclient.models import ApiResponse
class User(BaseModel):
    id: int
    name: str
    email: str
    is_admin: bool


class UsersResponse(ApiResponse[User]):
    pass
```
</details>
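To illustrate the typed-envelope pattern without pulling in crudclient, here is a minimal stand-in built from the standard library. `Envelope` and the sample data are hypothetical; crudclient's actual `ApiResponse` is a pydantic model with additional behavior, not a plain dataclass.

```python
from dataclasses import dataclass, field
from typing import Generic, List, TypeVar

T = TypeVar("T")


@dataclass
class Envelope(Generic[T]):
    # Stand-in for ApiResponse[T]: wraps the returned items under `.data`
    data: List[T] = field(default_factory=list)


@dataclass
class User:
    id: int
    name: str


# Subclassing with a concrete type parameter, like ApiResponse[User] above,
# gives callers a `.data` attribute whose items are known to be Users.
resp: Envelope[User] = Envelope(data=[User(id=1, name="Ada")])
print(resp.data[0].name)  # Ada
```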
<details>
<summary>Using response model to create api</summary>
```python
import os
from typing import Dict, Optional

from crudclient.api import API
from crudclient.client import Client, ClientConfig
from crudclient.crud import Crud

from .model import User, UsersResponse


class CustomConfig(ClientConfig):
    base_url: str = "https://api.test.myapi.com/v1/"
    api_key: str = os.getenv("API_KEY", "")
    headers: Optional[Dict[str, str]] = {"my-custom-header": "something"}
    timeout: Optional[float] = 10.0
    retries: Optional[int] = 3

    def auth(self) -> Dict[str, str]:
        return {
            "x-myapi-api-token": self.api_key,
        }


class UsersCrud(Crud[User]):
    _resource_path = "users"
    _datamodel = User
    _api_response_model = UsersResponse
    allowed_actions = ["list"]


class OneflowAPI(API):
    client_class = Client

    def _register_endpoints(self):
        self.users = UsersCrud(self.client)
```
</details>
<details>
<summary>Using your api</summary>
```python
from api_example import CustomConfig, OneflowAPI
from model import User, UsersResponse


def main():
    config = CustomConfig()
    api = OneflowAPI(client_config=config)
    users = api.users.list()
    assert isinstance(users, UsersResponse)
    assert len(users.data) > 0
    assert isinstance(users.data[0], User)
    assert users.data[0].id is not None


if __name__ == '__main__':
    main()
```
</details>
## Logging
The library uses standard Python logging, which you can hook into with `logging.getLogger`.
<details>
<summary>Code example</summary>
```python
import logging
# Use the API library
from crudclient import API
# Configure logging for the application
logging.basicConfig(level=logging.DEBUG)
# Configure specific logging for the crudclient library
logging.getLogger('crudclient').setLevel(logging.INFO)
# Or you could configure at a module level if needed
logging.getLogger('crudclient.api').setLevel(logging.WARNING)
```
</details>
## Project uses devcontainers
This project is set up with devcontainers for easy development across environments and hardware.
### How to run project locally via dev-containers
<details>
<summary>Set up project</summary>
A **development container** is a running [Docker](https://www.docker.com) container with a well-defined tool/runtime stack and its prerequisites.
[![Open in Remote - Containers](https://img.shields.io/static/v1?label=Remote%20-%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/Leikaab/crudclient)
If you already have VS Code and Docker installed, you can click the badge above to automatically install the Remote - Containers extension if needed, clone the source code into a container volume, and spin up a dev container for use.
If this is your first time using a development container, please ensure your system meets the prerequisites (i.e. have Docker installed) in the [getting started steps](https://aka.ms/vscode-remote/containers/getting-started).
</details>
### Test out project
<details>
<summary>Details after setup</summary>
Once you have this project opened, you'll be able to work with it like you would locally.
Note that a bunch of key extensions are already installed, and local project settings are applied in the background even though there is no settings.json file. These settings are made to match the development team's standards.
> **Note:** This container runs as a non-root user with sudo access by default.
</details>
## Testing and Coverage
This project employs `pytest` for local testing and CI/CD, and enforces code coverage requirements on pushes to main and for new versions.
<details>
<summary>Testing and Coverage</summary>
This project employs `pytest` as the primary testing framework to ensure the reliability and correctness of the codebase. `pytest` is configured to run comprehensive tests across the project, providing detailed feedback on the results, including which tests pass or fail, and offering powerful tools like fixtures and parameterization to create flexible and scalable tests.
### Coverage with Coverage.py
The project also integrates `coverage.py` to measure code coverage during testing. Code coverage analysis helps identify untested parts of the codebase, ensuring that the tests cover as much of the code as possible. This approach enhances the robustness of the code by verifying that all critical paths and edge cases are tested.
The configuration for `coverage.py` is set up in the `.coveragerc` file, which specifies which parts of the code should be included or omitted from the coverage report. The generated coverage reports provide insights into the percentage of code that is tested, helping to maintain high standards for test completeness.
The setup is optimized for use within the development container, which forwards a custom port (5051) to serve the live coverage reports, making it easy to view and analyze test coverage in real-time.
### Running Tests
To run the tests and generate a coverage report, use the following commands within the container:
```bash
pytest --cov=crudclient --cov-report=html
python -m http.server 5051 --directory htmlcov  # serve the HTML report on the forwarded port
```
These commands execute all tests, generate an HTML coverage report, and serve it on the forwarded port 5051 so you can view it in your browser.
</details>
## Pre-Commit and Pre-Push Hooks
This project integrates pre-commit and pre-push hooks to ensure that code quality is maintained and that all changes meet the project's standards before they are committed or pushed to the repository. These hooks are configured using the `.pre-commit-config.yaml` file, which specifies the various tools and checks that are automatically run at different stages of the Git workflow.
<details>
<summary>Details on hook rules</summary>
### Pre-Commit Hooks
Pre-commit hooks are executed before each commit is finalized. These hooks ensure that the code adheres to the project's style guidelines and passes initial validation checks. The following tools are configured to run as part of the pre-commit hooks:
- **isort**: Ensures that imports are properly sorted according to the project's style.
- **black**: Formats the code to comply with the `black` code style, with a line length of 120 characters.
- **flake8**: Runs linting checks to identify any potential issues in the code, excluding `setup.py`.
- **mypy**: Performs static type checking to ensure type safety in the codebase.
- **pytest**: Runs the unit tests to verify that the code changes do not break existing functionality.
These tools are configured to run automatically when you attempt to make a commit, helping to catch errors early and maintain a high standard of code quality.
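A sketch of how a `.pre-commit-config.yaml` wires up these tools; the revisions and hook arguments below are illustrative, and the repository's actual config file is authoritative:

```yaml
repos:
  - repo: https://github.com/pycqa/isort
    rev: "5.13.2"
    hooks:
      - id: isort
  - repo: https://github.com/psf/black
    rev: "24.4.2"
    hooks:
      - id: black
        args: ["--line-length", "120"]
  - repo: https://github.com/pycqa/flake8
    rev: "7.0.0"
    hooks:
      - id: flake8
        exclude: ^setup\.py$
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: "v1.10.0"
    hooks:
      - id: mypy
  - repo: local
    hooks:
      - id: pytest
        name: pytest
        entry: pytest
        language: system
        pass_filenames: false
```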
### Pre-Push Hook
The pre-push hook is executed before any changes are pushed to the remote repository. This hook includes an additional layer of testing to ensure that the code meets the required coverage standards:
- **pytest with coverage**: Runs the full test suite with coverage analysis, ensuring that the codebase meets the required coverage threshold (configured to fail if coverage is below 100%).
By enforcing these checks before pushing, the project ensures that all changes are thoroughly validated, reducing the risk of introducing issues into the main codebase.
</details>
## Poetry
This project leverages Poetry as the primary tool for dependency management, packaging, versioning, and general project configuration. Poetry is a powerful tool that simplifies the entire lifecycle of a Python project, from development to distribution.
<details>
<summary>Poetry Usage</summary>
### Package Management
Poetry is configured to handle all aspects of package management for this project. It allows you to define dependencies clearly in the `pyproject.toml` file, ensuring that the correct versions of each package are used. Poetry's dependency resolver manages compatibility between packages and installs them in a reproducible environment.
Poetry handles:
- **Dependency Resolution**: Ensuring that all dependencies and their sub-dependencies are compatible and correctly installed.
- **Package Installation**: Installing all required dependencies as defined in the `pyproject.toml` file, ensuring consistency across different environments.
### Publishing to PyPI
We use Poetry to publish packages to PyPI through our CI/CD pipeline with GitHub Actions workflows.
These workflows automate the process of building, packaging, and publishing the package to PyPI, ensuring that the deployment process is consistent and error-free. See chapter CD/CI for more information.
### Versioning
Poetry is used to manage the versioning of the project. Version numbers are specified in the `pyproject.toml` file and can be automatically updated as part of the release process. We follow semantic versioning practices, where version numbers indicate the nature of changes (major, minor, patch) and help maintain backward compatibility.
### Other Uses of Poetry
- **Script Management**: Poetry allows us to define custom scripts that can be run within the project, streamlining repetitive tasks and ensuring consistency across environments.
- **Development Dependencies**: Poetry distinguishes between production and development dependencies, ensuring that only the necessary packages are included in the final distribution, keeping it lightweight and efficient.
- **Environment Configuration**: Although Poetry typically creates a virtual environment (`venv`) for each project, in this setup, we have configured Poetry to avoid creating virtual environments due to our use of development containers. This ensures that dependencies are installed directly into the container environment, simplifying the setup and avoiding potential conflicts.
This configuration is particularly beneficial in a devcontainer environment, where the container itself acts as the isolated development environment, eliminating the need for a separate virtual environment.
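Disabling virtual environment creation typically corresponds to a project-local `poetry.toml` like the sketch below (or an equivalent `poetry config virtualenvs.create false` call); the exact mechanism used in this repository may differ:

```toml
# poetry.toml (project-local): install dependencies into the active
# interpreter instead of creating a separate virtual environment
[virtualenvs]
create = false
```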
</details>
## CI/CD
This project utilizes GitHub Actions to automate continuous integration and continuous deployment (CI/CD) processes. The workflows are designed to ensure code quality, test the development environment, and automatically publish the package to PyPI upon successful testing.
<details>
<summary>Different CI/CD with GitHub Workflows</summary>
### Test Workflow (`tests.yml`)
The `tests.yml` workflow is responsible for running the project's test suite across multiple operating systems (Ubuntu, Windows, and macOS) whenever code is pushed to the repository. This workflow ensures that the codebase is robust and compatible across different environments.
Key steps in this workflow include:
- **Checkout Code**: Retrieves the latest code from the repository.
- **Set up Python**: Configures the appropriate Python environment.
- **Install Dependencies**: Installs the project's dependencies using Poetry.
- **Run Linting and Formatting Checks**: Uses `isort`, `black`, `flake8`, and `mypy` to enforce code quality.
- **Run Tests**: Executes the test suite with `pytest` and checks for 100% code coverage.
This workflow is triggered on every push to the repository, ensuring continuous verification of the code's integrity.
> Add `[skip ci]` to the commit message to skip the GitHub Actions test run.
### Publish Workflow (`publish.yml`)
The `publish.yml` workflow automates the process of publishing the package to PyPI. This workflow is triggered only after the `tests.yml` workflow completes successfully, ensuring that only thoroughly tested code is released.
Key steps in this workflow include:
- **Checkout Code**: Retrieves the full history of the repository, which is necessary for versioning.
- **Set up Python**: Configures the appropriate Python environment.
- **Install Dependencies**: Installs the necessary dependencies without development dependencies.
- **Version Check**: Compares the current version in `pyproject.toml` with the latest Git tag to determine if a new version should be published.
- **Publish to PyPI**: Publishes the package to PyPI using Poetry, making it available for installation via `pip`.
- **Create New Tag**: If a new version is published, the workflow automatically tags the release in the GitHub repository.
This workflow ensures that the package is consistently versioned and available to the public after passing all tests. It only runs when code is pushed to main, and is unaffected by version changes made on branches.
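The version-check step can be sketched with standard-library tooling; `should_publish` and the sample versions below are hypothetical illustrations, not the workflow's actual script:

```python
import re


def parse_version(v: str) -> tuple:
    # Git tags are often prefixed with "v" (e.g. "v0.4.2"); strip it,
    # then split "MAJOR.MINOR.PATCH" into an integer tuple for comparison.
    return tuple(int(part) for part in re.sub(r"^v", "", v).split("."))


def should_publish(pyproject_version: str, latest_tag: str) -> bool:
    # Publish only when pyproject.toml is ahead of the latest published tag.
    return parse_version(pyproject_version) > parse_version(latest_tag)


print(should_publish("0.4.2", "v0.4.1"))  # True: version was bumped
print(should_publish("0.4.2", "v0.4.2"))  # False: nothing new to release
```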
### DevContainer Test Workflow (`test_devcontainer.yml`)
The `test_devcontainer.yml` workflow is designed to verify the development container setup, ensuring that other developers can seamlessly use the devcontainer environment.
Key steps in this workflow include:
- **Checkout Code**: Retrieves the latest code from the repository.
- **Set up Docker (for macOS)**: Ensures Docker is running on macOS systems.
- **Set up Devcontainer CLI**: Installs the DevContainer CLI to interact with the development container.
- **Build and Test DevContainer**: Builds the development container and runs basic tests to verify the setup.
- **Validate DevContainer**: Ensures that critical tools like Poetry are correctly installed and configured within the container.
This workflow is triggered whenever changes are made to the `.devcontainer` folder, ensuring that the development environment remains stable and usable. Because of limitations in GitHub Actions environments, we currently test devcontainers only on Ubuntu in CI/CD. Issues with macOS or Windows should be reported in the issues section on GitHub.
</details>
## Other
<details>
<summary>Non-functional plans and useful links</summary>
### Badges for project
- https://pypi.org/project/pybadges/
- https://github.com/badges/shields
- https://shields.io/badges/dynamic-toml-badge
</details>
Raw data
{
"_id": null,
"home_page": "https://github.com/Leikaab/crudclient",
"name": "crudclient",
"maintainer": null,
"docs_url": null,
"requires_python": "<4.0,>=3.10",
"maintainer_email": null,
"keywords": "crud, api, client, rest",
"author": "leikaab",
"author_email": "nordavindltd@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/55/cd/1b0ee842129cdc6c39dd7b70927bc36e505497baf7d3919156af06e333e0/crudclient-0.4.2.tar.gz",
"platform": null,
"description": "# Create base client model for restful libraries\n\n## Badges and quicklinks\n\n### Open project for development in container\n[![Open in Remote - Containers](https://img.shields.io/static/v1?label=Remote%20-%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/Leikaab/crudclient)\n\n### Status of project\n![PyPI - Python Version](https://img.shields.io/pypi/pyversions/crudclient)\n![PyPI - Version](https://img.shields.io/pypi/v/crudclient)\n\n### Status of testing\n[![Test DevContainer Build](https://github.com/Leikaab/crudclient/actions/workflows/test_devcontainer.yml/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/test_devcontainer.yml)\n[![Run Tests](https://github.com/Leikaab/crudclient/actions/workflows/tests.yml/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/tests.yml)\n[![Publish to PyPI](https://github.com/Leikaab/crudclient/actions/workflows/publish.yml/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/publish.yml)\n[![Dependabot Updates](https://github.com/Leikaab/crudclient/actions/workflows/dependabot/dependabot-updates/badge.svg)](https://github.com/Leikaab/crudclient/actions/workflows/dependabot/dependabot-updates)\n\n\n## Project Overview\n\n This project is a foundational framework designed to streamline the creation of API clients and CRUD (Create, Read, Update, Delete) classes. 
It is intended to be a reusable package that can be implemented in various projects, providing a consistent and DRY (Don't Repeat Yourself) approach to coding.\n\n<details>\n <summary>Project details</summary>\n\n ### Key Features\n\n - **Authentication**: The framework provides a robust system for handling API authentication, simplifying the integration of secure and efficient authentication methods into your projects.\n\n - **API Construction**: This package offers tools to easily define and structure your API interactions, allowing for dynamic and flexible API client creation that adapts to the specific needs of different projects.\n\n - **CRUD Class Mixins**: The project includes reusable class mixins for building CRUD operations. These mixins promote code reusability and consistency across multiple projects, ensuring that common functionality is implemented efficiently and with minimal duplication.\n\n This framework is designed to help developers focus on implementing the specific logic required for their APIs while relying on a solid, reusable foundation for the underlying infrastructure. 
It supports a modular approach, making it easier to manage and scale API client development across various projects.\n\n</details>\n\n## Usage\n\n\n\n### Response models\n\n<details>\n <summary>Setting up a custom response model</summary>\n\n```python\n\nfrom pydantic import BaseModel\nfrom crudclient.models import ApiResponse\n\nclass User(BaseModel):\n\n id: int\n name: str\n email: str\n is_admin: bool\n\n\nclass UsersResponse(ApiResponse[User]):\n pass\n\n```\n\n</details>\n\n<details>\n <summary>Using response model to create api</summary>\n\n```python\n\nfrom crudclient.api import API\nfrom crudclient.client import Client, ClientConfig\nfrom crudclient.crud import Crud\n\nfrom .model import User, UsersResponse\n\n\nclass CustomConfig(ClientConfig):\n base_url: str = \"https://api.test.myapi.com/v1/\"\n api_key: str = os.getenv(\"API_KEY\", \"\")\n headers: Optional[Dict[str, str]] = {\"my-custom-header\": \"something\"}\n timeout: Optional[float] = 10.0\n retries: Optional[int] = 3\n\n def auth(self) -> Dict[str, str]:\n return {\n \"x-myapi-api-token\": self.api_key,\n }\n\n\nclass UsersCrud(Crud[User]):\n _resource_path = \"users\"\n _datamodel = User\n _api_response_model = UsersResponse\n allowed_actions = [\"list\"]\n\n\nclass OneflowAPI(API):\n client_class = Client\n\n def _register_endpoints(self):\n self.users = UsersCrud(self.client)\n\n```\n\n</details>\n\n<details>\n <summary>Using your api</summary>\n\n```python\nfrom api_example import CustomConfig, OneflowAPI\n\ndef main()\n config = CustomConfig()\n api = OneflowAPI(client_config=config)\n users = api.users.list()\n assert isinstance(users, UsersResponse)\n assert len(users.data) > 0\n assert isinstance(users.data[0], User)\n assert users.data[0].id is not None\n\n\nif __name__ == '__main__':\n main()\n\n```\n\n</details>\n\n\n## Logging\n\nThe library has standard logging that can be hooked into using get.logger\n\n<details>\n <summary>Code example</summary>\n\n```python\n\nimport logging\n# 
Use the API library\nfrom crudclient import API\n\n# Configure logging for the application\nlogging.basicConfig(level=logging.DEBUG)\n\n# Configure specific logging for the crudclient library\nlogging.getLogger('crudclient').setLevel(logging.INFO)\n\n# Or you could configure at a module level if needed\nlogging.getLogger('crudclient.api').setLevel(logging.WARNING)\n\n```\n\n</details>\n\n\n## Project uses devcontainers\nThis project is set up using devcontainers for easy developement across enviroments and hardware.\n\n### How to run project locally via dev-containers\n<details>\n <summary>Set up project</summary>\n\nA **development container** is a running [Docker](https://www.docker.com) container with a well-defined tool/runtime stack and its prerequisites.\n\n[![Open in Remote - Containers](https://img.shields.io/static/v1?label=Remote%20-%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/Leikaab/crudclient)\n\nIf you already have VS Code and Docker installed, you can click the badge above to automatically install the Remote - Containers extension if needed, clone the source code into a container volume, and spin up a dev container for use.\n\nIf this is your first time using a development container, please ensure your system meets the prerequisites (i.e. have Docker installed) in the [getting started steps](https://aka.ms/vscode-remote/containers/getting-started).\n</details>\n\n### Test out project\n\n<details>\n <summary>Details after setup</summary>\nOnce you have this project opened, you'll be able to work with it like you would locally.\n\nNote that ha bounch of key extentions are allready installed + there is local project settings set up in the background, even though there is no settings.json file. 
These settings are made to match with developmental team standards.\n\n> **Note:** This container runs as a non-root user with sudo access by default.\n\n</details>\n\n## Testing and Coverage\n\nThis project employs `pytest` for local testing and cd/ci, and also coverage to push to main and for new versions.\n\n<details>\n <summary>Testing and Coverage</summary>\n\n\n This project employs `pytest` as the primary testing framework to ensure the reliability and correctness of the codebase. `pytest` is configured to run comprehensive tests across the project, providing detailed feedback on the results, including which tests pass or fail, and offering powerful tools like fixtures and parameterization to create flexible and scalable tests.\n\n ### Coverage with Coverage.py\n\n The project also integrates `coverage.py` to measure code coverage during testing. Code coverage analysis helps identify untested parts of the codebase, ensuring that the tests cover as much of the code as possible. This approach enhances the robustness of the code by verifying that all critical paths and edge cases are tested.\n\n The configuration for `coverage.py` is set up in the `.coveragerc` file, which specifies which parts of the code should be included or omitted from the coverage report. 
The generated coverage reports provide insights into the percentage of code that is tested, helping to maintain high standards for test completeness.\n\n The setup is optimized for use within the development container, which forwards a custom port (5051) to serve the live coverage reports, making it easy to view and analyze test coverage in real-time.\n\n ### Running Tests\n\n To run the tests and generate a coverage report, simply use the following commands within the container:\n\n ```bash\n pytest --cov=your_package_name --cov-report=html\n ```\n\n This command will execute all tests and generate an HTML report that you can view in your browser, providing a visual representation of the code coverage.\n\n</details>\n\n## Pre-Commit and Pre-Push Hooks\n\n This project integrates pre-commit and pre-push hooks to ensure that code quality is maintained and that all changes meet the project's standards before they are committed or pushed to the repository. These hooks are configured using the `.pre-commit-config.yaml` file, which specifies the various tools and checks that are automatically run at different stages of the Git workflow.\n\n<details>\n <summary>Details on hook rules</summary>\n\n ### Pre-Commit Hooks\n\n Pre-commit hooks are executed before each commit is finalized. These hooks ensure that the code adheres to the project's style guidelines and passes initial validation checks. 
The following tools are configured to run as part of the pre-commit hooks:\n\n - **isort**: Ensures that imports are properly sorted according to the project's style.\n - **black**: Formats the code to comply with the `black` code style, with a line length of 120 characters.\n - **flake8**: Runs linting checks to identify any potential issues in the code, excluding `setup.py`.\n - **mypy**: Performs static type checking to ensure type safety in the codebase.\n - **pytest**: Runs the unit tests to verify that the code changes do not break existing functionality.\n\n These tools are configured to run automatically when you attempt to make a commit, helping to catch errors early and maintain a high standard of code quality.\n\n ### Pre-Push Hook\n\n The pre-push hook is executed before any changes are pushed to the remote repository. This hook includes an additional layer of testing to ensure that the code meets the required coverage standards:\n\n - **pytest with coverage**: Runs the full test suite with coverage analysis, ensuring that the codebase meets the required coverage threshold (configured to fail if coverage is below 100%).\n\n By enforcing these checks before pushing, the project ensures that all changes are thoroughly validated, reducing the risk of introducing issues into the main codebase.\n\n</details\n\n## Poetry\n\n This project leverages Poetry as the primary tool for dependency management, packaging, versioning, and general project configuration. Poetry is a powerful tool that simplifies the entire lifecycle of a Python project, from development to distribution.\n\n<details>\n <summary>Poetry Usage</summary>\n\n ### Package Management\n\n Poetry is configured to handle all aspects of package management for this project. It allows you to define dependencies clearly in the `pyproject.toml` file, ensuring that the correct versions of each package are used. 
Poetry's dependency resolver manages compatibility between packages and installs them in a reproducible environment.\n\n Poetry handles:\n\n - **Dependency Resolution**: Ensuring that all dependencies and their sub-dependencies are compatible and correctly installed.\n - **Package Installation**: Installing all required dependencies as defined in the `pyproject.toml` file, ensuring consistency across different environments.\n\n ### Publishing to PyPI\n\n We use Poetryto publish packages to PyPI through our CI/CD pipeline with GitHub actions / workflows.\n These workflows automate the process of building, packaging, and publishing the package to PyPI, ensuring that the deployment process is consistent and error-free. See chapter CD/CI for more information.\n\n ### Versioning\n\n Poetry is used to manage the versioning of the project. Version numbers are specified in the `pyproject.toml` file and can be automatically updated as part of the release process. We follow semantic versioning practices, where version numbers indicate the nature of changes (major, minor, patch) and help maintain backward compatibility.\n\n ### Other Uses of Poetry\n\n - **Script Management**: Poetry allows us to define custom scripts that can be run within the project, streamlining repetitive tasks and ensuring consistency across environments.\n\n - **Development Dependencies**: Poetry distinguishes between production and development dependencies, ensuring that only the necessary packages are included in the final distribution, keeping it lightweight and efficient.\n\n - **Environment Configuration**: Although Poetry typically creates a virtual environment (`venv`) for each project, in this setup, we have configured Poetry to avoid creating virtual environments due to our use of development containers. 
This ensures that dependencies are installed directly into the container environment, simplifying the setup and avoiding potential conflicts.

This configuration is particularly beneficial in a devcontainer environment, where the container itself acts as the isolated development environment, eliminating the need for a separate virtual environment.

</details>

## CI/CD

This project uses GitHub Actions to automate continuous integration and continuous deployment (CI/CD). The workflows are designed to ensure code quality, test the development environment, and automatically publish the package to PyPI upon successful testing.

<details>
<summary>Different CI/CD with GitHub Workflows</summary>

### Test Workflow (`tests.yml`)

The `tests.yml` workflow runs the project's test suite across multiple operating systems (Ubuntu, Windows, and macOS) whenever code is pushed to the repository. This workflow ensures that the codebase is robust and compatible across different environments.

Key steps in this workflow include:

- **Checkout Code**: Retrieves the latest code from the repository.
- **Set up Python**: Configures the appropriate Python environment.
- **Install Dependencies**: Installs the project's dependencies using Poetry.
- **Run Linting and Formatting Checks**: Uses `isort`, `black`, `flake8`, and `mypy` to enforce code quality.
- **Run Tests**: Executes the test suite with `pytest` and checks for 100% code coverage.

This workflow is triggered on every push to the repository, ensuring continuous verification of the code's integrity.

> Add `[skip ci]` to a commit message to skip the GitHub Actions test run for that commit.

### Publish Workflow (`publish.yml`)

The `publish.yml` workflow automates publishing the package to PyPI. It is triggered only after the `tests.yml` workflow completes successfully, ensuring that only thoroughly tested code is released.

Key steps in this workflow include:

- **Checkout Code**: Retrieves the full history of the repository, which is necessary for versioning.
- **Set up Python**: Configures the appropriate Python environment.
- **Install Dependencies**: Installs the required dependencies, excluding development dependencies.
- **Version Check**: Compares the current version in `pyproject.toml` with the latest Git tag to determine whether a new version should be published.
- **Publish to PyPI**: Publishes the package to PyPI using Poetry, making it available for installation via `pip`.
- **Create New Tag**: If a new version is published, the workflow automatically tags the release in the GitHub repository.

This workflow ensures that the package is consistently versioned and available to the public after passing all tests. It runs only when code is pushed to `main` and is unaffected by version changes made on other branches.

### DevContainer Test Workflow (`test_devcontainer.yml`)

The `test_devcontainer.yml` workflow verifies the development container setup, ensuring that other developers can seamlessly use the devcontainer environment.

Key steps in this workflow include:

- **Checkout Code**: Retrieves the latest code from the repository.
- **Set up Docker (for macOS)**: Ensures Docker is running on macOS systems.
- **Set up Devcontainer CLI**: Installs the DevContainer CLI to interact with the development container.
- **Build and Test DevContainer**: Builds the development container and runs basic tests to verify the setup.
- **Validate DevContainer**: Ensures that critical tools like Poetry are correctly installed and configured within the container.

This workflow is triggered whenever changes are made to the `.devcontainer` folder, ensuring that the development environment remains stable and usable. Because of limitations in GitHub Actions environments, the devcontainer is currently only tested on Ubuntu through CI/CD. Issues with macOS or Windows should be reported in the issues section on GitHub.

</details>

## Other

<details>
<summary>Non-functional plans and useful links</summary>

### Badges for project

- https://pypi.org/project/pybadges/
- https://github.com/badges/shields
- https://shields.io/badges/dynamic-toml-badge

</details>
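As a sketch of what the dynamic TOML badge linked above could look like for this project, the following markdown would render a version badge read straight from `pyproject.toml` (the raw-file URL and JSONPath query are illustrative assumptions, not an existing badge in this repository):

```markdown
![Version from pyproject.toml](https://img.shields.io/badge/dynamic/toml?url=https%3A%2F%2Fraw.githubusercontent.com%2FLeikaab%2Fcrudclient%2Fmain%2Fpyproject.toml&query=%24.tool.poetry.version&label=version)
```

Such a badge would track the `[tool.poetry].version` field on `main` automatically, instead of the PyPI release shown by the existing `PyPI - Version` badge.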