flama

Name: flama
Version: 1.8.1
Home page: https://flama.dev
Summary: Fire up your models with the flame 🔥
Upload time: 2024-11-06 19:58:23
Maintainer: José Antonio Perdiguero López
Author: José Antonio Perdiguero López
Requires Python: <3.14, >=3.9
License: MIT
Keywords: machine-learning, ml, ml-ops, mlops, api, rest, restful, openapi, tensorflow, pytorch, sklearn
            <p align="center">
    <a href="https://flama.dev"><img src="https://raw.githubusercontent.com/vortico/flama/master/.github/logo.png" alt='Flama'></a>
</p>
<p align="center">
    <em>Fire up your models with the flame</em> &#128293;
</p>
<p align="center">
    <a href="https://github.com/vortico/flama/actions">
        <img src="https://github.com/vortico/flama/workflows/Test%20And%20Publish/badge.svg" alt="Test And Publish workflow status">
    </a>
    <a href="https://pypi.org/project/flama/">
        <img src="https://img.shields.io/pypi/v/flama?logo=PyPI&logoColor=white" alt="Package version">
    </a>
    <a href="https://pypi.org/project/flama/">
        <img src="https://img.shields.io/pypi/pyversions/flama?logo=Python&logoColor=white" alt="PyPI - Python Version">
    </a>
</p>

---

# Flama

Flama is a Python library that establishes a standard framework for the
development and deployment of APIs, with a special focus on machine learning (ML).
The main aim of the framework is to make the deployment of ML APIs ridiculously
simple, reducing the entire process (when possible) to a single line of code.

The library builds on Starlette and provides an easy-to-learn philosophy to
speed up the building of highly performant GraphQL, REST, and ML APIs. It is
also an ideal solution for developing asynchronous, production-ready services,
and offers automatic deployment for ML models.

Some remarkable characteristics:

* Generic classes for API resources with the convenience of standard CRUD methods over SQLAlchemy tables.
* A schema system (based on Marshmallow or Typesystem) that makes declaring the inputs and outputs of endpoints
  very easy, with the convenience of reliable, automatic data-type validation.
* Dependency injection to ease the management of parameters needed in endpoints, via `Component`s.
  Flama ASGI objects such as `Request`, `Response`, and `Session` are defined as `Component`s ready to be injected
  into your endpoints.
* `Component`s as the base of the plugin ecosystem, allowing you to create custom components or use those already
  defined, injected into your endpoints as parameters.
* Auto-generated API schema using the OpenAPI standard.
* Auto-generated `docs`, with Swagger UI and ReDoc endpoints provided.
* Automatic handling of pagination, with several methods at your disposal, such as `limit-offset` and
  `page numbering`, to name a few.
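
The dependency-injection idea behind `Component`s can be illustrated with a small, self-contained sketch: each endpoint parameter is resolved by its type annotation and handed to the handler. This is a conceptual illustration of the pattern only, not Flama's actual `Component` API; the `Request` class and `inject_and_call` helper below are hypothetical stand-ins.

```python
# Conceptual sketch of type-driven dependency injection, the pattern
# behind Flama's `Component`s. Illustrative names only, not Flama's API.
import inspect
from typing import Any, Callable


class Request:
    """Hypothetical stand-in for an ASGI request object."""

    def __init__(self, path: str):
        self.path = path


def inject_and_call(handler: Callable, components: dict) -> Any:
    """Resolve each handler parameter by its type annotation, then call it."""
    kwargs = {}
    for name, param in inspect.signature(handler).parameters.items():
        if param.annotation in components:
            kwargs[name] = components[param.annotation]
    return handler(**kwargs)


def home(request: Request) -> dict:
    return {"path": request.path}


result = inject_and_call(home, {Request: Request("/")})
```

The point of the pattern is that `home` never constructs its own `Request`; the framework inspects the signature and supplies one, which is what makes components composable and testable.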

## Installation

Flama is fully compatible with all [supported versions](https://devguide.python.org/versions/) of Python. We recommend
using the latest version available.

For a detailed explanation of how to install Flama,
visit: [https://flama.dev/docs/getting-started/installation](https://flama.dev/docs/getting-started/installation).

## Getting Started

Visit [https://flama.dev/docs/getting-started/quickstart](https://flama.dev/docs/getting-started/quickstart) to get
started with Flama.

## Documentation

Visit [https://flama.dev/docs/](https://flama.dev/docs/) to view the full documentation.

## Example

```python
from flama import Flama

app = Flama(
    title="Hello-🔥",
    version="1.0",
    description="My first API",
)


@app.route("/")
def home():
    """
    tags:
        - Salute
    summary:
        Returns a warming message
    description:
        This is a more detailed description of the method itself.
        Here we can give all the details required and they will appear
        automatically in the auto-generated docs.
    responses:
        200:
            description: Warming hello message!
    """
    return {"message": "Hello 🔥"}
```
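
The docstring in the example above uses a YAML-like layout that Flama parses to populate the auto-generated OpenAPI docs. As a minimal sketch of that idea only (a toy extractor for one field, not Flama's real parser, which handles full YAML):

```python
# Toy sketch: pull the `summary:` entry out of a YAML-style docstring,
# illustrating how docstring metadata can feed auto-generated docs.
import inspect


def docstring_summary(func) -> str:
    """Return the value under `summary:` in func's docstring, if present."""
    doc = inspect.getdoc(func) or ""
    lines = doc.splitlines()
    for i, line in enumerate(lines):
        if line.strip() == "summary:" and i + 1 < len(lines):
            return lines[i + 1].strip()
    return ""


def home():
    """
    summary:
        Returns a warming message
    responses:
        200:
            description: Warming hello message!
    """
    return {"message": "Hello"}
```

`inspect.getdoc` normalizes the docstring's indentation, so the extractor only has to find the `summary:` key and read the line below it.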

This example defines a `Hello 🔥` API. To run it:

```commandline
flama run examples.hello_flama:app
```

## Authors

* José Antonio Perdiguero López ([@perdy](https://github.com/perdy/))
* Miguel Durán-Olivencia ([@migduroli](https://github.com/migduroli/))

## Contributing

This project is absolutely open to contributions, so if you have a nice idea, please read
our [contributing docs](.github/CONTRIBUTING.md) **before submitting** a pull
request.

## Star History

<a href="https://github.com/vortico/flama">
  <picture>
    <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=vortico/flama&type=Date&theme=dark" />
    <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=vortico/flama&type=Date" />
    <img alt="Star History Chart" src="https://api.star-history.com/svg?repos=vortico/flama&type=Date" />
  </picture>
</a>

            
