aiosafeconsumer

Name: aiosafeconsumer
Version: 0.0.5
Summary: Safely consume and process data.
Upload time: 2025-02-02 13:23:35
Requires Python: >=3.10
Keywords: asyncio, consumer, microservices
Requirements: No requirements were recorded.

aiosafeconsumer
===============

.. image:: https://github.com/lostclus/aiosafeconsumer/actions/workflows/tests.yml/badge.svg
    :target: https://github.com/lostclus/aiosafeconsumer/actions

.. image:: https://img.shields.io/pypi/v/aiosafeconsumer.svg
    :target: https://pypi.org/project/aiosafeconsumer/
    :alt: Current version on PyPI

.. image:: https://img.shields.io/pypi/pyversions/aiosafeconsumer
    :alt: PyPI - Python Version

aiosafeconsumer is a library that provides abstractions and some implementations
for consuming data from a source and processing it.

Features:

* Based on asyncio
* Type annotated
* Uses logging with contextual information

Abstractions:

* `DataSource` - waits for data and returns batches of records via a Python generator
* `DataProcessor` - accepts a batch of records and processes it
* `DataTransformer` - accepts a batch of records, transforms it, and calls
  another processor to process the result. Extends `DataProcessor`
* `Worker` - abstract worker that performs a long-running task
* `ConsumerWorker` - connects a `DataSource` to a `DataProcessor` (see the
  sketch after this list). Extends `Worker`
* `DataWriter` - base abstraction for data synchronization. Extends `DataProcessor`
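
A minimal, illustrative sketch of how these roles fit together is shown
below. It uses plain asyncio rather than aiosafeconsumer's actual classes;
the function names and signatures here are assumptions for illustration only.

.. code-block:: python

    # Illustrative sketch: mirrors the DataSource / DataProcessor /
    # ConsumerWorker roles with plain asyncio. These names and signatures
    # are not aiosafeconsumer's actual API.
    import asyncio
    from typing import AsyncIterator


    async def source() -> AsyncIterator[list[dict]]:
        # DataSource role: wait for data and yield it in batches.
        for i in range(3):
            await asyncio.sleep(0.1)
            yield [{"id": i, "value": i * 10}]


    async def process(batch: list[dict]) -> None:
        # DataProcessor role: accept a batch of records and process it.
        print("processing", batch)


    async def consumer_worker() -> None:
        # ConsumerWorker role: connect the source to the processor.
        async for batch in source():
            await process(batch)


    asyncio.run(consumer_worker())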

Current implementations:

* `KafkaSource` - reads data from Kafka
* `RedisWriter` - synchronizes data in Redis
* `ElasticsearchWriter` - synchronizes data in Elasticsearch
* `WorkerPool` - controller to set up and run workers in parallel. Handles
  worker failures by restarting workers when they fail or exit (see the
  sketch after this list).
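
Below is a minimal sketch of the restart behaviour described for
`WorkerPool`, written with plain asyncio. It is not the `WorkerPool` API;
the worker and supervision loop are hypothetical stand-ins.

.. code-block:: python

    # Illustrative supervision loop: restart a long-running worker when it
    # fails or exits. Not aiosafeconsumer's WorkerPool API.
    import asyncio
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("pool")


    async def flaky_worker() -> None:
        # Stand-in for a long-running worker that may crash.
        await asyncio.sleep(0.1)
        raise RuntimeError("worker failed")


    async def supervise(restarts: int = 3) -> None:
        # Run the worker, restarting it after each failure or exit.
        for attempt in range(1, restarts + 1):
            try:
                await flaky_worker()
                log.info("worker exited, restarting (attempt %d)", attempt)
            except Exception:
                log.exception("worker failed, restarting (attempt %d)", attempt)


    asyncio.run(supervise())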

Install::

    pip install aiosafeconsumer

Install with Kafka::

    pip install aiosafeconsumer[kafka]

Install with Redis::

    pip install aiosafeconsumer[redis]

Install with Elasticsearch::

    pip install aiosafeconsumer[elasticsearch]

Install with MongoDB::

    pip install aiosafeconsumer[mongo]

Links:

* Producer library: https://github.com/lostclus/django-kafka-streamer
* Example application: https://github.com/lostclus/WeatherApp

TODO:

* PostgreSQL writer
* ClickHouse writer

            
