# kafka-client-decorator


- **Name:** kafka-client-decorator
- **Version:** 1.7
- **Home page:** https://www.quantrium.ai/
- **Summary:** A wrapper for confluent-kafka producer and consumer
- **Upload time:** 2023-12-12 05:21:16
- **Author:** Quantrium
- **Requires Python:** >=3.6
- **License:** Quantrium PVT LTD
- **Keywords:** confluent-kafka, kafka-producer, kafka-consumer
A wrapper around the Kafka producer and consumer that can be used as a decorator for a function that keeps consuming data, processes it, and broadcasts the result to the next topics/queues.

This uses the [confluent-kafka](https://pypi.org/project/confluent-kafka/) Python package to create the producer and consumer and then wraps them. So, big thanks to them!

## Installation
```
$ pip install kafka-client-decorator
```

## Usage
Define a function that processes the data the way you want, then decorate it.
```
from kafka_client_decorator.kafka_client import KafkaClient

@KafkaClient(bootstrap_servers, security_protocol, sasl_username, sasl_password).consumer_producer(
    consumer_from_topic='my-topic-1', group_id='pdf', produce_to_topic=['my-topic-2']
)
def process_data(data=None):
    # Call your driver modules here to process the data
    result = Driver(data)
    return result
```
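
The `bootstrap_servers`, `security_protocol`, `sasl_username`, and `sasl_password` values above are placeholders. A fuller sketch, with hypothetical connection settings and a stand-in processing step, might look like this:
```
from kafka_client_decorator.kafka_client import KafkaClient

# Hypothetical connection settings -- replace with your own broker details.
BOOTSTRAP_SERVERS = "broker1.example.com:9092"
SECURITY_PROTOCOL = "SASL_SSL"
SASL_USERNAME = "my-user"
SASL_PASSWORD = "my-secret"

client = KafkaClient(BOOTSTRAP_SERVERS, SECURITY_PROTOCOL, SASL_USERNAME, SASL_PASSWORD)

@client.consumer_producer(consumer_from_topic='my-topic-1',
                          group_id='pdf',
                          produce_to_topic=['my-topic-2'])
def process_data(data=None):
    # Stand-in for your driver module; whatever is returned here is
    # produced to 'my-topic-2'.
    return {"processed": data}
```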

> **_NOTE:_** If you want your driver's result to be pushed to the next topic/queue, simply pass `produce_to_topic` as an argument to the decorator's `consumer_producer` method.

To only produce to topic(s):
```
from kafka_client_decorator.client_producer import ClientProducer

producer = ClientProducer(bootstrap_servers, security_protocol, sasl_username, sasl_password)
producer.produce_to_broker(data, topics_list)
```
> **_NOTE:_** If your Kafka broker does not use the SASL or SSL protocol, there is no need to pass `sasl_username` and `sasl_password`.
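
For example, against a local unsecured broker, the SASL arguments can simply be omitted; the address and `PLAINTEXT` protocol below are illustrative values:
```
from kafka_client_decorator.client_producer import ClientProducer

# Hypothetical unsecured local broker -- no SASL credentials needed.
producer = ClientProducer("localhost:9092", "PLAINTEXT")
producer.produce_to_broker({"doc_id": "42", "status": "done"}, ["my-topic-2"])
```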

> **_NOTE:_** If you want to work with multiple partitions in Kafka, you can use the method below to produce (it attaches a custom unique key to each message):

```
producer.produce_to_broker_with_key(data, topics_list)
```
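
Under the hood, the package builds on confluent-kafka, where the message key determines which partition a message lands on. As a rough illustration only (not this package's actual implementation), a keyed produce with plain confluent-kafka looks like the following; the broker address is a placeholder:
```
import json
import uuid

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder address

# A fresh UUID key per message spreads messages across partitions; a stable
# key (e.g. a document id) keeps related messages on the same partition.
producer.produce("my-topic-2",
                 value=json.dumps({"status": "done"}),
                 key=str(uuid.uuid4()))
producer.flush()
```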

            
