kafkian

Name: kafkian
Version: 0.15.0
Home page: https://github.com/saabeilin/kafkian
Summary: Opinionated Kafka Python client on top of Confluent python library
Upload time: 2024-02-09 15:20:10
Author: Sergei Beilin
License: Apache 2.0
Keywords: kafka
# kafkian

[![code-checks](https://github.com/saabeilin/kafkian/actions/workflows/check.yaml/badge.svg)](https://github.com/saabeilin/kafkian/actions/workflows/check.yaml)
[![PyPI](https://img.shields.io/pypi/v/kafkian.svg)](https://pypi.org/project/kafkian/)


*kafkian* is an opinionated high-level consumer and producer on top of
[confluent-kafka-python](https://github.com/confluentinc/confluent-kafka-python)/librdkafka,
partially inspired by [confluent_kafka_helpers](https://github.com/fyndiq/confluent_kafka_helpers).
It is intended primarily for CQRS/event-sourced systems where usage is mostly
limited to producing and consuming encoded messages.

*kafkian* partially mimics the Kafka Java API, is partially more Pythonic, and is partially just how the maintainer likes it.

Instead of configuring all the things via properties, most of the things
are planned to be configured explicitly and, when possible, via dependency
injection for easier testing. The configuration dictionaries for both producer
and consumer are passed through directly to the underlying Confluent producer and
consumer, hidden behind a facade.
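
As an illustration of this pass-through (a hypothetical setup: the broker address is made up, and the extra keys are standard librdkafka options rather than anything kafkian-specific), any librdkafka setting placed in the dictionary reaches the underlying client unchanged:

```python
from kafkian import Producer
from kafkian.serde.serialization import AvroSerializer, AvroStringKeySerializer

# Any librdkafka setting can go straight into the config dict;
# kafkian hands it through to the underlying confluent producer as-is.
producer = Producer(
    {
        'bootstrap.servers': 'localhost:9092',   # hypothetical broker address
        'enable.idempotence': True,              # standard librdkafka option
        'queue.buffering.max.ms': 100,           # librdkafka batching delay
    },
    key_serializer=AvroStringKeySerializer(schema_registry_url='http://localhost:8081'),
    value_serializer=AvroSerializer(schema_registry_url='http://localhost:8081'),
)
```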

The library provides base serializer and deserializer classes, as well as
their specialized Avro subclasses, `AvroSerializer` and `AvroDeserializer`.
This allows having, say, a plain string key and an Avro-encoded message,
or vice versa. Quite often an Avro-encoded string is used as a key; for
this purpose we provide `AvroStringKeySerializer`.
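
As a sketch of what building on the base class might look like (assuming a base `Serializer` with a `serialize` method is importable from `kafkian.serde.serialization`; check the source for the exact interface):

```python
import json

from kafkian.serde.serialization import Serializer


class JsonSerializer(Serializer):
    """Hypothetical serializer that JSON-encodes values to UTF-8 bytes."""

    def serialize(self, value, topic, **kwargs):
        # Illustrative only: the method name and signature are assumed
        # to mirror the base class.
        return json.dumps(value).encode('utf-8')
```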

Unlike the Confluent library, we support supplying the specific Avro schema
together with the message, just like the Kafka Java API. Schemas can be
automatically registered with the schema registry, and we provide three
`SubjectNameStrategy` options, again compatible with the Kafka Java API.
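
The three strategies presumably mirror the Kafka Java API's `TopicNameStrategy`, `RecordNameStrategy`, and `TopicRecordNameStrategy` (only `RecordNameStrategy` appears in the examples below; the other member names are assumed). A sketch, with the standard Confluent subject formats noted:

```python
from kafkian.serde.serialization import SubjectNameStrategy

# Subject under which the value schema is registered, per strategy
# (standard Confluent conventions):
#   TopicNameStrategy       -> "<topic>-value" (or "<topic>-key")
#   RecordNameStrategy      -> "<record full name>"
#   TopicRecordNameStrategy -> "<topic>-<record full name>"
strategy = SubjectNameStrategy.RecordNameStrategy
```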

## Usage
### Producing messages

#### 1. Initialize the producer

```python
from kafkian import Producer
from kafkian.serde.serialization import AvroSerializer, AvroStringKeySerializer, SubjectNameStrategy

# `config` here stands for your application's settings object
producer = Producer(
    {
        'bootstrap.servers': config.KAFKA_BOOTSTRAP_SERVERS,
    },
    key_serializer=AvroStringKeySerializer(schema_registry_url=config.SCHEMA_REGISTRY_URL),
    value_serializer=AvroSerializer(schema_registry_url=config.SCHEMA_REGISTRY_URL,
                                    subject_name_strategy=SubjectNameStrategy.RecordNameStrategy)
)
```

#### 2. Define your message schema(s)

```python
from confluent_kafka import avro
from kafkian.serde.avroserdebase import AvroRecord


value_schema_str = """
{
   "namespace": "auth.users",
   "name": "UserCreated",
   "type": "record",
   "fields" : [
     {
       "name" : "uuid",
       "type" : "string"
     },
     {
       "name" : "name",
       "type" : "string"
     },
     {
        "name": "timestamp",
        "type": {
            "type": "long",
            "logicalType": "timestamp-millis"
        }
     }
   ]
}
"""


class UserCreated(AvroRecord):
    _schema = avro.loads(value_schema_str)

```

#### 3. Produce the message

```python
producer.produce(
    "auth.users.events",
    user.uuid,
    UserCreated({
        "uuid": user.uuid,
        "name": user.name,
        "timestamp": int(user.timestamp.timestamp() * 1000)
    }),
    sync=True
)
```
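
The `sync=True` flag above suggests a blocking call that waits for delivery confirmation. A minimal sketch of the asynchronous variant (assuming `sync=False` is accepted and leaves delivery to the underlying confluent producer's background machinery; `new_users` is a hypothetical iterable):

```python
# Fire-and-forget variant: sync=False is assumed to return immediately,
# with delivery handled in the background by the underlying producer.
for user in new_users:
    producer.produce(
        "auth.users.events",
        user.uuid,
        UserCreated({
            "uuid": user.uuid,
            "name": user.name,
            "timestamp": int(user.timestamp.timestamp() * 1000),
        }),
        sync=False,
    )
```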

### Consuming messages

#### 1. Initialize the consumer

```python
from kafkian import Consumer
# Import path assumed to mirror the serializer imports above
from kafkian.serde.deserialization import AvroDeserializer

CONSUMER_CONFIG = {
    'bootstrap.servers': config.KAFKA_BOOTSTRAP_SERVERS,
    'default.topic.config': {
        'auto.offset.reset': 'latest',
    },
    'group.id': 'notifications'
}

consumer = Consumer(
    CONSUMER_CONFIG,
    topics=["auth.users.events"],
    key_deserializer=AvroDeserializer(schema_registry_url=config.SCHEMA_REGISTRY_URL),
    value_deserializer=AvroDeserializer(schema_registry_url=config.SCHEMA_REGISTRY_URL),
)
```

#### 2. Consume the messages via the generator

```python
for message in consumer:
    handle_message(message)
    consumer.commit()
```

Here, `message` is an instance of the `Message` class, which wraps the original
message exposed by confluent-kafka-python; you can access
the decoded key and value via the `.key` and `.value` properties respectively.

Note that deserialization happens on first access of these properties,
so you can properly handle deserialization errors (log them, send the message to a DLQ, etc.).
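
For example, a sketch of handling a value that fails to deserialize (the logger and the skip-and-commit policy are illustrative choices, not part of the library):

```python
import logging

logger = logging.getLogger(__name__)

for message in consumer:
    try:
        value = message.value  # deserialization happens here, on first access
    except Exception:
        # Hypothetical poison-message handling: log it, commit, move on
        # (you might instead forward the raw message to a DLQ topic).
        logger.exception("Failed to deserialize message at %s[%s]@%s",
                         message.topic, message.partition, message.offset)
        consumer.commit()
        continue
    handle_message(message)
    consumer.commit()
```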

Both key and value are wrapped in a dynamically generated class
whose full name matches the corresponding Avro schema's full name.
In the example above, the value would have a class named `auth.users.UserCreated`.

The Avro schemas for the consumed message key and value are accessible via the `.schema` property.

In addition, `topic`, `partition`, `offset`, `timestamp`, and `headers` properties
are available.
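
Since the generated class is named after the Avro schema, a `handle_message` can dispatch on it. A minimal sketch (the handler registry is illustrative, and it is assumed here that the class name carries the Avro full name as described above):

```python
def handle_user_created(value):
    print("user created:", value.uuid, value.name)


# Hypothetical dispatch table keyed by the Avro full name of the value
HANDLERS = {
    "auth.users.UserCreated": handle_user_created,
}


def handle_message(message):
    # type(message.value).__name__ is assumed to match the schema full name
    handler = HANDLERS.get(type(message.value).__name__)
    if handler is not None:
        handler(message.value)
```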

## Contributing
This library is, as stated, quite opinionated; however, I'm open to suggestions.
Write your questions and suggestions as issues here on GitHub!

#### Running tests
Both unit and system tests are provided.

To run unit-tests, install the requirements and just run
```bash
py.test tests/unit/
```

To run system tests, a Kafka cluster together with a schema registry is
required. A Docker Compose file is provided; just run
```bash
docker-compose up
```
and once the cluster is up and running, run system tests via
```bash
py.test tests/system/
```


            
