# faust-avro-serializer
This repo contains an improved version of the Avro serializer from
https://github.com/marcosschroh/python-schema-registry-client/. It expects the Avro schema
to be stored in the record itself, mimicking the behavior of Confluent's Avro SerDe.
It uses Faust's [metadata](https://faust.readthedocs.io/en/latest/userguide/models.html#polymorphic-fields) capability inside the ``Record`` class to read the Avro schema
dynamically.
### Example
```python
from faust import Record, Schema, Stream
from faust_avro_serializer import FaustAvroSerializer
from schema_registry.client import SchemaRegistryClient
import faust

app = faust.App('myapp', broker='kafka://localhost')
my_topic_name = "my-dummy-topic"


class MyRecordExample(Record):
    _schema = {
        "type": "record",
        "namespace": "com.example",
        "name": "MyRecordExample",
        "fields": [
            { "name": "foo", "type": "string" },
            { "name": "bar", "type": "string" }
        ]
    }
    foo: str
    bar: str


client = SchemaRegistryClient("http://my-schema-registry:8081")
serializer = FaustAvroSerializer(client, my_topic_name, False)

schema_with_avro = Schema(key_serializer=str, value_serializer=serializer)

dummy_topic = app.topic(my_topic_name, schema=schema_with_avro)


@app.agent(dummy_topic)
async def my_agent(myrecord: Stream[MyRecordExample]):
    async for record in myrecord:
        print(record.to_representation())
```
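Once the topic is configured this way, any record sent to it goes through the Avro serializer. As a minimal sketch (not part of the original example, and assuming the setup above), a producer could look like this:

```python
@app.timer(interval=10.0)
async def produce_example():
    # The serializer configured on dummy_topic encodes this record as
    # Confluent-framed Avro before it is written to Kafka.
    await dummy_topic.send(
        key="some-key",
        value=MyRecordExample(foo="hello", bar="world"),
    )
```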
When the serializer's ``_dumps`` method is called, it searches for the ``__faust`` field inside the
record.
If it finds the field, it resolves the record class and reads its ``_schema`` field
containing the Avro schema.
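
To illustrate, here is a simplified, self-contained sketch of that lookup. The hard-coded ``ns`` value and the ``MODEL_REGISTRY`` dict are illustrative stand-ins for Faust's internal class resolution, not part of this library's API:

```python
# Illustrative only: the rough shape of a Faust record after to_representation(),
# including the ``__faust`` metadata the serializer looks for. The exact ``ns``
# value depends on where the Record class is defined.
payload = {
    "foo": "hello",
    "bar": "world",
    "__faust": {"ns": "myapp.MyRecordExample"},
}

class MyRecordExample:
    # Stand-in for the Faust Record subclass; only the _schema field matters here.
    _schema = {
        "type": "record",
        "namespace": "com.example",
        "name": "MyRecordExample",
        "fields": [
            {"name": "foo", "type": "string"},
            {"name": "bar", "type": "string"},
        ],
    }

# Hypothetical registry mapping namespaces to record classes.
MODEL_REGISTRY = {"myapp.MyRecordExample": MyRecordExample}

ns = payload["__faust"]["ns"]        # 1. find the __faust metadata
record_cls = MODEL_REGISTRY[ns]      # 2. resolve the record class
avro_schema = record_cls._schema     # 3. read the Avro schema declared on the class
# The schema is then registered with the schema registry and used to encode the payload.
```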