| Field | Value |
| --- | --- |
| Name | rtdi-kafkaavro |
| Version | 0.10.20 |
| Summary | Kafka schema support for data integration |
| Author | None |
| Author email | Werner Dähn <werner.daehn@rtdi.io> |
| Maintainer | None |
| Homepage | https://github.com/rtdi/KafkaAvro |
| Issues | https://github.com/rtdi/KafkaAvro/issues |
| Docs URL | None |
| Requires Python | >=3.9 |
| License | None |
| Keywords | None |
| Upload time | 2025-09-07 16:34:51 |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |
# Kafka Avro convenience methods
When using Avro messages with Kafka as a data integration solution, a couple of common problems come up.
All of them are easy to solve, and this library aims to help.
## Common problems in the Kafka-Avro world
### No detailed data type information
**Problem**: Avro supports base data types such as string and int, plus a few logical types such as time-millis. But if data is moved from one database to another database via Kafka, which data type should the target database use?
**Example**: The source database has an NVARCHAR(10) column. In the Avro schema that must be a string. If the target table is created based on the Avro schema, the only safe data type for this column is NCLOB, with all the performance penalties of LOB data types.
**Solution**: Provide more logical data types for Avro. Avro allows adding custom logical types, and they merely annotate the schema, so there is zero impact on existing code.
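As an illustration, a custom logical type is just an extra annotation on a plain Avro type; readers that do not recognize it fall back to the base type. The property names below (`NVARCHAR`, `length`) are illustrative for this sketch, not necessarily the ones this library defines.

```python
import json

# Sketch: annotate a string field with a hypothetical "NVARCHAR" logical type
# plus a length property. Avro tooling that does not recognize the logical
# type treats the field as a plain string, so existing code is unaffected.
schema = {
    "type": "record",
    "name": "CUSTOMER",
    "fields": [
        {
            "name": "CUSTOMER_NAME",
            "type": {"type": "string", "logicalType": "NVARCHAR", "length": 10},
        }
    ],
}

print(json.dumps(schema, indent=2))
```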
### Schema and Field names are very restricted in Avro
**Problem**: A database has a column called "/BIC/MANDT000". This is not a valid Avro field name.
**Solution**: Name converters turn such names into Avro-compatible names and back again. The change should be as small as possible.
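A minimal sketch of such a converter (not this library's actual implementation) could escape invalid characters reversibly:

```python
import re

# Avro names must match [A-Za-z_][A-Za-z0-9_]*. This sketch escapes every
# other character as _xNN_ (hex code point) so the mapping can be reversed.
# (Handling of names that would start with a digit is omitted for brevity.)
def encode_name(name: str) -> str:
    return "".join(
        ch if re.fullmatch(r"[A-Za-z0-9_]", ch) else f"_x{ord(ch):02X}_"
        for ch in name
    )

def decode_name(name: str) -> str:
    return re.sub(r"_x([0-9A-F]+)_", lambda m: chr(int(m.group(1), 16)), name)

avro_name = encode_name("/BIC/MANDT000")
print(avro_name)               # _x2F_BIC_x2F_MANDT000
print(decode_name(avro_name))  # /BIC/MANDT000
```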
### Add more metadata to the schema
**Problem**: Initially, a project puts only the data into the messages. Once the solution is in production, users start asking questions like
* What are the primary keys of this record?
* Where did the record originate from? This topic contains records from multiple source systems.
* Why was the record sent by the source system? Who triggered it, and what is its location within the source (e.g. rowid)?
* Which transformations did the record undergo, and were they successful?
**Solution**: Add optional fields to the schema to provide space for this kind of information.
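For illustration, such metadata fields can sit next to the payload fields with null defaults, so producers that do not fill them remain compatible. The field names below are assumptions for this sketch, not necessarily the names this library uses.

```python
import json

# Sketch: payload fields plus optional, defaulted metadata fields that answer
# the questions above (origin, source location, applied transformations).
schema = {
    "type": "record",
    "name": "CUSTOMER",
    "fields": [
        {"name": "CUSTOMER_ID", "type": "int"},
        {"name": "CUSTOMER_NAME", "type": ["null", "string"], "default": None},
        # hypothetical metadata fields; "null" comes first so the default can be null
        {"name": "__source_system", "type": ["null", "string"], "default": None},
        {"name": "__source_rowid", "type": ["null", "string"], "default": None},
        {"name": "__transformations", "type": {"type": "array", "items": "string"}, "default": []},
    ],
}

print(json.dumps(schema, indent=2))
```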
### How is the record to be treated? Insert, delete, ...?
**Problem**: In a data integration scenario, data is not only appended but also deleted, updated, and so on. How does the consumer know? The producer must provide that information.
**Example**: In the source database, customer=1234 was deleted. The expectation is that this record flows through Kafka and a database consumer deletes the record from the target database.
**Solution**: Have a naming convention for how a producer provides this information in the record.
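A consumer-side sketch of such a convention, assuming a `__change_type` field with codes like "I", "U" and "D" (both the field name and the codes are assumptions here):

```python
# Sketch: dispatch on a change-type indicator carried inside the record.
def apply_change(record: dict) -> str:
    change_type = record.get("__change_type", "I")  # default: treat as insert
    key = record["CUSTOMER_ID"]
    if change_type == "D":
        return f"delete row with CUSTOMER_ID={key}"
    if change_type == "U":
        return f"update row with CUSTOMER_ID={key}"
    return f"insert row with CUSTOMER_ID={key}"

print(apply_change({"CUSTOMER_ID": 1234, "__change_type": "D"}))
# -> delete row with CUSTOMER_ID=1234
```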
### A Schema Builder better suited for dynamic creation
**Problem**: The Avro SchemaBuilder is well suited for fixed, well-known schemas, but not for schemas assembled dynamically.
**Example**: A schema should be built from a database table's metadata, including primary key information, nullability, etc.
**Solution**: An alternative SchemaBuilder that is better suited for such recursive, metadata-driven calls.
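A sketch of the kind of metadata-driven construction such a builder targets; the column-metadata format and the type mapping are assumed here for illustration only.

```python
import json
from typing import Any

# Hypothetical mapping from database types to Avro types, for this sketch only.
TYPE_MAP: dict[str, Any] = {
    "INTEGER": "int",
    "NVARCHAR": "string",
    "DECIMAL": "double",
    "DATE": {"type": "int", "logicalType": "date"},
}

def build_schema(table: str, columns: list[dict[str, Any]]) -> dict[str, Any]:
    """Build an Avro record schema dict from simple column metadata."""
    fields = []
    for col in columns:
        avro_type = TYPE_MAP[col["type"]]
        if col.get("nullable", True):
            fields.append({"name": col["name"], "type": ["null", avro_type], "default": None})
        else:
            fields.append({"name": col["name"], "type": avro_type})
    return {"type": "record", "name": table, "fields": fields}

schema = build_schema(
    "CUSTOMER",
    [
        {"name": "CUSTOMER_ID", "type": "INTEGER", "nullable": False},
        {"name": "CUSTOMER_NAME", "type": "NVARCHAR", "nullable": True},
    ],
)
print(json.dumps(schema, indent=2))
```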
### Schemas must support an extension concept
**Problem**: Often multiple producers create data for the same schema, so the schema must be the superset of all of them. While that makes sense for most fields, some are of a purely technical nature and do not deserve a dedicated field of their own.
**Example**: The source system has a gender column of type string, while the Avro schema models gender as an integer. The producer converts each string to the official value, but it would be nice if the original value were also stored somewhere in the record for auditing purposes.
**Solution**: Each schema level has an __extension array to store key-value pairs.
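For illustration, the original source value could travel along as a key-value pair in such an extension; the key names inside the extension entries are assumptions of this sketch.

```python
# Sketch: the record stores the converted official value, while the original
# source value rides along in a generic key-value extension for auditing.
record = {
    "CUSTOMER_ID": 1234,
    "GENDER": 1,  # official integer code after conversion
    "__extension": [
        {"__name": "GENDER_original", "__value": "female"},
    ],
}

print(record["__extension"][0]["__value"])  # -> female
```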