# Kstreams
`kstreams` is a library/micro-framework for working with `kafka`. It provides a simple Kafka streams implementation that gives certain guarantees; see the features below.
![Build status](https://github.com/kpn/kstreams/actions/workflows/pr-tests.yaml/badge.svg?branch=master)
[![codecov](https://codecov.io/gh/kpn/kstreams/branch/master/graph/badge.svg?token=t7pxIPtphF)](https://codecov.io/gh/kpn/kstreams)
![python version](https://img.shields.io/badge/python-3.9%2B-yellowgreen)
---
**Documentation**: https://kpn.github.io/kstreams/
---
## Installation
```bash
pip install kstreams
```
You will also need a worker; we recommend [aiorun](https://github.com/cjrh/aiorun):
```bash
pip install aiorun
```
## Usage
```python
import aiorun

from kstreams import create_engine, ConsumerRecord

stream_engine = create_engine(title="my-stream-engine")


@stream_engine.stream("local--kstreams")
async def consume(cr: ConsumerRecord):
    print(f"Event consumed: headers: {cr.headers}, payload: {cr.value}")


async def produce():
    payload = b'{"message": "Hello world!"}'

    for _ in range(5):
        metadata = await stream_engine.send("local--kstreams", value=payload)
        print(f"Message sent: {metadata}")


async def start():
    await stream_engine.start()
    await produce()


async def shutdown(loop):
    await stream_engine.stop()


if __name__ == "__main__":
    aiorun.run(start(), stop_on_unhandled_errors=True, shutdown_callback=shutdown)
```
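Streams can also pull records themselves instead of receiving one `ConsumerRecord` per call. Below is a minimal sketch of that style, reusing the `stream_engine` from above; it follows the pattern shown in the kstreams documentation, and the topic name is illustrative:

```python
from kstreams import Stream


@stream_engine.stream("local--kstreams")
async def consume_iterating(stream: Stream):
    # Iterate over the stream explicitly instead of getting one record per call
    async for cr in stream:
        print(f"Event consumed: headers: {cr.headers}, payload: {cr.value}")
```

This style is useful when you want to control the consumption loop yourself, for example to batch records before processing them.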
## Features
- [x] Produce events
- [x] Consume events with `Streams`
- [x] Subscribe to topics by `pattern`
- [x] `Prometheus` metrics and custom monitoring
- [x] TestClient (see the sketch after this list)
- [x] Custom Serialization and Deserialization
- [x] Easy to integrate with any `async` framework. Not tied to any library!
- [x] Yield events from streams
- [x] [Opentelemetry Instrumentation](https://github.com/kpn/opentelemetry-instrumentation-kstreams)
- [x] Middlewares
- [x] Hooks (on_startup, on_stop, after_startup, after_stop)
- [ ] Store (kafka streams pattern)
- [ ] Stream Join
- [ ] Windowing
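As an illustration of the `TestClient` feature above, here is a minimal test sketch. It assumes the `stream_engine` from the Usage example lives in a hypothetical `example_app` module and that `pytest-asyncio` is installed; the class name `TestStreamClient` and its `send` call follow the kstreams documentation, but verify them against the docs for your version:

```python
import pytest
from kstreams.test_utils import TestStreamClient

from example_app import stream_engine  # hypothetical module with the Usage example


@pytest.mark.asyncio
async def test_consume_hello_world():
    client = TestStreamClient(stream_engine)
    async with client:
        # Events sent through the TestClient are routed in memory to the registered streams,
        # so no real Kafka broker is needed.
        await client.send("local--kstreams", value=b'{"message": "Hello world!"}')
```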
## Development
This repo requires the use of [poetry](https://python-poetry.org/docs/basic-usage/) instead of pip.
*Note*: If you want the `virtualenv` to be created inside the project directory, first run `poetry config --local virtualenvs.in-project true`.
To install the dependencies, execute:
```bash
poetry install
```
Then you can activate the `virtualenv` with:
```bash
poetry shell
```
Run the tests:
```bash
./scripts/test
```
Run code formatting with ruff:
```bash
./scripts/format
```
### Commit messages
We use [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/) for commit messages.
The use of [commitizen](https://commitizen-tools.github.io/commitizen/) is recommended. Commitizen is part of the dev dependencies.
```bash
cz commit
```
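For reference, a conventional commit message written by hand looks roughly like the following; the type, scope, and description are purely illustrative:

```bash
git commit -m "feat(stream): add retry option to the consumer"
```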