# Kstreams
`kstreams` is a library/micro framework for working with `kafka`. It provides a simple Kafka streams implementation that gives certain guarantees; see the features below.
![Build status](https://github.com/kpn/kstreams/actions/workflows/pr-tests.yaml/badge.svg?branch=master)
[![codecov](https://codecov.io/gh/kpn/kstreams/branch/master/graph/badge.svg?token=t7pxIPtphF)](https://codecov.io/gh/kpn/kstreams)
![python version](https://img.shields.io/badge/python-3.9%2B-yellowgreen)
---
**Documentation**: https://kpn.github.io/kstreams/
---
## Installation
```bash
pip install kstreams
```
You will also need a worker; we recommend [aiorun](https://github.com/cjrh/aiorun):
```bash
pip install aiorun
```
## Usage
```python
import aiorun
from kstreams import create_engine, ConsumerRecord

stream_engine = create_engine(title="my-stream-engine")


@stream_engine.stream("local--kstreams")
async def consume(cr: ConsumerRecord):
    print(f"Event consumed: headers: {cr.headers}, payload: {cr.value}")


async def produce():
    payload = b'{"message": "Hello world!"}'

    for _ in range(5):
        metadata = await stream_engine.send("local--kstreams", value=payload)
        print(f"Message sent: {metadata}")


async def start():
    await stream_engine.start()
    await produce()


async def shutdown(loop):
    await stream_engine.stop()


if __name__ == "__main__":
    aiorun.run(start(), stop_on_unhandled_errors=True, shutdown_callback=shutdown)
```
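The `ConsumerRecord` handed to a stream exposes the usual Kafka metadata (`topic`, `partition`, `offset`, `key`, `timestamp`) in addition to `headers` and `value`. A minimal sketch extending the example above; the topic name `local--orders` is only an illustration, and the `key` keyword argument of `stream_engine.send` is assumed here, so check the docs:

```python
from kstreams import ConsumerRecord


@stream_engine.stream("local--orders")
async def consume_orders(cr: ConsumerRecord):
    # Standard ConsumerRecord attributes: topic, partition, offset, key, value
    print(f"{cr.topic}[{cr.partition}] offset={cr.offset} key={cr.key} value={cr.value}")


async def produce_order():
    # `key` (assumed kwarg) keeps events for the same order on the same partition
    await stream_engine.send("local--orders", value=b'{"order_id": 1}', key=b"order-1")
```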
## Features
- [x] Produce events
- [x] Consume events with `Streams`
- [x] Subscribe to topics by `pattern`
- [x] `Prometheus` metrics and custom monitoring
- [x] TestClient
- [x] Custom Serialization and Deserialization
- [x] Easy to integrate with any `async` framework. Not tied to any specific library! (see the sketch after this list)
- [x] Yield events from streams
- [x] [Opentelemetry Instrumentation](https://github.com/kpn/opentelemetry-instrumentation-kstreams)
- [x] Middlewares
- [x] Hooks (on_startup, on_stop, after_startup, after_stop)
- [ ] Store (kafka streams pattern)
- [ ] Stream Join
- [ ] Windowing
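
Because the engine only needs `start()` and `stop()` calls, it can be driven by any async framework's lifecycle instead of `aiorun`. A rough sketch using FastAPI's lifespan (FastAPI is not a kstreams dependency; this only illustrates the idea):

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI
from kstreams import create_engine

stream_engine = create_engine(title="my-stream-engine")


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Start the streams and producer together with the web app...
    await stream_engine.start()
    yield
    # ...and shut them down cleanly when the app stops
    await stream_engine.stop()


app = FastAPI(lifespan=lifespan)
```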
## Development
This repo requires the use of [poetry](https://python-poetry.org/docs/basic-usage/) instead of pip.
*Note*: If you want the `virtualenv` to be created inside the project directory, first run `poetry config --local virtualenvs.in-project true`.
To install the dependencies, run:
```bash
poetry install
```
Then you can activate the `virtualenv` with:
```bash
poetry shell
```
Run the tests:
```bash
./scripts/test
```
Run code formatting with ruff:
```bash
./scripts/format
```
### Commit messages
We use [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/) for the commit message.
The use of [commitizen](https://commitizen-tools.github.io/commitizen/) is recommended. Commitizen is part of the dev dependencies.
```bash
cz commit
```