# Loomr — Modular Messaging Service
Loomr is a modular, event-driven messaging runtime for building bots and automations across channels. Start with Telegram today; add more platforms via adapters and plugins. Configure flows in YAML, integrate via HTTP or SDKs, and run anywhere (Docker, CLI, or code).
## Why Loomr
- **Multi-channel by design**: Start with Telegram; add more via adapters.
- **Modular**: Drop-in plugins for features (admin tools, analytics, menus, products, support, ticker, etc.).
- **Flows in YAML**: Describe steps, prompts, validations, branches without code.
- **Event-driven**: Built-in bus to trigger HTTP or shell actions on events (payments, roles, deliveries).
- **First-class API**: FastAPI server with OpenAPI/Swagger for delivery and integrations.
- **Ship fast**: Use Docker one-liners or the CLI, then customize when needed.
## Quickstart
Build from the included [Dockerfile](Dockerfile):
- Docker (API):
```bash
docker build -t loomr/messaging .
docker run --rm -p 8090:8090 -e MODE=api loomr/messaging
# Open http://127.0.0.1:8090/docs
```
- Docker (Bot):
```bash
docker run --rm --env-file messaging_service/.env -e MODE=bot loomr/messaging
```
- CLI (local):
```bash
python3 -m venv messaging_service/.venv
messaging_service/.venv/bin/pip install -r messaging_service/requirements.txt
messaging_service/.venv/bin/python -m messaging_service.cli init
messaging_service/.venv/bin/python -m messaging_service.cli run-api
# or
messaging_service/.venv/bin/python -m messaging_service.cli run-bot
```
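Whichever way you start the API (Docker or `run-api`), you can confirm it is up by fetching the OpenAPI schema, which FastAPI serves at `/openapi.json` by default (the port below is the one mapped in the Docker example; adjust it if your local run binds elsewhere):
```bash
# Fetch the generated OpenAPI schema from the running API (FastAPI's default path)
curl -sS http://127.0.0.1:8090/openapi.json | python3 -m json.tool | head -n 20
```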
## Local LLM via Ollama (optional)
Run Loomr with a local model through [Ollama](https://ollama.com/). The built-in plugin `ollama_assistant` exposes two commands:
- `/ask <question>` — ask the local model a question.
- `/coach <user reply>` — evaluate whether a user's reply answers the current question and get a clearer re-ask.
### Setup
1) Install Ollama and pull a model
```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3:8b
curl -sS http://127.0.0.1:11434/api/tags
```
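Before wiring the model into Loomr, you can sanity-check that it responds by calling Ollama's generate endpoint directly (this is the stock Ollama REST API, not a Loomr endpoint):
```bash
# Ask the pulled model for a one-off completion via Ollama's REST API
curl -sS http://127.0.0.1:11434/api/generate \
  -d '{"model": "llama3:8b", "prompt": "Say hello in one short sentence.", "stream": false}'
```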
2) Environment
Copy `.env.example` to `.env` and set at least your Telegram bot token. The Ollama variables are optional:
```env
OLLAMA_HOST=http://127.0.0.1:11434
OLLAMA_MODEL=llama3:8b
QUESTIONARY_PATH=messaging_service/config/questionary.md
```
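For example (the paths below assume `.env.example` sits next to the `messaging_service/.env` the Docker bot command reads; adjust them if your copy lives elsewhere):
```bash
# Copy the template, then edit it with your Telegram token and any Ollama overrides
cp messaging_service/.env.example messaging_service/.env
```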
3) Enable plugin
`messaging_service/config/config.yaml` already includes `ollama_assistant` under `plugins.enabled` and an `ollama:` section.
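If you want to double-check, a quick grep should show both entries mentioned above (the exact key layout may differ between versions):
```bash
# Confirm the plugin and its settings section are present in the shipped config
grep -n "ollama_assistant" messaging_service/config/config.yaml
grep -n "ollama:" messaging_service/config/config.yaml
```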
4) Run
```bash
make venv && make install
make run-bot
```
### Usage examples
In a DM or a group (with BotFather privacy mode ON, the bot only sees commands):
```text
/ask What plugins are enabled?
/ask Summarize the last 5 messages.
/coach Why should I choose?
```
With group privacy OFF, the bot also receives normal messages, but by default this plugin still responds only to `/ask` and `/coach`.
## API overview
FastAPI app in `messaging_service/api_server.py`.
- Docs: `GET /docs`, `GET /redoc`
- Product delivery: `POST /deliver`
- TON verify (example): `POST /ton/verify`
- Group upgrade (example): `POST /group/upgrade`
Auth: set `DELIVER_BEARER` in `.env` and include `Authorization: Bearer <token>` for `/deliver`.
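For instance, a delivery call with the bearer token might look like the sketch below; the JSON payload fields are illustrative assumptions, so check `GET /docs` for the actual request schema:
```bash
# Hypothetical /deliver call; replace the payload with the schema shown in /docs
curl -sS -X POST http://127.0.0.1:8090/deliver \
  -H "Authorization: Bearer $DELIVER_BEARER" \
  -H "Content-Type: application/json" \
  -d '{"chat_id": 123456789, "product_id": "example-product"}'
```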
## Roadmap
- Additional adapters (WhatsApp/Instagram/etc.)
- SDKs from OpenAPI (TypeScript + Python)
- More built-in plugins and flow blocks
- Docker image publish on tags (GHCR/Docker Hub)
## License
Source-available and free for non‑commercial use under the Prosperity Public License 3.0.0.
- See: `LICENSE` (Prosperity-3.0.0)
- Commercial licensing: `COMMERCIAL_LICENSE.md`
- Third‑party notices: `THIRD-PARTY-NOTICES.md`
## Contributing
See `CONTRIBUTING.md`. PRs and plugins welcome!
## Community & Support
- Telegram Support Group: https://t.me/+i1RDBKJv0U01OTQ0
- Author: Kai Gartner — LinkedIn: https://linkedin.com/in/kaigartner — Instagram: https://instagram.com/kaigartner