![Kalavai logo](docs/docs/assets/icons/logo_no_background.png)
<div align="center">
![GitHub Release](https://img.shields.io/github/v/release/kalavai-net/kalavai-client) ![PyPI - Downloads](https://img.shields.io/pypi/dm/kalavai-client?style=social)
![GitHub contributors](https://img.shields.io/github/contributors/kalavai-net/kalavai-client) ![GitHub License](https://img.shields.io/github/license/kalavai-net/kalavai-client) ![GitHub Repo stars](https://img.shields.io/github/stars/kalavai-net/kalavai-client) [![Dynamic JSON Badge](https://img.shields.io/badge/dynamic/json?url=https%3A%2F%2Fdiscord.com%2Fapi%2Finvites%2FeA3sEWGB%3Fwith_counts%3Dtrue&query=%24.approximate_member_count&logo=discord&logoColor=white&label=Discord%20users&color=green)](https://discordapp.com/channels/1295009828623880313) [![Signup](https://img.shields.io/badge/Kalavai-Signup-brightgreen)](https://platform.kalavai.net)
</div>
⭐⭐⭐ **Kalavai and our LLM pools are open source and free to use for both commercial and non-commercial purposes. If you find it useful, consider supporting us by [giving a star to our GitHub project](https://github.com/kalavai-net/kalavai-client), joining our [discord channel](https://discord.gg/HJ8FNapQ), following our [Substack](https://kalavainet.substack.com/) and leaving us a [review on Product Hunt](https://www.producthunt.com/products/kalavai/reviews/new).**
# Kalavai: turn your devices into a scalable LLM platform
### Taming the adoption of Large Language Models
> Kalavai is an **open source** tool that turns **everyday devices** into your very own LLM platform. It aggregates resources from multiple machines, including desktops and laptops, and is **compatible with most model engines** to make LLM deployment and orchestration simple and reliable.
<div align="center">
<a href="https://www.producthunt.com/products/kalavai/reviews?utm_source=badge-product_review&utm_medium=badge&utm_souce=badge-kalavai" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/product_review.svg?product_id=720725&theme=neutral" alt="Kalavai - The first platform to crowdsource AI computation | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
</div>
## What can Kalavai do?
Kalavai's goal is to make using LLMs in real applications accessible and affordable to all. It's a _magic box_ that **integrates all the components required to make LLMs useful in the age of massive computing**: sourcing computing power, managing distributed infrastructure and storage, running industry-standard model engines, and orchestrating LLMs.
### Aggregate multiple devices in an LLM pool
https://github.com/user-attachments/assets/4be59886-1b76-4400-ab5c-c803e3e414ec
### Deploy LLMs across the pool
https://github.com/user-attachments/assets/ea57a2ab-3924-4097-be2a-504e0988fbb1
### Single point of entry for all models (GUI + API)
https://github.com/user-attachments/assets/7df73bbc-d129-46aa-8ce5-0735177dedeb
### Self-hosted LLM pools
https://github.com/user-attachments/assets/0d2316f3-79ea-46ac-b41e-8ef720f52672
### News updates
- 27 January 2025: Support for accessing pools from remote computers
- 9 January 2025: Added support for [Aphrodite Engine](https://github.com/aphrodite-engine/aphrodite-engine) models
- 8 January 2025: Release of [a free, public, shared pool](/docs/docs/public_llm_pool.md) for community LLM deployment
- 24 December 2024: Release of [public BOINC pool](/docs/docs/boinc.md) to donate computing to scientific projects
- 23 December 2024: Release of [public petals swarm](/docs/docs/petals.md)
- 24 November 2024: Common pools with private user spaces
- 30 October 2024: Release of our [public pool platform](https://platform.kalavai.net)
### Support for LLM engines
We currently support the following LLM engines out of the box:
- [vLLM](https://docs.vllm.ai/en/latest/)
- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- [Aphrodite Engine](https://github.com/aphrodite-engine/aphrodite-engine)
- [Petals](https://github.com/bigscience-workshop/petals)
Coming soon:
- [exo](https://github.com/exo-explore/exo)
- [GPUstack](https://docs.gpustack.ai/0.4/overview/)
- [RayServe](https://docs.ray.io/en/latest/serve/index.html)
Not what you were looking for? [Tell us](https://github.com/kalavai-net/kalavai-client/issues) what engines you'd like to see.
> Kalavai is at an **early stage** of its development. We encourage people to use it and give us feedback! Although we are trying to minimise breaking changes, these may occur until we have a stable version (v1.0).
## Want to know more?
- Get a free [Kalavai account](https://platform.kalavai.net) and access unlimited AI.
- Full [documentation](https://kalavai-net.github.io/kalavai-client/) for the project.
- [Join our Substack](https://kalavainet.substack.com/) for updates and be part of our community
- [Join our discord community](https://discord.gg/6VJWGzxg)
## Getting started
The `kalavai` client is the main tool for interacting with the Kalavai platform: creating and managing both local and public pools, and working with them (e.g. deploying models). Let's go over its installation.
From release **v0.5.0, you can install the `kalavai` client on non-worker computers**. You can run a pool on a set of machines and use the client from a remote computer to access the LLM pool. Because the client only requires Python, this expands the range of computers that can run it.
### Requirements for a worker machine
- A laptop, desktop or Virtual Machine
- Docker engine installed (for [Linux](https://docs.docker.com/engine/install/), [Windows and MacOS](https://docs.docker.com/desktop/)) with [privileged access](https://docs.docker.com/engine/containers/run/#runtime-privilege-and-linux-capabilities).
### Requirements to run the client
- Python 3.10+
If you see the following error:
```bash
fatal error: Python.h: No such file or directory | #include <Python.h>
```
Make sure you also install the `python3-dev` package. On Ubuntu-based distros:
```bash
sudo apt install python3-dev
```
### Install the client
The client is a Python package and can be installed with one command:
```bash
pip install kalavai-client
```
## Public LLM pools: crowdsource community resources
This is the **easiest and most powerful** way to experience Kalavai. It affords users the full resource capabilities of the community and access to all its deployed LLMs, via an [OpenAI-compatible endpoint](https://kalavai-net.github.io/kalavai-client/public_llm_pool/#single-api-endpoint) as well as a [UI-based playground](https://kalavai-net.github.io/kalavai-client/public_llm_pool/#ui-playground).
Check out [our guide](https://kalavai-net.github.io/kalavai-client/public_llm_pool/) on how to join and start deploying LLMs.
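As a quick illustration, here is a minimal sketch of querying an OpenAI-compatible endpoint from Python using only the standard library. The base URL, API key and model name below are placeholders, not real values; see the guide above for the actual endpoint details of the public pool.

```python
import json
import urllib.request

# Hypothetical values: substitute your pool's endpoint and credentials.
BASE_URL = "https://api.example-pool.net/v1"
API_KEY = "your-api-key"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(model: str, prompt: str) -> str:
    """POST a chat completion request and return the model's reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Any OpenAI client library pointed at the pool's base URL works the same way, since the endpoint follows the standard chat completions schema.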
## Create a local, private LLM pool
Kalavai is **free to use, no caps, for both commercial and non-commercial purposes**. All you need to get started is one or more computers that can see each other (i.e. within the same network), and you are good to go. If you wish to join computers in different locations or networks, check [managed kalavai](#public-llm-pools-crowdsource-community-resources).
### 1. Start a seed node
Simply use the client to start your seed node:
```bash
kalavai pool start <pool-name>
```
Now you are ready to add worker nodes to this seed. To do so, generate a joining token:
```bash
$ kalavai pool token --user
Join token: <token>
```
### 2. Add worker nodes
Increase the power of your AI pool by inviting others to join.
Copy the joining token. On the worker node, run:
```bash
kalavai pool join <token>
```
### Enough already, let's run stuff!
Check our [examples](examples/) to put your new AI pool to good use!
- [Single node vLLM GPU LLM](examples/singlenode_gpu_vllm.md) deployment
- [Multi node vLLM GPU LLM](examples/multinode_gpu_vllm.md) deployment
- [Aphrodite-engine quantized LLM](examples/quantized_gpu_llm.md) deployment, including Kobold interface
- [Ray cluster](examples/ray_cluster.md) for distributed computation.
## Compatibility matrix
If your system is not currently supported, [open an issue](https://github.com/kalavai-net/kalavai-client/issues) and request it. We are expanding this list constantly.
### OS compatibility
Since **worker nodes** run inside Docker, any machine that can run Docker **should** be compatible with Kalavai. Here are instructions for [Linux](https://docs.docker.com/engine/install/), [Windows](https://docs.docker.com/desktop/setup/install/windows-install/) and [MacOS](https://docs.docker.com/desktop/setup/install/mac-install/).
The kalavai client, which controls and accesses pools, can be installed on any machine with Python 3.10+.
### Hardware compatibility
- `amd64` or `x86_64` CPU architecture
- NVIDIA GPU
- AMD and Intel GPUs are currently not supported ([interested in helping us test it?](https://kalavai-net.github.io/kalavai-client/compatibility/#help-testing-amd-gpus))
## Roadmap
- [x] Kalavai client on Linux
- [x] [TEMPLATE] Distributed LLM deployment
- [x] Kalavai client on Windows (with WSL2)
- [x] Public LLM pools
- [x] Self-hosted LLM pools
- [x] Collaborative LLM deployment
- [x] Ray cluster support
- [x] Kalavai client on Mac
- [ ] [TEMPLATE] [GPUStack](https://github.com/gpustack/gpustack) support
- [ ] [TEMPLATE] [exo](https://github.com/exo-explore/exo) support
- [ ] Support for AMD GPUs
- [x] Docker install path
Anything missing here? Give us a shout in the [discussion board](https://github.com/kalavai-net/kalavai-client/discussions)
## Contribute
- PRs welcome!
- [Join the community](https://github.com/kalavai-net/kalavai-client/) and share ideas!
- Report [bugs, issues and new features](https://github.com/kalavai-net/kalavai-client/issues).
- Help improve our [compatibility matrix](#compatibility-matrix) by testing on different operating systems.
- [Follow our Substack channel](https://kalavainet.substack.com/) for news, guides and more.
- [Community integrations](https://github.com/kalavai-net/kube-watcher/tree/main/templates) are template jobs built by Kalavai and the community that make deploying distributed workflows easy for users. Anyone can extend them and contribute to the repo.
### Star History
[![Star History Chart](https://api.star-history.com/svg?repos=kalavai-net/kalavai-client&type=Date)](https://star-history.com/#kalavai-net/kalavai-client&Date)
## Build from source
### Requirements
Python version < 3.12 (the instructions below use 3.10).
```bash
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install -y python3.10 python3.10-dev python3.10-venv
virtualenv -p python3.10 env
source env/bin/activate
pip install -e .[dev]
```
Build python wheels:
```bash
bash publish.sh build
```
### Unit tests
To run the unit tests, use:
```bash
python -m unittest
```