Name | volt-llm-client |
Version | 0.1.2 |
home_page | https://github.com/stuarttempleton/volt-llm-client |
Summary | A lightweight client for local LLM APIs like Ollama or OpenWebUI |
upload_time | 2025-09-14 20:19:24 |
maintainer | None |
docs_url | None |
author | Voltur |
requires_python | >=3.7 |
license | MIT License
Copyright (c) 2025 Voltur
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
|
keywords | logging, structured, volt |
VCS | GitHub |
bugtrack_url | None |
requirements | requests, volt-logger (>=0.1.0) |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# volt-llm-client
**volt-llm-client** is a lightweight Python client for interacting with local LLM APIs such as [Ollama](https://ollama.com/) or [OpenWebUI](https://github.com/open-webui/open-webui).
It supports prompt completion, multi-turn conversations, and model listing — perfect for scripts, prototypes, or CLI tools.
---
## Features
- Send single prompts or full conversations
- Compatible with Ollama/OpenWebUI-style APIs
- Token-based authentication support
- Clean log output using [volt-logger](https://github.com/stuarttempleton/volt-logger)
- Minimal dependencies (just `requests` and `volt-logger`)
---
## Installation
Install both `volt-llm-client` and its logging dependency:
```bash
pip install volt-llm-client
```
For local development:
```bash
git clone https://github.com/stuarttempleton/volt-llm-client.git
cd volt-llm-client
pip install -r requirements.txt
```
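Either way, a quick import check confirms the install; the import paths below match the usage examples that follow:

```python
# Smoke test: both public classes should import cleanly after installation.
from voltllmclient import LLMClient, LLMConversation

print(LLMClient, LLMConversation)
```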
---
## Usage Example
```python
import os
from voltllmclient import LLMClient

llm = LLMClient(
    token=os.getenv("LLM_API_TOKEN"),
    model="Gemma3"
)

reply = llm.send_prompt("What is the capital of France?")
print(reply)
```
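Since the client's only runtime dependency is `requests`, a reasonable (though undocumented) assumption is that transport failures propagate as `requests` exceptions. A defensive sketch under that assumption:

```python
import os

import requests

from voltllmclient import LLMClient

llm = LLMClient(token=os.getenv("LLM_API_TOKEN"), model="Gemma3")

# Assumption: HTTP/transport errors from the underlying requests call are not
# swallowed by the client; adjust if the library reports errors differently.
try:
    reply = llm.send_prompt("What is the capital of France?")
    print(reply)
except requests.RequestException as err:
    print(f"LLM request failed: {err}")
```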
You can also send a full conversation:
```python
messages = [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Who won the World Cup in 2018?" }
]

response = llm.send_conversation(messages)
print(response)
```
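The feature list also mentions model listing. The exact method name is not documented in this README, so `list_models()` below is a hypothetical placeholder; check the client's source for the actual call:

```python
import os

from voltllmclient import LLMClient

llm = LLMClient(token=os.getenv("LLM_API_TOKEN"), model="Gemma3")

# Hypothetical method name: the README advertises model listing, but the
# exact API is not shown here. Substitute the real method from the source.
for model in llm.list_models():
    print(model)
```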
---
## Maintaining Context with LLMConversation
```python
import os

from voltllmclient import LLMConversation

conv = LLMConversation(model="gemma3", token=os.getenv("LLM_API_TOKEN"))

# First turn.
response = conv.send("What is quantum computing?")
print(response)

# Follow-up that sends the accumulated history along with the new prompt.
response = conv.send_with_full_context("How is it different from classical computing?")
print(response)

# Persist the conversation to disk.
conv.save_transcript("session.json")
```
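The saved transcript can be inspected with the standard library. Its exact schema isn't documented here, but assuming `save_transcript` writes plain JSON:

```python
import json

# Load the transcript saved above. Assumption: the file is plain JSON
# (e.g., a list of role/content messages); the schema is not documented.
with open("session.json") as f:
    transcript = json.load(f)

print(transcript)
```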
---
## Environment Variables
Set your API token via an environment variable:
**Unix/macOS:**
```bash
export LLM_API_TOKEN=your_token_here
```
**PowerShell:**
```powershell
$env:LLM_API_TOKEN = "your_token_here"
```
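In Python, failing fast when the token is missing avoids confusing authentication errors later; a minimal sketch:

```python
import os

# Fail fast if the token is missing rather than constructing an
# unauthenticated client and failing later at request time.
token = os.getenv("LLM_API_TOKEN")
if not token:
    raise RuntimeError("LLM_API_TOKEN is not set; export it before running.")
```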
---
## License
[MIT](LICENSE)
## Raw data

```json
{
  "_id": null,
  "home_page": "https://github.com/stuarttempleton/volt-llm-client",
  "name": "volt-llm-client",
  "maintainer": null,
  "docs_url": null,
  "requires_python": ">=3.7",
  "maintainer_email": null,
  "keywords": "logging, structured, volt",
  "author": "Voltur",
  "author_email": null,
  "download_url": "https://files.pythonhosted.org/packages/6e/9b/3d4e8cce1e781f4b77337b73a6a6553e680abf4c7868752080875fa21f36/volt_llm_client-0.1.2.tar.gz",
  "platform": null,
  "description": "# volt-llm-client\n\n**volt-llm-client** is a lightweight Python client for interacting with local LLM APIs such as [Ollama](https://ollama.com/) or [OpenWebUI](https://github.com/open-webui/open-webui). \nIt supports prompt completion, multi-turn conversations, and model listing \u2014 perfect for scripts, prototypes, or CLI tools.\n\n---\n\n## Features\n\n- Send single prompts or full conversations\n- Compatible with Ollama/OpenWebUI-style APIs\n- Token-based authentication support\n- Clean log output using [volt-logger](https://github.com/stuarttempleton/volt-logger)\n- Minimal dependencies (`requests` only)\n\n---\n\n## Installation\n\nInstall both `volt-llm-client` and its logging dependency:\n\n```bash\npip install volt-llm-client\n````\n\nFor local development:\n\n```bash\ngit clone https://github.com/stuarttempleton/volt-llm-client.git\ncd volt-llm-client\npip install -r requirements.txt\n```\n\n---\n\n## Usage Example\n\n```python\nimport os\nfrom voltllmclient import LLMClient\n\nllm = LLMClient(\n token=os.getenv(\"LLM_API_TOKEN\"),\n model=\"Gemma3\"\n)\n\nreply = llm.send_prompt(\"What is the capital of France?\")\nprint(reply)\n```\n\nYou can also send a full conversation:\n\n```python\nmessages = [\n { \"role\": \"system\", \"content\": \"You are a helpful assistant.\" },\n { \"role\": \"user\", \"content\": \"Who won the World Cup in 2018?\" }\n]\n\nresponse = llm.send_conversation(messages)\nprint(response)\n```\n\n---\n\n## Maintaining Context with LLMConversation\n\n```python\nfrom voltllmclient import LLMConversation\nimport os\n\nconv = LLMConversation(model=\"gemma3\", token=os.getenv(\"LLM_API_TOKEN\"))\n\nresponse = conv.send(\"What is quantum computing?\")\nprint(response)\n\nresponse = conv.send_with_full_context(\"How is it different from classical computing?\")\nprint(response)\n\nconv.save_transcript(\"session.json\")\n```\n\n---\n\n## Environment Variables\n\nSet your API token via environment variable:\n\n**Unix/macOS:**\n\n```bash\nexport LLM_API_TOKEN=your_token_here\n```\n\n**PowerShell:**\n\n```powershell\n$env:LLM_API_TOKEN = \"your_token_here\"\n```\n\n---\n\n## License\n\n[MIT](LICENSE)\n\n",
  "bugtrack_url": null,
  "license": "MIT License\n \n Copyright (c) 2025 Voltur\n \n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n \n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n \n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE.\n ",
  "summary": "A lightweight client for local LLM APIs like Ollama or OpenWebUI",
  "version": "0.1.2",
  "project_urls": {
    "Bug Tracker": "https://github.com/stuarttempleton/volt-llm-client/issues",
    "Homepage": "https://github.com/stuarttempleton/volt-llm-client",
    "Repository": "https://github.com/stuarttempleton/volt-llm-client"
  },
  "split_keywords": [
    "logging",
    " structured",
    " volt"
  ],
  "urls": [
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "e4a8893a3c99f2185b2497dcf6a46d8e79ef33a70fb13f460aa0af9a90ef4c0a",
        "md5": "78345334898c3184343456c4f61c66dd",
        "sha256": "a3a8c05688147aaacdb3afa5ec001e20e6e28551a347baea74869ef02ec05202"
      },
      "downloads": -1,
      "filename": "volt_llm_client-0.1.2-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "78345334898c3184343456c4f61c66dd",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": ">=3.7",
      "size": 6520,
      "upload_time": "2025-09-14T20:19:23",
      "upload_time_iso_8601": "2025-09-14T20:19:23.132393Z",
      "url": "https://files.pythonhosted.org/packages/e4/a8/893a3c99f2185b2497dcf6a46d8e79ef33a70fb13f460aa0af9a90ef4c0a/volt_llm_client-0.1.2-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "6e9b3d4e8cce1e781f4b77337b73a6a6553e680abf4c7868752080875fa21f36",
        "md5": "cfdde9e18c6a51d42027e3f3a4174266",
        "sha256": "1af58119c3aff06bca19e12cac53235f2b2a64ab0d2c1781a95cf9f5db7f1711"
      },
      "downloads": -1,
      "filename": "volt_llm_client-0.1.2.tar.gz",
      "has_sig": false,
      "md5_digest": "cfdde9e18c6a51d42027e3f3a4174266",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": ">=3.7",
      "size": 6717,
      "upload_time": "2025-09-14T20:19:24",
      "upload_time_iso_8601": "2025-09-14T20:19:24.607532Z",
      "url": "https://files.pythonhosted.org/packages/6e/9b/3d4e8cce1e781f4b77337b73a6a6553e680abf4c7868752080875fa21f36/volt_llm_client-0.1.2.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2025-09-14 20:19:24",
  "github": true,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "github_user": "stuarttempleton",
  "github_project": "volt-llm-client",
  "travis_ci": false,
  "coveralls": false,
  "github_actions": true,
  "requirements": [
    {
      "name": "requests",
      "specs": []
    },
    {
      "name": "volt-logger",
      "specs": [
        [
          ">=",
          "0.1.0"
        ]
      ]
    }
  ],
  "lcname": "volt-llm-client"
}
```
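The digests above allow local verification of a downloaded artifact; a minimal sketch using only the standard library (filename and sha256 taken from the sdist entry above):

```python
import hashlib

# sha256 of volt_llm_client-0.1.2.tar.gz, copied from the metadata above.
EXPECTED = "1af58119c3aff06bca19e12cac53235f2b2a64ab0d2c1781a95cf9f5db7f1711"

with open("volt_llm_client-0.1.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, f"sha256 mismatch: {digest}"
print("sha256 OK")
```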