Name | llmswap |
Version | 1.0.3 |
home_page | None |
Summary | Simple interface for any LLM provider |
upload_time | 2025-08-03 17:26:41 |
maintainer | None |
docs_url | None |
author | None |
requires_python | >=3.8 |
license | MIT License
Copyright (c) 2025 Sreenath Menon
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. |
keywords | ai, anthropic, chatbot, claude, gemini, gpt, llm, ollama, openai |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# llmswap
[PyPI version](https://badge.fury.io/py/llmswap)
[Python 3.8+](https://www.python.org/downloads/)
[License: MIT](https://opensource.org/licenses/MIT)
Simple interface for any LLM provider. Switch between Anthropic, OpenAI, Google, and local models with one line of code.
## Why llmswap?
- Easy switching between LLM providers
- Zero configuration - works with environment variables
- Automatic fallback when providers fail
- Simple API - same interface for all providers
## Quick Start
### Installation
```bash
pip install llmswap
```
### Basic Usage
```python
from llmswap import LLMClient
client = LLMClient()
response = client.query("What is Python?")
print(response.content)
```
### Set API Keys
```bash
# Choose your provider
export ANTHROPIC_API_KEY="your-key-here"
# OR
export OPENAI_API_KEY="your-key-here"
# OR
export GEMINI_API_KEY="your-key-here"
# OR run Ollama locally
```
## Usage Examples
### Provider Auto-Detection
```python
from llmswap import LLMClient
client = LLMClient()
print(f"Using: {client.get_current_provider()}")
response = client.query("Explain machine learning")
print(response.content)
```
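Under the hood, auto-detection has to pick whichever provider already has credentials available. Here is a minimal sketch of that kind of selection logic; the priority order is an assumption for illustration, not llmswap's documented behavior:

```python
import os

def detect_provider():
    # Probe environment variables in an assumed priority order;
    # llmswap's actual ordering may differ.
    if os.environ.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    if os.environ.get("OPENAI_API_KEY"):
        return "openai"
    if os.environ.get("GEMINI_API_KEY"):
        return "gemini"
    return "ollama"  # no API key found: assume a local Ollama server

print(detect_provider())
```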
### Specify Provider
```python
client = LLMClient(provider="anthropic")
client = LLMClient(provider="openai")
client = LLMClient(provider="gemini")
client = LLMClient(provider="ollama")
```
### Custom Models
```python
client = LLMClient(provider="anthropic", model="claude-3-opus-20240229")
client = LLMClient(provider="openai", model="gpt-4")
client = LLMClient(provider="gemini", model="gemini-1.5-pro")
```
### Provider Switching
```python
client = LLMClient(provider="anthropic")
client.set_provider("openai")
client.set_provider("gemini", model="gemini-1.5-flash")
```
### Response Details
```python
response = client.query("What is OpenStack?")
print(f"Content: {response.content}")
print(f"Provider: {response.provider}")
print(f"Model: {response.model}")
print(f"Latency: {response.latency:.2f}s")
```
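These four attributes are the documented surface of the response object. For orientation, its shape is roughly equivalent to the dataclass below (an illustration only, not llmswap's actual class; it may expose more fields):

```python
from dataclasses import dataclass

@dataclass
class ResponseShape:
    # Fields mirror the attributes used in the example above.
    content: str    # the generated text
    provider: str   # e.g. "anthropic"
    model: str      # e.g. "claude-3-opus-20240229"
    latency: float  # round-trip time in seconds
```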
### Automatic Fallback
```python
client = LLMClient(fallback=True)
response = client.query("Hello world")
print(f"Succeeded with: {response.provider}")
```
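If you want explicit control over the order in which providers are tried, the same effect can be approximated by hand with the documented `set_provider()` and `query()` calls. A sketch, assuming a misconfigured or failing provider raises an exception (the provider order here is arbitrary):

```python
from llmswap import LLMClient

def query_with_fallback(prompt, providers=("anthropic", "openai", "gemini", "ollama")):
    client = LLMClient(fallback=False)
    last_error = None
    for name in providers:
        try:
            client.set_provider(name)
            return client.query(prompt)
        except Exception as exc:  # unconfigured provider or failed request
            last_error = exc
    raise RuntimeError("All providers failed") from last_error

response = query_with_fallback("Hello world")
print(f"Succeeded with: {response.provider}")
```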
## Supported Providers
| Provider | Models | Setup |
|----------|---------|-------|
| **Anthropic** | Claude 3 (Sonnet, Haiku, Opus) | `export ANTHROPIC_API_KEY=...` |
| **OpenAI** | GPT-3.5, GPT-4, GPT-4o | `export OPENAI_API_KEY=...` |
| **Google** | Gemini 1.5 (Flash, Pro) | `export GEMINI_API_KEY=...` |
| **Ollama** | Llama, Mistral, etc. | Run Ollama locally |
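The three hosted providers only need their key exported; Ollama instead needs its local server running (port 11434 by default). A quick pre-flight check, assuming the standard Ollama HTTP endpoint:

```python
import urllib.request
import urllib.error

def ollama_running(url="http://localhost:11434"):
    # A running Ollama server answers a plain HTTP GET on its root URL.
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if ollama_running():
    from llmswap import LLMClient
    client = LLMClient(provider="ollama")
```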
## Real-World Example
### Chatbot Integration
```python
from llmswap import LLMClient
class SimpleChatbot:
    def __init__(self):
        self.llm = LLMClient()

    def chat(self, message):
        response = self.llm.query(f"User: {message}\nAssistant:")
        return response.content

    def get_provider(self):
        return f"Using {self.llm.get_current_provider()}"

# Usage
bot = SimpleChatbot()
print(bot.chat("Hello!"))
print(bot.get_provider())
```
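Note that `SimpleChatbot` sends each message in isolation, so the model sees no prior turns. Since `query()` takes a single string, one way to carry context is to fold the running transcript into each prompt. This is a sketch of that pattern, not a built-in llmswap feature:

```python
class HistoryChatbot(SimpleChatbot):
    def __init__(self):
        super().__init__()
        self.history = []  # alternating "User:"/"Assistant:" lines

    def chat(self, message):
        self.history.append(f"User: {message}")
        prompt = "\n".join(self.history) + "\nAssistant:"
        reply = self.llm.query(prompt).content
        self.history.append(f"Assistant: {reply}")
        return reply
```

For long conversations you would also want to truncate `self.history` to stay within the model's context window.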
### Migration from Existing Code
```python
# BEFORE: Direct provider usage
import openai
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}]
)
content = response.choices[0].message.content
# AFTER: llmswap (works with any provider!)
from llmswap import LLMClient
client = LLMClient()
response = client.query("Hello")
content = response.content
```
## Configuration
### Environment Variables
```bash
# API Keys (set at least one)
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
export GEMINI_API_KEY="your-gemini-key"
# Ollama (if using local models)
export OLLAMA_URL="http://localhost:11434" # default
```
### Programmatic Configuration
```python
# With API key
client = LLMClient(
    provider="anthropic",
    api_key="your-key-here"
)

# With custom model
client = LLMClient(
    provider="openai",
    model="gpt-4-turbo-preview"
)

# Disable fallback
client = LLMClient(fallback=False)
```
## Advanced Features
### Check Available Providers
```python
client = LLMClient()
# List configured providers
available = client.list_available_providers()
print(f"Available: {available}")
# Check specific provider
if client.is_provider_available("anthropic"):
    client.set_provider("anthropic")
```
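Combined with the response metadata shown earlier, these calls make it easy to compare whatever happens to be configured. This loop uses only methods and attributes documented above:

```python
client = LLMClient()
prompt = "Summarize what an LLM is in one sentence."

# Send the same prompt to every configured provider and compare latency
for name in client.list_available_providers():
    client.set_provider(name)
    response = client.query(prompt)
    print(f"{response.provider} ({response.model}): {response.latency:.2f}s")
```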
## License
MIT License - see [LICENSE](LICENSE) file for details.
## Links
- **GitHub**: https://github.com/sreenathmmenon/llmswap
- **PyPI**: https://pypi.org/project/llmswap/
- **Issues**: https://github.com/sreenathmmenon/llmswap/issues
---
Star this repo if llmswap helps simplify your LLM integration.
Raw data
{
"_id": null,
"home_page": null,
"name": "llmswap",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": "ai, anthropic, chatbot, claude, gemini, gpt, llm, ollama, openai",
"author": null,
"author_email": "Sreenath Menon <zreenathmenon@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/fe/2f/964284c01a42fb2800e81e1a92c3c0e3f3063062e70a248dee1d3ad6af45/llmswap-1.0.3.tar.gz",
"platform": null,
"description": "# llmswap\n\n[](https://badge.fury.io/py/llmswap)\n[](https://www.python.org/downloads/)\n[](https://opensource.org/licenses/MIT)\n\nSimple interface for any LLM provider. Switch between Anthropic, OpenAI, Google, and local models with one line of code.\n\n## Why llmswap?\n\n- Easy switching between LLM providers\n- Zero configuration - works with environment variables \n- Automatic fallback when providers fail\n- Simple API - same interface for all providers\n\n## Quick Start\n\n### Installation\n```bash\npip install llmswap\n```\n\n### Basic Usage\n```python\nfrom llmswap import LLMClient\n\nclient = LLMClient()\nresponse = client.query(\"What is Python?\")\nprint(response.content)\n```\n\n### Set API Keys\n```bash\n# Choose your provider\nexport ANTHROPIC_API_KEY=\"your-key-here\"\n# OR\nexport OPENAI_API_KEY=\"your-key-here\" \n# OR\nexport GEMINI_API_KEY=\"your-key-here\"\n# OR run Ollama locally\n```\n\n## Usage Examples\n\n### Provider Auto-Detection\n```python\nfrom llmswap import LLMClient\n\nclient = LLMClient()\nprint(f\"Using: {client.get_current_provider()}\")\n\nresponse = client.query(\"Explain machine learning\")\nprint(response.content)\n```\n\n### Specify Provider \n```python\nclient = LLMClient(provider=\"anthropic\")\nclient = LLMClient(provider=\"openai\")\nclient = LLMClient(provider=\"gemini\")\nclient = LLMClient(provider=\"ollama\")\n```\n\n### Custom Models\n```python\nclient = LLMClient(provider=\"anthropic\", model=\"claude-3-opus-20240229\")\nclient = LLMClient(provider=\"openai\", model=\"gpt-4\")\nclient = LLMClient(provider=\"gemini\", model=\"gemini-1.5-pro\")\n```\n\n### Provider Switching\n```python\nclient = LLMClient(provider=\"anthropic\")\n\nclient.set_provider(\"openai\")\nclient.set_provider(\"gemini\", model=\"gemini-1.5-flash\")\n```\n\n### Response Details\n```python\nresponse = client.query(\"What is OpenStack?\")\n\nprint(f\"Content: {response.content}\")\nprint(f\"Provider: {response.provider}\")\nprint(f\"Model: {response.model}\")\nprint(f\"Latency: {response.latency:.2f}s\")\n```\n\n### Automatic Fallback\n```python\nclient = LLMClient(fallback=True)\nresponse = client.query(\"Hello world\")\nprint(f\"Succeeded with: {response.provider}\")\n```\n\n\n## Supported Providers\n\n| Provider | Models | Setup |\n|----------|---------|-------|\n| **Anthropic** | Claude 3 (Sonnet, Haiku, Opus) | `export ANTHROPIC_API_KEY=...` |\n| **OpenAI** | GPT-3.5, GPT-4, GPT-4o | `export OPENAI_API_KEY=...` |\n| **Google** | Gemini 1.5 (Flash, Pro) | `export GEMINI_API_KEY=...` |\n| **Ollama** | Llama, Mistral, etc. 
| Run Ollama locally |\n\n## Real-World Example\n\n### Chatbot Integration\n```python\nfrom llmswap import LLMClient\n\nclass SimpleChatbot:\n def __init__(self):\n self.llm = LLMClient()\n \n def chat(self, message):\n response = self.llm.query(f\"User: {message}\\nAssistant:\")\n return response.content\n \n def get_provider(self):\n return f\"Using {self.llm.get_current_provider()}\"\n\n# Usage\nbot = SimpleChatbot()\nprint(bot.chat(\"Hello!\"))\nprint(bot.get_provider())\n```\n\n### Migration from Existing Code\n```python\n# BEFORE: Direct provider usage\nimport openai\nclient = openai.OpenAI()\nresponse = client.chat.completions.create(\n model=\"gpt-3.5-turbo\",\n messages=[{\"role\": \"user\", \"content\": \"Hello\"}]\n)\ncontent = response.choices[0].message.content\n\n# AFTER: llmswap (works with any provider!)\nfrom llmswap import LLMClient\nclient = LLMClient()\nresponse = client.query(\"Hello\")\ncontent = response.content\n```\n\n## Configuration\n\n### Environment Variables\n```bash\n# API Keys (set at least one)\nexport ANTHROPIC_API_KEY=\"your-anthropic-key\"\nexport OPENAI_API_KEY=\"your-openai-key\"\nexport GEMINI_API_KEY=\"your-gemini-key\"\n\n# Ollama (if using local models)\nexport OLLAMA_URL=\"http://localhost:11434\" # default\n```\n\n### Programmatic Configuration\n```python\n# With API key\nclient = LLMClient(\n provider=\"anthropic\", \n api_key=\"your-key-here\"\n)\n\n# With custom model\nclient = LLMClient(\n provider=\"openai\",\n model=\"gpt-4-turbo-preview\"\n)\n\n# Disable fallback\nclient = LLMClient(fallback=False)\n```\n\n## Advanced Features\n\n### Check Available Providers\n```python\nclient = LLMClient()\n\n# List configured providers\navailable = client.list_available_providers()\nprint(f\"Available: {available}\")\n\n# Check specific provider\nif client.is_provider_available(\"anthropic\"):\n client.set_provider(\"anthropic\")\n```\n\n## License\n\nMIT License - see [LICENSE](LICENSE) file for details.\n\n## Links\n\n- **GitHub**: https://github.com/sreenathmmenon/llmswap\n- **PyPI**: https://pypi.org/project/llmswap/\n- **Issues**: https://github.com/sreenathmmenon/llmswap/issues\n\n---\n\nStar this repo if llmswap helps simplify your LLM integration.",
"bugtrack_url": null,
"license": "MIT License\n \n Copyright (c) 2025 Sreenath Menon\n \n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n \n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n \n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE.",
"summary": "Simple interface for any LLM provider",
"version": "1.0.3",
"project_urls": {
"Changelog": "https://github.com/sreenathmmenon/llmswap/blob/main/CHANGELOG.md",
"Documentation": "https://github.com/sreenathmmenon/llmswap#readme",
"Homepage": "https://github.com/sreenathmmenon/llmswap",
"Issues": "https://github.com/sreenathmmenon/llmswap/issues",
"Repository": "https://github.com/sreenathmmenon/llmswap"
},
"split_keywords": [
"ai",
" anthropic",
" chatbot",
" claude",
" gemini",
" gpt",
" llm",
" ollama",
" openai"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "74e415d006aa1226d4a930586937c6c580ae9d4cdbb7d061786c37c6ef0785c3",
"md5": "0dc273260efe46b81b283bf5776a1ea5",
"sha256": "02de1b092ac3e6e60753c4094fda6c972ea2a8060f728748747cae1c6acdb1e9"
},
"downloads": -1,
"filename": "llmswap-1.0.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "0dc273260efe46b81b283bf5776a1ea5",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 9094,
"upload_time": "2025-08-03T17:26:40",
"upload_time_iso_8601": "2025-08-03T17:26:40.175033Z",
"url": "https://files.pythonhosted.org/packages/74/e4/15d006aa1226d4a930586937c6c580ae9d4cdbb7d061786c37c6ef0785c3/llmswap-1.0.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "fe2f964284c01a42fb2800e81e1a92c3c0e3f3063062e70a248dee1d3ad6af45",
"md5": "555386ba49332b477210db8ab9591d86",
"sha256": "9c14a385b7b2b60270c2c0e49b829ca2f69979ca99eacc0de496b1543995f99a"
},
"downloads": -1,
"filename": "llmswap-1.0.3.tar.gz",
"has_sig": false,
"md5_digest": "555386ba49332b477210db8ab9591d86",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 8475,
"upload_time": "2025-08-03T17:26:41",
"upload_time_iso_8601": "2025-08-03T17:26:41.954967Z",
"url": "https://files.pythonhosted.org/packages/fe/2f/964284c01a42fb2800e81e1a92c3c0e3f3063062e70a248dee1d3ad6af45/llmswap-1.0.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-03 17:26:41",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "sreenathmmenon",
"github_project": "llmswap",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "llmswap"
}