Name | oblix |
Version | 0.1.2 |
home_page | None |
Summary | AI Model Orchestration SDK |
upload_time | 2025-04-09 02:17:56 |
maintainer | None |
docs_url | None |
author | mdstch17 |
requires_python | >=3.8 |
license | None |
keywords | ai, sdk, model, orchestration |
requirements | No requirements were recorded. |
# Oblix: AI Orchestration SDK
Oblix is an AI orchestration SDK that intelligently routes between local and cloud models based on system conditions, providing a unified interface regardless of the underlying model provider.
For complete documentation, please visit: [documentation.oblix.ai](https://documentation.oblix.ai)
## Features
- **Unified Model Interface**: Consistent API across different model providers (OpenAI, Claude, Ollama)
- **Intelligent Orchestration**: Automatic switching between local and cloud models based on system resources and connectivity (a schematic of this decision follows the list)
- **Session Management**: Create, load, export, and import conversation sessions
- **Streaming Support**: Real-time token streaming
- **Extensible Agent System**: Custom monitoring and decision-making policies
- **OpenAI-Compatible API**: Drop-in compatibility with applications built for OpenAI's API
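
The "Intelligent Orchestration" bullet is the core of the SDK, so it helps to picture the kind of decision it makes. The sketch below is illustrative only, not Oblix's actual policy; the thresholds and signal names are invented for the example, and the real logic lives inside the SDK's agents:

```python
# Illustrative schematic of an orchestration decision, NOT Oblix's
# actual implementation. Thresholds and signal names are invented.
def pick_target(cpu_load: float, connected: bool, latency_ms: float) -> str:
    if not connected or latency_ms > 500:
        return "local"   # cloud unreachable or too slow: stay on-device
    if cpu_load > 0.85:
        return "cloud"   # local machine is saturated: offload
    return "local"       # otherwise prefer the on-device model

print(pick_target(cpu_load=0.30, connected=True, latency_ms=42.0))  # local
```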
## Getting Started
### Installation
```bash
pip install oblix
```
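
To confirm the install picked up the expected version, the standard library's `importlib.metadata` works for any package; nothing here is Oblix-specific:

```python
from importlib.metadata import version

# Report the installed package version
print(version("oblix"))  # e.g. 0.1.2
```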
### Basic Usage
```python
import asyncio

from oblix import OblixClient, ModelType
from oblix.agents.resource_monitor import ResourceMonitor
from oblix.agents.connectivity import ConnectivityAgent

async def main():
    # Initialize client
    client = OblixClient()

    # Connect to models
    await client.hook_model(ModelType.OLLAMA, "llama2", endpoint="http://localhost:11434")
    await client.hook_model(ModelType.OPENAI, "gpt-3.5-turbo", api_key="your_openai_key")

    # Add monitoring agents
    client.hook_agent(ResourceMonitor())
    client.hook_agent(ConnectivityAgent())

    # Execute a prompt (with automatic model selection)
    response = await client.execute("Your prompt here")

    # Stream responses
    await client.execute_streaming("Your prompt here")

    # Manage sessions
    session_id = await client.create_session("Chat Title")

asyncio.run(main())
```
## OpenAI-Compatible API
Oblix provides an OpenAI-compatible endpoint that lets the standard OpenAI Python client drive Oblix's orchestration capabilities:
### Starting the Server
```bash
# Start the Oblix server
uvicorn oblix.main:app --host 0.0.0.0 --port 8000
```
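
Before pointing an application at the server, it is worth checking that it answers. The snippet below assumes the server implements the standard `/v1/models` listing route, as OpenAI-compatible servers typically do; if it does not, any small chat completion request serves the same purpose:

```python
from openai import OpenAI

# Liveness check against the local Oblix server. Assumes the standard
# /v1/models route is served; authentication is not required.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="any-value")
for model in client.models.list():
    print(model.id)
```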
### Client Usage
```python
from openai import OpenAI

# Point to the local Oblix server
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="any-value"  # Authentication not required
)

# Use it just like the OpenAI client
completion = client.chat.completions.create(
    model="ollama:llama2",  # Use format "provider:model" or just "model_name"
    messages=[
        {"role": "system", "content": "You are a helpful AI assistant."},
        {"role": "user", "content": "Hello world!"}
    ],
    temperature=0.7,
)

# Access the response
print(completion.choices[0].message.content)

# Streaming also works
stream = client.chat.completions.create(
    model="openai:gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a poem"}],
    stream=True,
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
## Orchestration Features
The OpenAI-compatible endpoint supports:
- Specifying models in "provider:name" format (e.g., "openai:gpt-3.5-turbo", "ollama:llama2")
- Using a generic model identifier to let Oblix automatically select the best model (both naming forms are sketched below this list)
- Streaming responses
- Full context handling for conversations
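
Concretely, the two naming forms look like this. The `ollama:llama2` spelling is confirmed by the examples above; the bare identifier in the second call (`"auto"`) is a placeholder invented for illustration, so check [documentation.oblix.ai](https://documentation.oblix.ai) for the generic name Oblix actually expects:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="any-value")

# Explicit provider:model routing (format shown in the examples above).
pinned = client.chat.completions.create(
    model="ollama:llama2",
    messages=[{"role": "user", "content": "Hello"}],
)

# Generic identifier: Oblix selects the model. "auto" is a placeholder
# name for illustration only; see documentation.oblix.ai for the real one.
routed = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Hello"}],
)

print(pinned.model, routed.model)  # shows which model served each call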
## Examples
Check the `examples/` directory for more usage examples:
- `demo.py`: Basic usage of the Oblix client
- `connectivity_demo.py`: Demonstrates automatic switching based on network conditions
- `session_management_demo.py`: Shows session creation and management
- `openai_compatibility_demo.py`: Shows how to use the OpenAI client with Oblix