# metorial-togetherai
Together AI provider integration for Metorial - enables using Metorial tools with Together AI's language models through OpenAI-compatible function calling.
## Installation
```bash
pip install metorial-togetherai
# or
uv add metorial-togetherai
# or
poetry add metorial-togetherai
```
## Features
- 🤖 **Together AI Integration**: Full support for Llama, Mixtral, and other Together AI models
- 🛠️ **Function Calling**: OpenAI-compatible function calling support
- 📡 **Session Management**: Automatic tool lifecycle handling
- 🔄 **Format Conversion**: Converts Metorial tools to OpenAI function format
- ⚡ **Async Support**: Full async/await support
## Usage
### Basic Usage
```python
import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial_togetherai import MetorialTogetherAISession

async def main():
    # Initialize clients
    metorial = Metorial(api_key="your-metorial-api-key")

    # Together AI uses an OpenAI-compatible client
    together_client = OpenAI(
        api_key="your-together-api-key",
        base_url="https://api.together.xyz/v1"
    )

    # Create session with your server deployments
    async with metorial.session(["your-server-deployment-id"]) as session:
        # Create Together AI-specific wrapper
        together_session = MetorialTogetherAISession(session.tool_manager)

        messages = [
            {"role": "user", "content": "Help me with this task"}
        ]

        response = together_client.chat.completions.create(
            model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
            messages=messages,
            tools=together_session.tools
        )

        # Handle tool calls
        tool_calls = response.choices[0].message.tool_calls
        if tool_calls:
            tool_responses = await together_session.call_tools(tool_calls)

            # Add to conversation
            messages.append({
                "role": "assistant",
                "tool_calls": tool_calls
            })
            messages.extend(tool_responses)

            # Continue conversation...

asyncio.run(main())
```
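To let the model incorporate the tool results, send the updated message list back in a follow-up request. A minimal sketch that would run inside the same `async with` block as the example above (variable names match that example):

```python
# Follow-up request with the tool results appended to `messages`
follow_up = together_client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
    messages=messages,
    tools=together_session.tools
)
print(follow_up.choices[0].message.content)
```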
### Using Convenience Functions
```python
from metorial_togetherai import build_togetherai_tools, call_togetherai_tools

async def example_with_functions():
    # Get tools in Together AI format
    tools = build_togetherai_tools(tool_manager)

    # Call tools from Together AI response
    tool_messages = await call_togetherai_tools(tool_manager, tool_calls)
```
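As a sketch of how these helpers fit into a full turn (assuming `tool_manager` comes from an active Metorial session and `client` is the OpenAI-compatible client shown under configuration below):

```python
async def run_turn(client, tool_manager, messages):
    # Build tool definitions for this turn
    tools = build_togetherai_tools(tool_manager)

    response = client.chat.completions.create(
        model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
        messages=messages,
        tools=tools
    )

    tool_calls = response.choices[0].message.tool_calls
    if tool_calls:
        # Execute the requested tools and append the results to the conversation
        messages.append({"role": "assistant", "tool_calls": tool_calls})
        messages.extend(await call_togetherai_tools(tool_manager, tool_calls))

    return response
```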
## API Reference
### `MetorialTogetherAISession`
Main session class for Together AI integration.
```python
session = MetorialTogetherAISession(tool_manager)
```
**Properties:**
- `tools`: List of tools in OpenAI-compatible format
**Methods:**
- `async call_tools(tool_calls)`: Execute tool calls and return tool messages
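In short, `tools` is what you pass as the `tools` parameter of a chat completion, and `call_tools` turns the returned tool calls into `role: "tool"` messages (a sketch reusing the variables from the basic example):

```python
response = together_client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
    messages=messages,
    tools=together_session.tools
)

tool_messages = await together_session.call_tools(
    response.choices[0].message.tool_calls
)
```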
### `build_togetherai_tools(tool_mgr)`
Build Together AI-compatible tool definitions.
**Returns:** List of tool definitions in OpenAI format
### `call_togetherai_tools(tool_mgr, tool_calls)`
Execute tool calls from Together AI response.
**Returns:** List of tool messages
## Tool Format
Tools are converted to OpenAI-compatible format (without strict mode):
```python
{
    "type": "function",
    "function": {
        "name": "tool_name",
        "description": "Tool description",
        "parameters": {
            "type": "object",
            "properties": {...},
            "required": [...]
        }
    }
}
```
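For example, a hypothetical `get_weather` tool would be exposed to the model roughly like this (field values are illustrative, not taken from a real deployment):

```python
{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    }
}
```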
## Together AI API Configuration
Together AI uses the OpenAI-compatible API format. Configure your client like this:
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-together-api-key",
    base_url="https://api.together.xyz/v1"
)
```
## Supported Models
Popular models available through Together AI:
- `meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo`: Llama 3.1 70B
- `meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo`: Llama 3.1 8B
- `mistralai/Mixtral-8x7B-Instruct-v0.1`: Mixtral 8x7B
- `NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO`: Nous Hermes 2
- And many more...
## Error Handling
```python
try:
    tool_messages = await together_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")
```
Tool errors are returned as tool messages with error content.
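If you want the conversation to survive a failed call, one option (a pattern sketch, not part of this package's API) is to report the failure back to the model as tool output:

```python
try:
    tool_messages = await together_session.call_tools(tool_calls)
except Exception as e:
    # Illustrative fallback: surface the error to the model as tool messages
    tool_messages = [
        {
            "role": "tool",
            "tool_call_id": call.id,
            "content": f"Tool execution failed: {e}",
        }
        for call in tool_calls
    ]

messages.extend(tool_messages)
```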
## Dependencies
- `metorial-openai-compatible>=1.0.0`
- `metorial-mcp-session>=1.0.0`
- `typing-extensions>=4.0.0`
## License
MIT License - see [LICENSE](../../LICENSE) file for details.