metorial-google


Name: metorial-google
Version: 1.0.4
Summary: Google (Gemini) provider for Metorial
Author email: Metorial Team <support@metorial.com>
Homepage: https://metorial.com
Documentation: https://metorial.com/docs
Repository: https://github.com/metorial/metorial-python
Requires Python: >=3.10
License: MIT
Keywords: ai, gemini, google, llm, metorial
Upload time: 2025-10-30 05:03:23
Requirements: No requirements were recorded.
# metorial-google

Google (Gemini) provider integration for Metorial.

## Installation

```bash
pip install metorial-google
# or
uv add metorial-google
# or
poetry add metorial-google
```
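
After installing, a quick import check confirms the provider is available (a minimal sketch; the class is the same one used in the examples below):

```python
# Sanity check: the provider class should import cleanly on Python 3.10+
from metorial_google import MetorialGoogleSession

print(MetorialGoogleSession)
```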

## Features

- šŸ¤– **Gemini Integration**: Full support for Gemini Pro, Gemini Flash, and other Google AI models
- šŸ“” **Session Management**: Automatic tool lifecycle handling
- šŸ”„ **Format Conversion**: Converts Metorial tools to Google function declaration format
- ⚔ **Async Support**: Full async/await support

## Supported Models

All Google Gemini models that support function calling:

- `gemini-1.5-pro`: Most capable Gemini model with 2M context window
- `gemini-1.5-flash`: Fast and efficient Gemini model  
- `gemini-pro`: Standard Gemini Pro model
- `gemini-pro-vision`: Gemini Pro with vision capabilities
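
The model name is passed as a plain string when constructing the Gemini client, so switching models is a one-line change (a minimal sketch using the `google-generativeai` SDK, as in the examples below):

```python
import google.generativeai as genai

genai.configure(api_key="...your-google-api-key...")
# Any of the models listed above works here; gemini-1.5-flash trades some
# capability for lower latency and cost.
google_client = genai.GenerativeModel("gemini-1.5-flash")
```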

## Usage

### Quick Start (Recommended)

```python
import asyncio
import google.generativeai as genai
from metorial import Metorial

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...") # async by default
  genai.configure(api_key="...your-google-api-key...")
  google_client = genai.GenerativeModel('gemini-pro')
  
  # One-liner chat with automatic session management
  response = await metorial.run(
    "What are the latest commits in the metorial/websocket-explorer repository?",
    "...your-mcp-server-deployment-id...", # can also be list
    google_client,
    model="gemini-pro",
    max_iterations=25
  )
  
  print("Response:", response)

asyncio.run(main())
```

### Streaming Chat

```python
import asyncio
import google.generativeai as genai
from metorial import Metorial
from metorial.types import StreamEventType

async def streaming_example():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  genai.configure(api_key="...your-google-api-key...")
  google_client = genai.GenerativeModel('gemini-pro')
  
  # Streaming chat with real-time responses
  async def stream_action(session):
    messages = [
      {"role": "user", "content": "Explain quantum computing"}
    ]
    
    async for event in metorial.stream(
      google_client, session, messages, 
      model="gemini-pro",
      max_iterations=25
    ):
      if event.type == StreamEventType.CONTENT:
        print(f"šŸ¤– {event.content}", end="", flush=True)
      elif event.type == StreamEventType.TOOL_CALL:
        print(f"\nšŸ”§ Executing {len(event.tool_calls)} tool(s)...")
      elif event.type == StreamEventType.COMPLETE:
        print(f"\nāœ… Complete!")
  
  await metorial.with_session("...your-server-deployment-id...", stream_action)

asyncio.run(streaming_example())
```

### Advanced Usage with Session Management

```python
import asyncio
import google.generativeai as genai
from metorial import Metorial
from metorial_google import MetorialGoogleSession

async def main():
  # Initialize clients
  metorial = Metorial(api_key="...your-metorial-api-key...")
  genai.configure(api_key="...your-google-api-key...")
  
  # Create session with your server deployments
  async with metorial.session(["...your-server-deployment-id..."]) as session:
    # Create Google-specific wrapper
    google_session = MetorialGoogleSession(session.tool_manager)
    
    model = genai.GenerativeModel(
      model_name="gemini-pro",
      tools=google_session.tools
    )
    
    response = model.generate_content("What are the latest commits?")
    
    # Handle function calls if present
    if response.candidates[0].content.parts:
      function_calls = [
        part.function_call for part in response.candidates[0].content.parts
        if hasattr(part, 'function_call') and part.function_call
      ]
      
      if function_calls:
        tool_response = await google_session.call_tools(function_calls)
        # Continue conversation with tool_response

asyncio.run(main())
```
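
Continuing the conversation after the tool calls is not shown above; one hedged way to do it, assuming `call_tools` returns a user-role content with `function_response` parts (see the API Reference below), is to replay the history plus the tool output:

```python
# Sketch only: continues the Advanced Usage example above.
follow_up = model.generate_content([
  {"role": "user", "parts": ["What are the latest commits?"]},
  response.candidates[0].content,  # model turn containing the function_call parts
  tool_response,                   # function responses produced by Metorial
])
print(follow_up.text)
```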

### Using Convenience Functions

```python
from metorial_google import build_google_tools, call_google_tools

async def example_with_functions(tool_manager, function_calls):
  # Get tools in Google format (tool_manager comes from session.tool_manager)
  tools = build_google_tools(tool_manager)

  # Execute the function calls extracted from a Google response
  response = await call_google_tools(tool_manager, function_calls)
  return tools, response
```

## API Reference

### `MetorialGoogleSession`

Main session class for Google integration.

```python
session = MetorialGoogleSession(tool_manager)
```

**Properties:**
- `tools`: List of tools in Google function declaration format

**Methods:**
- `async call_tools(function_calls)`: Execute function calls and return user content
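
For example, the `tools` property can be inspected before handing it to Gemini. This is a hedged sketch, assuming the structure shown in the Tool Format section below; `session` is an open Metorial session as in the Advanced Usage example:

```python
from metorial_google import MetorialGoogleSession

google_session = MetorialGoogleSession(session.tool_manager)

# Print the name and description of every converted tool declaration
for block in google_session.tools:
  for decl in block["function_declarations"]:
    print(decl["name"], "-", decl["description"])
```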

### `build_google_tools(tool_mgr)`

Build Google-compatible tool definitions.

**Returns:** List of tool definitions in Google format

### `call_google_tools(tool_mgr, function_calls)`

Execute function calls from Google response.

**Returns:** User content with function responses
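
Putting the two helpers together, a hedged round-trip sketch (the `round_trip` helper and its wiring are illustrative, not part of the package; `tool_manager` comes from `session.tool_manager`):

```python
import google.generativeai as genai
from metorial_google import build_google_tools, call_google_tools

async def round_trip(tool_manager, question: str):
  # Convert Metorial tools to Gemini function declarations
  model = genai.GenerativeModel("gemini-pro", tools=build_google_tools(tool_manager))

  response = model.generate_content(question)
  function_calls = [
    part.function_call
    for part in response.candidates[0].content.parts
    if getattr(part, "function_call", None)
  ]
  if not function_calls:
    return response.text

  # Execute the calls through Metorial and return the tool output content
  return await call_google_tools(tool_manager, function_calls)
```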

## Tool Format

Tools are converted to Google's function declaration format:

```python
[{
  "function_declarations": [
    {
      "name": "tool_name",
      "description": "Tool description",
      "parameters": {
        "type": "object",
        "properties": {...},
        "required": [...]
      }
    }
  ]
}]
```
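
For instance, a hypothetical `get_latest_commits` tool (the name and parameters below are illustrative, not produced by the package) would come out as:

```python
[{
  "function_declarations": [
    {
      "name": "get_latest_commits",
      "description": "List the most recent commits in a repository",
      "parameters": {
        "type": "object",
        "properties": {
          "repository": {"type": "string", "description": "owner/name slug"},
          "limit": {"type": "integer", "description": "maximum number of commits"}
        },
        "required": ["repository"]
      }
    }
  ]
}]
```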

## Error Handling

```python
try:
    response = await google_session.call_tools(function_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")
```

Tool errors are returned as error objects in the response format.
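
The same guard works around the high-level entry point from the Quick Start (a minimal sketch reusing `metorial` and `google_client` from that example):

```python
try:
  response = await metorial.run(
    "What are the latest commits?",
    "...your-mcp-server-deployment-id...",
    google_client,
    model="gemini-pro",
  )
except Exception as e:
  print(f"Chat run failed: {e}")
```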

## License

MIT License - see [LICENSE](../../LICENSE) file for details.

            
