ai-tool-registry

Name: ai-tool-registry
Version: 0.7.5
Summary: Advanced tool registration system for Anthropic Claude integration with automatic schema generation and validation
Author: "Kazmer, Nagy-Betegh" <kazmer.nb@gmail.com>
Homepage: https://github.com/kazmer97/ai-tool-registry
Requires Python: >=3.12
License: MIT
Keywords: anthropic, api, claude, schema, tools, validation
Uploaded: 2025-10-06 16:18:42

# Universal Tool Registry Module

Advanced tool registration system for multiple AI providers with automatic schema generation, validation, and error handling. Supports **Anthropic Claude**, **OpenAI**, **Mistral AI**, **AWS Bedrock**, and **Google Gemini**.

## Features

- **Multi-provider support** - Works with all major AI providers
- **Automatic JSON schema generation** from function signatures
- **Pydantic model integration** and validation
- **ToolContext parameter filtering** - Automatic exclusion of context parameters with type safety
- **Legacy parameter filtering** for internal/context parameters
- **Unified interface** across different AI providers
- **Comprehensive error handling** and logging
- **Type safety** with full type hints
- **Optional dependencies** - Install only what you need

## Installation

### Basic Installation

```bash
# Using UV (recommended)
uv add ai-tool-registry

# Using pip
pip install ai-tool-registry
```

### Provider-Specific Installation

```bash
# For Anthropic Claude
uv add "ai-tool-registry[anthropic]"

# For OpenAI
uv add "ai-tool-registry[openai]"

# For Mistral AI
uv add "ai-tool-registry[mistral]"

# For AWS Bedrock
uv add "ai-tool-registry[bedrock]"

# For Google Gemini
uv add "ai-tool-registry[gemini]"

# Install all providers
uv add "ai-tool-registry[all]"
```

## Quick Start

```python
from tool_registry_module import tool, build_registry_openai, build_registry_anthropic, ToolContext
from pydantic import BaseModel
from typing import Annotated


class UserData(BaseModel):
    name: str
    age: int


@tool(description="Process user information")
def process_user(
    input: UserData, 
    context: ToolContext[dict] = None  # Automatically excluded from schema
) -> UserData:
    # context parameter is available for use but won't appear in AI tool schema
    return input


# Build registries for different providers
openai_registry = build_registry_openai([process_user])
anthropic_registry = build_registry_anthropic([process_user])

# Use with respective APIs
openai_tools = [entry["representation"] for entry in openai_registry.values()]
anthropic_tools = [entry["representation"] for entry in anthropic_registry.values()]
```
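
Each registry entry maps the tool name to the original callable plus a provider-specific representation. The shapes sketched in the comments below are assumptions based on the formats listed under "Supported Providers" (Anthropic ToolParam, OpenAI function call); inspect your own output to confirm:

```python
# Inspect what the builders produced (the field layouts in the comments are
# assumptions; verify against the actual output).
entry = anthropic_registry["process_user"]
print(entry["tool"])            # the original decorated function
print(entry["representation"])  # roughly {"name": ..., "description": ...,
                                #          "input_schema": {...}} for Anthropic

print(openai_registry["process_user"]["representation"])
# roughly {"type": "function", "function": {"name": ..., "parameters": {...}}}
```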

## Multi-Provider Examples

### OpenAI Function Calling

```python
from tool_registry_module import tool, build_registry_openai
import openai

@tool(description="Get weather information")
def get_weather(location: str, unit: str = "celsius") -> str:
    return f"Weather in {location}: 22°{unit[0].upper()}"

# Build OpenAI registry
registry = build_registry_openai([get_weather])
tools = [entry["representation"] for entry in registry.values()]

# Use with OpenAI
client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools
)
```
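
When the model decides to call the tool, the function name and JSON-encoded arguments come back on the assistant message, and the registry's `"tool"` entry can dispatch them. A minimal sketch using the standard Chat Completions response fields:

```python
import json

# Dispatch any tool calls the model made back through the registry.
message = response.choices[0].message
for tool_call in message.tool_calls or []:
    name = tool_call.function.name
    args = json.loads(tool_call.function.arguments)  # arguments arrive as a JSON string
    result = registry[name]["tool"](**args)
    print(f"{name} -> {result}")
```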

### Anthropic Claude

```python
from tool_registry_module import tool, build_registry_anthropic
import anthropic

registry = build_registry_anthropic([get_weather])
tools = [entry["representation"] for entry in registry.values()]

# Use with Anthropic
client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools
)
```

### AWS Bedrock

```python
from tool_registry_module import tool, build_registry_bedrock
import boto3

registry = build_registry_bedrock([get_weather])
tools = [entry["representation"] for entry in registry.values()]

# Use with Bedrock
client = boto3.client("bedrock-runtime")
# Use tools in your Bedrock converse API calls
```
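
With the Converse API, tools are passed via `toolConfig`. Whether the registry's representation already includes the `toolSpec` wrapper depends on the builder, so treat the following as a sketch (the model ID is illustrative) and adjust to the shape you actually get:

```python
# Minimal Converse call with tools.
response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # illustrative model ID
    messages=[{"role": "user", "content": [{"text": "What's the weather in Paris?"}]}],
    toolConfig={"tools": tools},  # Bedrock expects a list of {"toolSpec": {...}} entries
)

# Tool requests (if any) appear as toolUse blocks in the assistant message.
for block in response["output"]["message"]["content"]:
    if "toolUse" in block:
        tool_use = block["toolUse"]
        result = registry[tool_use["name"]]["tool"](**tool_use["input"])
```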

### Google Gemini

```python
from tool_registry_module import tool, build_registry_gemini
import google.generativeai as genai

registry = build_registry_gemini([get_weather])
tools = [entry["representation"] for entry in registry.values()]

# Use with Gemini
model = genai.GenerativeModel('gemini-pro')
# Use tools in your Gemini function calling
```
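
A minimal sketch of a tool-enabled generation call with the `google-generativeai` SDK; whether the representations need extra wrapping into a `Tool`/`FunctionDeclaration` container depends on the builder:

```python
# Tools can be supplied when constructing the model (sketch).
model = genai.GenerativeModel("gemini-pro", tools=tools)
response = model.generate_content("What's the weather in Paris?")

# Function calls (if any) show up as function_call parts in the first candidate.
for part in response.candidates[0].content.parts:
    fc = getattr(part, "function_call", None)
    if fc and fc.name:
        result = registry[fc.name]["tool"](**dict(fc.args))
```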

### Mistral AI

```python
from tool_registry_module import tool, build_registry_mistral
from mistralai.client import MistralClient

registry = build_registry_mistral([get_weather])
tools = [entry["representation"] for entry in registry.values()]

# Use with Mistral
client = MistralClient()
# Use tools in your Mistral function calling
```
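
A sketch of a tool-enabled chat call against the 0.x `mistralai` client imported above (the message class and parameter names assume that client version):

```python
import json

from mistralai.models.chat_completion import ChatMessage

response = client.chat(
    model="mistral-large-latest",  # illustrative model name
    messages=[ChatMessage(role="user", content="What's the weather in Paris?")],
    tools=tools,
    tool_choice="auto",
)

# Tool calls come back with JSON-encoded arguments; dispatch through the registry.
for tool_call in response.choices[0].message.tool_calls or []:
    args = json.loads(tool_call.function.arguments)
    result = registry[tool_call.function.name]["tool"](**args)
```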

## Advanced Usage

### Parameter Filtering

#### Using ToolContext (Recommended)

Use `ToolContext` to mark parameters that should be automatically excluded from schemas:

```python
from tool_registry_module import tool, ToolContext
from typing import Annotated

@tool(description="Process user data with context")
def process_data(
    user_input: str,
    context: ToolContext[dict],  # Direct ToolContext generic
    session: Annotated[str, ToolContext],  # Annotated ToolContext
    debug_flag: bool = False
) -> str:
    # context and session parameters are automatically excluded from the schema
    # but available for use in your function
    return f"Processed: {user_input}"
```

**ToolContext Features:**
- **Automatic exclusion** - No need to manually specify `ignore_in_schema`
- **Type safety** - Full type hints with generic support
- **Reference preservation** - Objects maintain their references for mutation within functions
- **Multiple forms** - Both direct (`ToolContext[T]`) and annotated (`Annotated[T, ToolContext]`) syntax
- **Union validation** - Prevents incorrect usage in union types

**Supported ToolContext patterns:**
```python
# Direct generic types
param1: ToolContext[dict]
param2: ToolContext[str] 

# Annotated types
param3: Annotated[str, ToolContext]
param4: Annotated[dict, ToolContext, "description"]

# Union types will raise TypeError (prevented for safety)
# param5: Union[str, ToolContext[dict]]  # ❌ Not allowed
```

**Reference Preservation Example:**
```python
from tool_registry_module import tool, ToolContext

# Context objects maintain their references for mutation
@tool(description="Track user interactions")
def track_interaction(
    action: str,
    user_context: ToolContext[dict]  # This dict can be modified
) -> str:
    # Modify the context object - changes persist outside function
    user_context["actions"] = user_context.get("actions", [])
    user_context["actions"].append(action)
    user_context["last_action"] = action
    return f"Tracked: {action}"

# Usage
context = {"user_id": "123"}
track_interaction("login", user_context=context)
track_interaction("view_profile", user_context=context)

# Context object is modified:
print(context)  
# {'user_id': '123', 'actions': ['login', 'view_profile'], 'last_action': 'view_profile'}
```

#### Legacy Parameter Filtering

You can still manually exclude parameters using `ignore_in_schema`:

```python
@tool(
    description="Calculate area with debug output",
    ignore_in_schema=["debug_mode", "context"]
)
def calculate_area(length: float, width: float, debug_mode: bool = False, context: str = "calc") -> float:
    if debug_mode:
        print(f"Calculating area for {length} x {width}")
    return length * width
```

### Cache Control (Anthropic)

Add cache control for better performance with Anthropic:

```python
@tool(
    description="Expensive computation",
    cache_control={"type": "ephemeral"}
)
def expensive_function(data: str) -> str:
    # Expensive computation here (placeholder so the example is runnable)
    processed_data = data.upper()
    return processed_data
```

### Registry Utilities

```python
from tool_registry_module import get_tool_info, validate_registry

# Get information about a specific tool
info = get_tool_info(registry, "process_user")
print(info["description"])

# Validate registry structure
is_valid = validate_registry(registry)
```

### Tool Use Handling

The registry is a plain dictionary, so tool-use responses from the model can be dispatched back to your functions dynamically:

```python
from tool_registry_module import tool, build_registry_anthropic

@tool(description="Add two numbers")
def add_numbers(a: int, b: int) -> int:
    return a + b

@tool(description="Get weather info")
def get_weather(city: str, units: str = "celsius") -> str:
    return f"Weather in {city}: 22°{units[0].upper()}"

# Build registry
registry = build_registry_anthropic([add_numbers, get_weather])

# Handle tool use responses dynamically
def handle_tool_calls(tool_calls, registry):
    results = []
    for tool_call in tool_calls:
        tool_name = tool_call.name
        tool_args = tool_call.input
        
        if tool_name in registry:
            try:
                # Get function from registry and execute
                tool_func = registry[tool_name]["tool"]
                result = tool_func(**tool_args)
                results.append({
                    "tool_use_id": tool_call.id,
                    "content": str(result)
                })
            except Exception as e:
                results.append({
                    "tool_use_id": tool_call.id,
                    "error": f"Error: {e}"
                })
        else:
            results.append({
                "tool_use_id": tool_call.id,
                "error": f"Tool '{tool_name}' not found"
            })
    
    return results

# Registry structure: {tool_name: {"tool": callable, "representation": provider_format}}
# Use registry[tool_name]["tool"] for dynamic function calling
```
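
With Anthropic, the tool calls arrive as `tool_use` content blocks, and the results go back to the model as `tool_result` blocks in a follow-up user message. A sketch wiring `handle_tool_calls` into that loop (the model name is illustrative):

```python
import anthropic

client = anthropic.Anthropic()
tools = [entry["representation"] for entry in registry.values()]

messages = [{"role": "user", "content": "What is 2 + 3, and the weather in Paris?"}]
response = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    messages=messages,
    tools=tools,
)

if response.stop_reason == "tool_use":
    tool_calls = [block for block in response.content if block.type == "tool_use"]
    results = handle_tool_calls(tool_calls, registry)

    # Feed the results back as tool_result blocks and let the model finish.
    messages.append({"role": "assistant", "content": response.content})
    messages.append({
        "role": "user",
        "content": [
            {
                "type": "tool_result",
                "tool_use_id": r["tool_use_id"],
                "content": r.get("content", r.get("error", "")),
            }
            for r in results
        ],
    })
    final = client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1000,
        messages=messages,
        tools=tools,
    )
```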

## Supported Providers

| Provider | Function | Format |
|----------|----------|---------|
| **Anthropic Claude** | `build_registry_anthropic()` | Claude ToolParam |
| **OpenAI** | `build_registry_openai()` | OpenAI Function Call |
| **Mistral AI** | `build_registry_mistral()` | Mistral Function Call |
| **AWS Bedrock** | `build_registry_bedrock()` | Bedrock ToolSpec |
| **Google Gemini** | `build_registry_gemini()` | Gemini FunctionDeclaration |

## Requirements

- **Python 3.12+**
- **pydantic >= 2.0.0** (required)

### Optional Provider Dependencies

- **anthropic >= 0.52.2** (for Anthropic Claude)
- **openai >= 1.0.0** (for OpenAI)
- **mistralai >= 0.4.0** (for Mistral AI)  
- **boto3 >= 1.34.0** (for AWS Bedrock)
- **google-generativeai >= 0.3.0** (for Google Gemini)

## Migration from v2.x

The old `build_registry_anthropic_tool_registry()` function remains available for backward compatibility but is deprecated; use `build_registry_anthropic()` instead.

## License

MIT License

## Development

### Setup

```bash
# Clone and install with dev dependencies
git clone https://github.com/kazmer97/ai-tool-registry.git
cd ai-tool-registry
uv sync --extra dev

# Run linting
uv run ruff check .
uv run ruff format .
```

### Testing

```bash
# Run tests (when available)
uv run pytest

# Type checking
uv run mypy tool_registry_module/
```

## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Run linting: `uv run ruff check . && uv run ruff format .`
4. Commit changes (`git commit -m 'Add amazing feature'`)
5. Push to branch (`git push origin feature/amazing-feature`)
6. Open a Pull Request
            
