arato-proxy

Name: arato-proxy
Version: 0.3.1
Summary: LiteLLM proxy for Arato logs
Upload time: 2025-07-16 07:16:53
Requires Python: >=3.8
License: MIT
Keywords: arato, litellm, llm, logging, openai, proxy
# LiteLLM-based proxy for Arato Logs

Use this [LiteLLM](https://www.litellm.ai/) proxy to send log data to an Arato monitoring endpoint.
All LLM calls going through the proxy are logged in Arato.

## Quick Start

### Using uvx (Recommended)

If you are working from source (for example, before a new version is published to PyPI), you can run uvx against a locally built wheel:

```bash
# First, build the package
uv build

# Run directly without installation using the local wheel
uvx --from ./dist/arato_proxy-*.whl arato-proxy

# Or with custom configuration
uvx --from ./dist/arato_proxy-*.whl arato-proxy --config custom.yaml --port 8080
```

### Using uv/pip (Development Install)

```bash
# Install from source in development mode
uv pip install -e .

# Or install from the built wheel
uv pip install ./dist/arato_proxy-*.whl

# Then run
arato-proxy
```

### Publishing to PyPI (For Maintainers)

To publish a new version to PyPI, you have two options:

#### Option 1: GitHub Release (Recommended)

1. Create a GitHub release with a tag like `v0.3.0`
2. This automatically triggers the release workflow
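
As a sketch, the same release can be created from the command line with the GitHub CLI; the tag name below is illustrative and should match the version you intend to publish:

```bash
# Create and push the tag, then turn it into a GitHub release
# (creating the release triggers the publish workflow)
git tag v0.3.2
git push origin v0.3.2
gh release create v0.3.2 --title "v0.3.2" --generate-notes
```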

#### Option 2: Manual Workflow Dispatch

1. Go to Actions → "Build and Publish to PyPI" → "Run workflow"
2. Choose version bump type:
   - `patch`: 0.2.0 → 0.2.1
   - `minor`: 0.2.0 → 0.3.0
   - `major`: 0.2.0 → 1.0.0
   - Or specify exact version like `0.5.0`
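
If you prefer the terminal over the Actions UI, the dispatch can also be triggered with the GitHub CLI. A sketch follows; the input name (`bump`) is an assumption and should be checked against the actual workflow definition in `.github/workflows/`:

```bash
# Trigger the manual release workflow from the CLI.
# NOTE: the `bump` input name is assumed — check the workflow file for the real input names.
gh workflow run "Build and Publish to PyPI" -f bump=patch
```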

The GitHub Actions workflow will:

1. Extract/calculate the version from the git tag or bump type
2. Update the version in `pyproject.toml`
3. Create a new tag (for manual dispatch)
4. Build the package
5. Publish to PyPI using trusted publishing

Once published to PyPI, users will be able to run:

```bash
# After publishing to PyPI
uvx arato-proxy
pip install arato-proxy
```

### GitHub Actions

This repository includes automated workflows:

- **Test**: Runs tests on Python 3.8-3.12 for all PRs and pushes
- **Release**: Automatically publishes to PyPI when a GitHub release is created or manually triggered
  - Version is automatically extracted from the git tag
  - Supports automatic version bumping
  - No manual version management needed

## Setup

### Environment Variables

You need to set up the following environment variables:

```bash
OPENAI_API_KEY="sk-..."
ARATO_API_URL="https://api.arato.ai/..."
ARATO_API_KEY="ar-..."
```

You can create a `.env` file or set them in your environment:

```bash
# Generate template .env file
arato-proxy --init-env
```

### Configuration

The proxy uses a YAML configuration file (`config.yaml`). You can generate a template:

```bash
# Generate template config.yaml
arato-proxy --init-config
```
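
The exact contents of the generated template depend on your Arato setup, but since this is a LiteLLM proxy, the file follows LiteLLM's standard proxy config layout. A rough sketch is shown below; the model entry and the callback path are illustrative, so prefer the file produced by `--init-config`:

```bash
# Write a minimal LiteLLM-style config (values are illustrative)
cat > config.yaml <<'EOF'
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  # The Arato logging callback is wired here; the exact module path comes
  # from the template generated by `arato-proxy --init-config`.
  callbacks: custom_callbacks.proxy_handler_instance
EOF
```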

## CLI Usage

The `arato-proxy` command provides several options:

```bash
# Show help
arato-proxy --help

# Start with default settings
arato-proxy

# Start with custom configuration
arato-proxy --config custom.yaml --port 8080

# Generate template files
arato-proxy --init-config    # Creates config.yaml
arato-proxy --init-env       # Creates .env
```
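
Once the proxy is running, any OpenAI-compatible client can be pointed at it and the call will be logged in Arato. A minimal curl sketch, assuming the proxy is listening on port 8080 as in the example above and that a model named `gpt-4o` is defined in `config.yaml`:

```bash
# Send a chat completion through the proxy (port and model name depend on your setup)
# Add -H "Authorization: Bearer <key>" if your config defines a master key.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from arato-proxy"}]
  }'
```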

### Known Issue: Callback Loading

Currently, the Arato callback functionality has a compatibility issue when running via uvx. The proxy will start and work with basic LiteLLM functionality, but the Arato-specific logging callbacks may not load properly in packaged environments. This is being investigated.

For now, you can use the proxy for basic LiteLLM functionality, and the callback issue will be resolved in a future update.

### Required Environment Variables

- `ARATO_API_URL` - URL of the Arato API endpoint
- `ARATO_API_KEY` - API key for authentication  
- `OPENAI_API_KEY` - OpenAI API key for the proxy

## Development

### From Source

```bash
git clone https://github.com/AratoAi/litellm-proxy
cd litellm-proxy
python3 -m venv .venv
source .venv/bin/activate
uv pip install -e .
```

### Running Tests

```bash
python -m pytest test_custom_callbacks.py -v
```

### Building the Package

```bash
uv build
```
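
The build produces a wheel and an sdist under `dist/`; the exact filenames follow the current version (shown here for illustration):

```bash
ls dist/
# arato_proxy-<version>-py3-none-any.whl
# arato_proxy-<version>.tar.gz
```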

### Installing in Development Mode

```bash
uv pip install -e .
```

## Example usage with n8n

In n8n, create a new OpenAI integration and set the base URL to point to your LiteLLM proxy endpoint.

![n8n](n8n.png)

            
