oai2ollama

Name: oai2ollama
Version: 1.0.7
Author: Muspi Merol
Requires-Python: >=3.12
Upload time: 2025-07-09 03:34:26
# Oai2Ollama

This is a CLI tool that starts a server wrapping an OpenAI-compatible API and exposing an Ollama-compatible API.
This is useful for providing custom models to coding agents that don't support custom OpenAI APIs but do support Ollama
(like GitHub Copilot for VS Code).
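To make the wrapping concrete, here is a minimal sketch (not oai2ollama's actual code) of how a non-streaming OpenAI chat completion could be reshaped into Ollama's chat-reply format; the field names follow the two public APIs, while the sample model name and message are hypothetical:

```python
from datetime import datetime, timezone


def openai_to_ollama_chat(openai_resp: dict) -> dict:
    """Map a non-streaming OpenAI chat completion to an Ollama-style reply."""
    choice = openai_resp["choices"][0]
    return {
        "model": openai_resp["model"],
        "created_at": datetime.now(timezone.utc).isoformat(),
        "message": {
            "role": choice["message"]["role"],
            "content": choice["message"]["content"],
        },
        "done": True,  # Ollama marks the final chunk of a reply with done=true
    }


# Hypothetical OpenAI-style response for illustration:
sample = {
    "model": "my-model",
    "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
}
print(openai_to_ollama_chat(sample)["message"]["content"])  # Hello!
```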

## Usage

### with Python

You can run it directly via `uvx` (if you have `uv` installed) or `pipx run`:

```sh
uvx oai2ollama --help
```

```text
usage: oai2ollama [--api-key str] [--base-url HttpUrl]
options:
  --help              Show this help message and exit
  --api-key str       API key for authentication (required)
  --base-url HttpUrl  Base URL for the OpenAI-compatible API (required)
```

Or you can use a `.env` file to set the environment variables:

```properties
OPENAI_API_KEY=your_api_key
OPENAI_BASE_URL=your_base_url
```
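The `.env` format is plain `KEY=value` lines. A dotenv-style loader (which such a tool presumably uses under the hood) boils down to something like this illustrative parser, with hypothetical sample values:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse KEY=value lines, skipping blanks and # comments (minimal sketch)."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


sample = "OPENAI_API_KEY=sk-test\nOPENAI_BASE_URL=https://api.example.com/v1\n"
print(parse_env(sample)["OPENAI_BASE_URL"])  # https://api.example.com/v1
```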

### with Docker

First, build the image:

```sh
docker build -t oai2ollama .
```

Then, run the container with your credentials:

```sh
docker run -p 11434:11434 \
  -e OPENAI_API_KEY="your_api_key" \
  -e OPENAI_BASE_URL="your_base_url" \
  oai2ollama
```

Or you can pass these as command line arguments:

```sh
docker run -p 11434:11434 oai2ollama --api-key your_api_key --base-url your_base_url
```
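Clients that speak Ollama typically discover available models via `GET /api/tags` on port 11434. Assuming the server answers that endpoint, a response would look roughly like this (the model name is hypothetical):

```python
import json

# Shape of an Ollama-style model listing (GET /api/tags);
# "my-model" stands in for whatever your upstream API serves.
raw = json.dumps({"models": [{"name": "my-model", "model": "my-model"}]})

names = [m["name"] for m in json.loads(raw)["models"]]
print(names)  # ['my-model']
```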

            

Repository: https://github.com/CNSeniorious000/oai2ollama