ollama

Name: ollama
Version: 0.1.9
Home page: https://ollama.ai
Repository: https://github.com/jmorganca/ollama-python
Summary: The official Python client for Ollama.
Upload time: 2024-04-26 00:13:33
Author: Ollama
Requires Python: <4.0,>=3.8
License: MIT
Requirements: none recorded
# Ollama Python Library

The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with [Ollama](https://github.com/jmorganca/ollama).

## Install

```sh
pip install ollama
```

## Usage

```python
import ollama
response = ollama.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
```

## Streaming responses

Response streaming can be enabled by setting `stream=True`. This modifies the function call to return a Python generator, where each part is an object in the stream.

```python
import ollama

stream = ollama.chat(
    model='llama2',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)

for chunk in stream:
  print(chunk['message']['content'], end='', flush=True)
```
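
Streaming is not limited to `chat`. A minimal sketch with `generate` (assuming it accepts the same `stream=True` flag and, mirroring the streaming REST API, yields parts keyed by `response`):

```python
import ollama

# Assumption: each streamed part mirrors the REST response for
# /api/generate, with the generated text under 'response'.
for part in ollama.generate(model='llama2', prompt='Why is the sky blue?', stream=True):
  print(part['response'], end='', flush=True)
```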

## API

The Ollama Python library's API is designed around the [Ollama REST API](https://github.com/jmorganca/ollama/blob/main/docs/api.md).

### Chat

```python
ollama.chat(model='llama2', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
```

### Generate

```python
ollama.generate(model='llama2', prompt='Why is the sky blue?')
```

### List

```python
ollama.list()
```
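
The result should mirror the REST API's `/api/tags` payload; a sketch for printing the locally available models, assuming a `models` list whose entries carry a `name` field:

```python
import ollama

# Assumed response shape: {'models': [{'name': ..., 'size': ..., ...}, ...]}
for model in ollama.list()['models']:
  print(model['name'])
```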

### Show

```python
ollama.show('llama2')
```

### Create

```python
modelfile='''
FROM llama2
SYSTEM You are Mario from Super Mario Bros.
'''

ollama.create(model='example', modelfile=modelfile)
```
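
Creating a model can take a while, so streaming the progress may be useful. A sketch assuming `create` accepts the same `stream=True` flag as the other calls and yields status objects like the REST API's `/api/create`:

```python
import ollama

modelfile = '''
FROM llama2
SYSTEM You are a helpful assistant.
'''

# Assumption: with stream=True, each part carries a 'status' field.
for part in ollama.create(model='example', modelfile=modelfile, stream=True):
  print(part['status'])
```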

### Copy

```python
ollama.copy('llama2', 'user/llama2')
```

### Delete

```python
ollama.delete('llama2')
```

### Pull

```python
ollama.pull('llama2')
```
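
Pulls can be large, so streaming the download progress is often worthwhile. A sketch assuming `pull` accepts the same `stream=True` flag and yields progress objects like the REST API's `/api/pull`:

```python
import ollama

# Assumed part shape: {'status': ..., 'completed': ..., 'total': ...}
# ('completed' and 'total' only appear while a layer is downloading).
for part in ollama.pull('llama2', stream=True):
  if 'completed' in part and 'total' in part:
    print(f"{part['status']}: {part['completed']}/{part['total']} bytes")
  else:
    print(part['status'])
```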

### Push

```python
ollama.push('user/llama2')
```

### Embeddings

```python
ollama.embeddings(model='llama2', prompt='The sky is blue because of Rayleigh scattering')
```
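
The response should mirror `/api/embeddings`, with the vector under an `embedding` key; a sketch under that assumption:

```python
import ollama

response = ollama.embeddings(model='llama2', prompt='The sky is blue because of Rayleigh scattering')
embedding = response['embedding']  # assumed: a list of floats
print(len(embedding), embedding[:5])
```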

## Custom client

A custom client can be created with the following fields:

- `host`: The Ollama host to connect to
- `timeout`: The timeout for requests

```python
from ollama import Client
client = Client(host='http://localhost:11434')
response = client.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
```
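
Both fields can be set together; a minimal sketch (the 10-second `timeout` value is only illustrative, and is assumed to be in seconds as with the underlying HTTP client):

```python
from ollama import Client

client = Client(host='http://localhost:11434', timeout=10)
response = client.chat(model='llama2', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
print(response['message']['content'])
```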

## Async client

```python
import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  response = await AsyncClient().chat(model='llama2', messages=[message])
  print(response['message']['content'])

asyncio.run(chat())
```

Setting `stream=True` modifies the call to return a Python asynchronous generator:

```python
import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  async for part in await AsyncClient().chat(model='llama2', messages=[message], stream=True):
    print(part['message']['content'], end='', flush=True)

asyncio.run(chat())
```

## Errors

Errors are raised if requests return an error status or if an error is detected while streaming.

```python
model = 'does-not-yet-exist'

try:
  ollama.chat(model)
except ollama.ResponseError as e:
  print('Error:', e.error)
  if e.status_code == 404:
    ollama.pull(model)
```
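
A common recovery pattern is to pull the missing model and retry; a sketch assuming the model exists in the registry but has not yet been pulled locally:

```python
import ollama

model = 'llama2'
messages = [{'role': 'user', 'content': 'Why is the sky blue?'}]

try:
  response = ollama.chat(model=model, messages=messages)
except ollama.ResponseError as e:
  if e.status_code != 404:
    raise
  ollama.pull(model)  # fetch the missing model, then retry once
  response = ollama.chat(model=model, messages=messages)

print(response['message']['content'])
```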


            
