ollama-instructor


Name: ollama-instructor
Version: 1.0.0
Summary: Validate structured outputs from LLMs with Ollama and automatic retries.
Upload time: 2024-12-21 12:41:43
Requires Python: >=3.10
Keywords: instructor, json, json-schema, llm, local-llm, ollama, prompting, pydantic, validation
# ollama-instructor

`ollama-instructor` is a lightweight Python library that wraps the Ollama client and extends it with validation, so that responses from Large Language Models (LLMs) come back as valid JSON. Using Pydantic, `ollama-instructor` ensures that LLM responses adhere to the schemas you define.

[![Downloads](https://static.pepy.tech/badge/ollama-instructor/month)](https://pepy.tech/project/ollama-instructor)

> **Note**: This library depends on having [Ollama](https://ollama.com) installed and running. For more information, please refer to the official website of Ollama.

## Breaking Changes in Version 1.0.0

Version 1.0.0 introduces significant changes from version 0.5.2:
- Complete refactoring to directly inherit from Ollama's official Client classes
- Simplified API that aligns more closely with Ollama's native interface
- Improved logging system using Python's built-in logging module
- Streamlined validation process using Pydantic
- Removal of partial validation features to focus on core functionality
- Renamed streaming method: `chat_stream` replaces `chat_completion_with_stream`; non-streaming calls keep the existing `chat_completion` name (see the migration sketch below)
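
A minimal before/after sketch of the rename; the 0.5.2 call shape (notably the `pydantic_model` keyword) is recalled from the old API, not documented here, so treat it as an assumption and verify against your existing code:

```python
# Hypothetical migration sketch; argument names for 0.5.2 are assumptions.

# Before (0.5.2): streaming had its own long-named method.
# stream = client.chat_completion_with_stream(
#     pydantic_model=FriendList, model='llama2:latest', messages=messages)

# After (1.0.0): streaming is chat_stream, and the Pydantic model class
# is passed via `format`, mirroring chat_completion.
# stream = client.chat_stream(
#     format=FriendList, model='llama2:latest', messages=messages)
```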

## Features

- **Direct Integration**: Inherits directly from Ollama's official client for seamless integration
- **Schema Validation**: Uses Pydantic BaseModel to ensure valid JSON responses
- **Retry Mechanism**: Automatically retries failed validations with configurable attempts
- **Logging**: Comprehensive logging system with configurable levels
- **Async Support**: Full async/await support through `OllamaInstructorAsync`

## Installation

```bash
pip install ollama-instructor
```

## Quick Start

For streaming examples, see the [examples](examples/) directory.

### Synchronous Usage:
```python
from pydantic import BaseModel
from ollama_instructor import OllamaInstructor

class FriendInfo(BaseModel):
    name: str
    age: int
    is_available: bool

class FriendList(BaseModel):
    friends: list[FriendInfo]

# Create client with logging enabled
client = OllamaInstructor(enable_logging=True, log_level='DEBUG')

# Chat completion
response = client.chat_completion(
    format=FriendList,
    model='llama2:latest',
    messages=[
        {
            'role': 'user',
            'content': 'I have two friends: John (25, available) and Mary (30, busy)'
        }
    ]
)
```
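
The call above returns a response validated against `FriendList`. If you want the data back as a Pydantic object, the standard Pydantic v2 API applies. A minimal sketch, assuming the response mirrors Ollama's `ChatResponse` (JSON string in `response.message.content`) and that the retry count is exposed as a `retries` keyword; both are assumptions to check against the installed version:

```python
# Assumption: a `retries` keyword controls the retry mechanism.
response = client.chat_completion(
    format=FriendList,
    model='llama2:latest',
    retries=3,
    messages=[
        {
            'role': 'user',
            'content': 'I have two friends: John (25, available) and Mary (30, busy)'
        }
    ]
)

# Assumption: the JSON string sits in response.message.content, as in
# Ollama's ChatResponse. Parse it into the Pydantic model:
friends = FriendList.model_validate_json(response.message.content)
for friend in friends.friends:
    print(f"{friend.name} ({friend.age}): available={friend.is_available}")
```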

### Asynchronous Usage:
```python
import asyncio
from pydantic import BaseModel
from ollama_instructor import OllamaInstructorAsync

class FriendInfo(BaseModel):
    name: str
    age: int
    is_available: bool

async def main():
    client = OllamaInstructorAsync(enable_logging=True)

    response = await client.chat_completion(
        format=FriendInfo,
        model='llama2:latest',
        messages=[
            {
                'role': 'user',
                'content': 'John is 25 years old and available to hang out'
            }
        ]
    )

if __name__ == "__main__":
    asyncio.run(main())
```
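
For streaming, `chat_stream` is the 1.0.0 entry point (see Breaking Changes above). A minimal async sketch, assuming `chat_stream` is an async iterator yielding Ollama-style chunks with incremental text in `chunk.message.content`; the iteration details are assumptions, and the [examples](examples/) directory has the authoritative versions:

```python
import asyncio
from pydantic import BaseModel
from ollama_instructor import OllamaInstructorAsync

class FriendInfo(BaseModel):
    name: str
    age: int
    is_available: bool

async def stream_friend_info():
    client = OllamaInstructorAsync()
    # Assumption: chat_stream yields Ollama-style chunks whose
    # message.content carries incremental pieces of the JSON response.
    async for chunk in client.chat_stream(
        format=FriendInfo,
        model='llama2:latest',
        messages=[
            {'role': 'user', 'content': 'John is 25 years old and available'}
        ]
    ):
        print(chunk.message.content, end='', flush=True)

if __name__ == "__main__":
    asyncio.run(stream_friend_info())
```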

## Logging

The library includes comprehensive logging capabilities. You can enable and configure logging when initializing the client:

```python
client = OllamaInstructor(
    enable_logging=True,
    log_level="DEBUG",  # Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
    log_format="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
```
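
Since the logging is built on Python's standard `logging` module, ordinary handlers also work. A minimal sketch that mirrors log records to a file; the logger name `'ollama_instructor'` is an assumption (the usual package-name convention), so adjust it if the library registers a different logger:

```python
import logging

# Assumption: the library logs under its package name.
logger = logging.getLogger('ollama_instructor')

# Mirror all records to a file alongside the console output.
file_handler = logging.FileHandler('ollama_instructor.log')
file_handler.setFormatter(
    logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
)
logger.addHandler(file_handler)
```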

## Support and Community

If you need help or want to discuss `ollama-instructor`, feel free to:
- Open an issue on GitHub
- Start a discussion in the GitHub repository
- Contact via email: lennartpollvogt@protonmail.com

Contributions and feedback are always welcome! 😊

## License

`ollama-instructor` is released under the MIT License. See the [LICENSE](LICENSE) file for more details.

            
