sibila

Name: sibila
Version: 0.4.2
Summary: Structured queries from local or online LLM models
Author: Jorge Diogo <jndiogo@gmail.com>
Homepage: https://github.com/jndiogo/sibila
Upload time: 2024-05-04 18:40:09
Requires Python: >=3.9
License: MIT
Keywords: llama.cpp, AI, Transformers, GPT, LLM
# Sibila

Extract structured data from remote or local LLM models. Predictable output is important for serious use of LLMs.

- Query structured data into Pydantic objects, dataclasses or simple types.
- Access remote models from OpenAI, Anthropic, Mistral AI and other providers.
- Use local models like Llama-3, Phi-3, OpenChat or any other GGUF file model.
- Besides structured extraction, Sibila is also a general-purpose model access library: generate plain text or free-form JSON results with the same API for local and remote models.
- Model management: download models, manage configuration, quickly switch between models.

No matter how well you craft a prompt begging a model for the output you need, it can always respond with something else. Extracting structured data is a big step toward getting predictable behavior from your models.

See [What can you do with Sibila?](https://jndiogo.github.io/sibila/what/)

To extract structured data from a local model:

``` python
from sibila import Models
from pydantic import BaseModel

class Info(BaseModel):
    event_year: int
    first_name: str
    last_name: str
    age_at_the_time: int
    nationality: str

model = Models.create("llamacpp:openchat")

model.extract(Info, "Who was the first man on the moon?")
```

This returns an instance of `Info`, created from the model's output:

``` python
Info(event_year=1969,
     first_name='Neil',
     last_name='Armstrong',
     age_at_the_time=38,
     nationality='American')
```

Or, to use a remote model like OpenAI's GPT-4, simply replace the model name:

``` python
model = Models.create("openai:gpt-4")

model.extract(Info, "Who was the first man on the moon?")
```

If Pydantic BaseModel objects are too much for your project, Sibila supports similar functionality with plain Python dataclasses. It also includes asynchronous access to remote models.
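For instance, the `Info` schema above can be declared as a standard-library dataclass instead. A minimal sketch follows; the `model.extract` call is shown as a comment because it assumes a configured model, and the resulting instance is written out by hand to illustrate the expected shape:

``` python
from dataclasses import dataclass

@dataclass
class Info:
    event_year: int
    first_name: str
    last_name: str
    age_at_the_time: int
    nationality: str

# With a configured model, extraction mirrors the Pydantic example:
#   model.extract(Info, "Who was the first man on the moon?")
# which would produce an Info instance like:
info = Info(event_year=1969,
            first_name='Neil',
            last_name='Armstrong',
            age_at_the_time=38,
            nationality='American')

print(info.first_name, info.event_year)
```

The dataclass carries the same field names and type annotations as the Pydantic version, so the extracted result is accessed the same way.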




## Docs

[The docs explain](https://jndiogo.github.io/sibila/) the main concepts, include examples and an API reference.


## Installation

Sibila can be installed from PyPI with:

```
pip install --upgrade sibila
```

See [Getting started](https://jndiogo.github.io/sibila/installing/) for more information.



## Examples

The [Examples](https://jndiogo.github.io/sibila/examples/) show what you can do with local or remote models in Sibila: structured data extraction, classification, summarization, etc.



## License

This project is licensed under the MIT License - see the [LICENSE](https://github.com/jndiogo/sibila/blob/main/LICENSE) file for details.


## Acknowledgements

Sibila wouldn't be possible without the help of great software and people:

- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- [llama-cpp-python](https://github.com/abetlen/llama-cpp-python)
- [OpenAI Python API](https://github.com/openai/openai-python)
- [TheBloke (Tom Jobbins)](https://huggingface.co/TheBloke) and [Hugging Face model hub](https://huggingface.co/)

Thank you!


## Sibila?

Sibila is the Portuguese word for Sibyl. [The Sibyls](https://en.wikipedia.org/wiki/Sibyl) were wise oracular women in ancient Greece. Their mysterious words puzzled people throughout the centuries, providing insight or prophetic predictions, "uttering things not to be laughed at".

![Michelangelo's Delphic Sibyl, Sistine Chapel ceiling](https://upload.wikimedia.org/wikipedia/commons/thumb/1/19/DelphicSibylByMichelangelo.jpg/471px-DelphicSibylByMichelangelo.jpg)

Michelangelo's Delphic Sibyl, in the Sistine Chapel ceiling.


            
