| Name | sibila |
| --- | --- |
| Version | 0.4.1 |
| Summary | Structured queries from local or online LLM models |
| upload_time | 2024-04-29 17:51:25 |
| home_page | None |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | MIT |
| keywords | llama.cpp, ai, transformers, gpt, llm |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Sibila
Extract structured data from remote or local LLM models. Predictable output is important for serious use of LLMs.
- Query structured data into Pydantic objects, dataclasses or simple types.
- Access remote models from OpenAI, Anthropic, Mistral AI and other providers.
- Use local models like Llama-3, Phi-3, OpenChat or any other GGUF file model.
- Besides structured extraction, Sibila is also a general-purpose model access library that can generate plain text or free JSON results, with the same API for local and remote models (see the sketch after this list).
- Model management: download models, manage configuration, quickly switch between models.
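As a rough illustration of the last two points, the snippet below switches between a local and a remote model while generating plain text with the same call. `Models.create()` is taken from the examples further down; calling the model object directly for plain-text generation is an assumption based on the project docs, so treat the exact method as illustrative (free JSON output works along similar lines, per the docs).

``` python
# Sketch only: Models.create() appears in the examples below; calling the
# model object for plain-text generation is assumed from the project docs.
from sibila import Models

for name in ("llamacpp:openchat", "openai:gpt-4"):
    model = Models.create(name)  # same factory call for local and remote models
    print(model("In one sentence, why does structured output matter?"))
```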
No matter how well you craft a prompt begging a model for the output you need, it can always respond with something else. Extracting structured data is a big step toward getting predictable behavior from your models.
See [What can you do with Sibila?](https://jndiogo.github.io/sibila/what/)
To extract structured data from a local model:
``` python
from sibila import Models
from pydantic import BaseModel

class Info(BaseModel):
    event_year: int
    first_name: str
    last_name: str
    age_at_the_time: int
    nationality: str

model = Models.create("llamacpp:openchat")

model.extract(Info, "Who was the first man on the Moon?")
```
Returns an instance of class Info, created from the model's output:
``` python
Info(event_year=1969,
     first_name='Neil',
     last_name='Armstrong',
     age_at_the_time=38,
     nationality='American')
```
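Besides classes, the feature list above also mentions simple types. Here is a minimal sketch of what that can look like; the prompt and the built-in type support are assumed here rather than taken from the project's own examples:

``` python
# Sketch: extracting into a plain Python type instead of a class.
# Assumes extract() accepts built-in types, as the feature list states.
year = model.extract(int, "In which year did the first Moon landing happen?")
print(year)  # e.g. 1969
```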
To use a remote model like OpenAI's GPT-4 instead, simply replace the model name:
``` python
model = Models.create("openai:gpt-4")
model.extract(Info, "Who was the first man on the Moon?")
```
If Pydantic BaseModel classes are more than your project needs, Sibila provides the same functionality with Python dataclasses (see the sketch below). It also includes asynchronous access to remote models.
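A dataclass version of the earlier example might look like the following; this is a minimal sketch that assumes extract() accepts dataclasses directly, as the feature list above states:

``` python
from dataclasses import dataclass

# Dataclass counterpart of the Pydantic Info class above (sketch only;
# assumes extract() accepts dataclasses, per the feature list).
@dataclass
class InfoDC:
    event_year: int
    first_name: str
    last_name: str
    age_at_the_time: int
    nationality: str

model.extract(InfoDC, "Who was the first man on the Moon?")
```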
## Docs
[The docs](https://jndiogo.github.io/sibila/) explain the main concepts and include examples and an API reference.
## Installation
Sibila can be installed from PyPI by running:
```
pip install --upgrade sibila
```
See [Getting started](https://jndiogo.github.io/sibila/installing/) for more information.
## Examples
The [Examples](https://jndiogo.github.io/sibila/examples/) show what you can do with local or remote models in Sibila: structured data extraction, classification, summarization, etc.
## License
This project is licensed under the MIT License - see the [LICENSE](https://github.com/jndiogo/sibila/blob/main/LICENSE) file for details.
## Acknowledgements
Sibila wouldn't be possible without the help of great software and people:
- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- [llama-cpp-python](https://github.com/abetlen/llama-cpp-python)
- [OpenAI Python API](https://github.com/openai/openai-python)
- [TheBloke (Tom Jobbins)](https://huggingface.co/TheBloke) and [Hugging Face model hub](https://huggingface.co/)
Thank you!
## Sibila?
Sibila is the Portuguese word for Sibyl. [The Sibyls](https://en.wikipedia.org/wiki/Sibyl) were wise oracular women in ancient Greece. Their mysterious words puzzled people throughout the centuries, providing insight or prophetic predictions, "uttering things not to be laughed at".
![Michelangelo's Delphic Sibyl, Sistine Chapel ceiling](https://upload.wikimedia.org/wikipedia/commons/thumb/1/19/DelphicSibylByMichelangelo.jpg/471px-DelphicSibylByMichelangelo.jpg)
Michelangelo's Delphic Sibyl, in the Sistine Chapel ceiling.
Raw data

``` json
{
"_id": null,
"home_page": null,
"name": "sibila",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": "llama.cpp, AI, Transformers, GPT, LLM",
"author": null,
"author_email": "Jorge Diogo <jndiogo@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/a1/78/2c3874745e4b70bb3d5546ed18e68f7e01ca083ad95fcbf93941387eea57/sibila-0.4.1.tar.gz",
"platform": null,
"description": "# Sibila\n\nExtract structured data from remote or local LLM models. Predictable output is important for serious use of LLMs.\n\n- Query structured data into Pydantic objects, dataclasses or simple types.\n- Access remote models from OpenAI, Anthropic, Mistral AI and other providers.\n- Use local models like Llama-3, Phi-3, OpenChat or any other GGUF file model.\n- Besides structured extraction, Sibila is also a general purpose model access library, to generate plain text or free JSON results, with the same API for local and remote models.\n- Model management: download models, manage configuration, quickly switch between models.\n\nNo matter how well you craft a prompt begging a model for the output you need, it can always respond something else. Extracting structured data can be a big step into getting predictable behavior from your models.\n\nSee [What can you do with Sibila?](https://jndiogo.github.io/sibila/what/)\n\nTo extract structured data from a local model:\n\n``` python\nfrom sibila import Models\nfrom pydantic import BaseModel\n\nclass Info(BaseModel):\n event_year: int\n first_name: str\n last_name: str\n age_at_the_time: int\n nationality: str\n\nmodel = Models.create(\"llamacpp:openchat\")\n\nmodel.extract(Info, \"Who was the first man in the moon?\")\n```\n\nReturns an instance of class Info, created from the model's output:\n\n``` python\nInfo(event_year=1969,\n first_name='Neil',\n last_name='Armstrong',\n age_at_the_time=38,\n nationality='American')\n```\n\nOr to use a remote model like OpenAI's GPT-4, we would simply replace the model's name:\n\n``` python\nmodel = Models.create(\"openai:gpt-4\")\n\nmodel.extract(Info, \"Who was the first man in the moon?\")\n```\n\nIf Pydantic BaseModel objects are too much for your project, Sibila supports similar functionality with Python dataclass. Also includes asynchronous access to remote models.\n\n\n\n\n## Docs\n\n[The docs explain](https://jndiogo.github.io/sibila/) the main concepts, include examples and an API reference.\n\n\n## Installation\n\nSibila can be installed from PyPI by doing:\n\n```\npip install --upgrade sibila\n```\n\nSee [Getting started](https://jndiogo.github.io/sibila/installing/) for more information.\n\n\n\n## Examples\n\nThe [Examples](https://jndiogo.github.io/sibila/examples/) show what you can do with local or remote models in Sibila: structured data extraction, classification, summarization, etc.\n\n\n\n## License\n\nThis project is licensed under the MIT License - see the [LICENSE](https://github.com/jndiogo/sibila/blob/main/LICENSE) file for details.\n\n\n## Acknowledgements\n\nSibila wouldn't be be possible without the help of great software and people:\n\n- [llama.cpp](https://github.com/ggerganov/llama.cpp)\n- [llama-cpp-python](https://github.com/abetlen/llama-cpp-python)\n- [OpenAI Python API](https://github.com/openai/openai-python)\n- [TheBloke (Tom Jobbins)](https://huggingface.co/TheBloke) and [Hugging Face model hub](https://huggingface.co/)\n\nThank you!\n\n\n## Sibila?\n\nSibila is the Portuguese word for Sibyl. [The Sibyls](https://en.wikipedia.org/wiki/Sibyl) were wise oracular women in ancient Greece. 
Their mysterious words puzzled people throughout the centuries, providing insight or prophetic predictions, \"uttering things not to be laughed at\".\n\n![Michelangelo's Delphic Sibyl, Sistine Chapel ceiling](https://upload.wikimedia.org/wikipedia/commons/thumb/1/19/DelphicSibylByMichelangelo.jpg/471px-DelphicSibylByMichelangelo.jpg)\n\nMichelangelo's Delphic Sibyl, in the Sistine Chapel ceiling.\n\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "Structured queries from local or online LLM models",
"version": "0.4.1",
"project_urls": {
"Documentation": "https://jndiogo.github.io/sibila",
"Homepage": "https://github.com/jndiogo/sibila",
"Issues": "https://github.com/jndiogo/sibila/issues"
},
"split_keywords": [
"llama.cpp",
" ai",
" transformers",
" gpt",
" llm"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "dc414858c64b7648cbe4723d3276f40309b26cc261b051e83c595ce1b68d41bb",
"md5": "be7d9842d61f7e90fd78e46c80adecb3",
"sha256": "adf811fb374b7c46674cc4dca56c55e377fa0dc4056cbca4b54baa4bdf02b832"
},
"downloads": -1,
"filename": "sibila-0.4.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "be7d9842d61f7e90fd78e46c80adecb3",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 89295,
"upload_time": "2024-04-29T17:51:23",
"upload_time_iso_8601": "2024-04-29T17:51:23.432842Z",
"url": "https://files.pythonhosted.org/packages/dc/41/4858c64b7648cbe4723d3276f40309b26cc261b051e83c595ce1b68d41bb/sibila-0.4.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "a1782c3874745e4b70bb3d5546ed18e68f7e01ca083ad95fcbf93941387eea57",
"md5": "531fd592edf3b6f3620600ecad2c0487",
"sha256": "af575ceb75890339a97ac27d8d67d97ee600210c52126a522a107235fa88e12f"
},
"downloads": -1,
"filename": "sibila-0.4.1.tar.gz",
"has_sig": false,
"md5_digest": "531fd592edf3b6f3620600ecad2c0487",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 99453,
"upload_time": "2024-04-29T17:51:25",
"upload_time_iso_8601": "2024-04-29T17:51:25.723594Z",
"url": "https://files.pythonhosted.org/packages/a1/78/2c3874745e4b70bb3d5546ed18e68f7e01ca083ad95fcbf93941387eea57/sibila-0.4.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-04-29 17:51:25",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "jndiogo",
"github_project": "sibila",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "sibila"
}
```