ai-microcore

Name: ai-microcore
Version: 3.8.0
Summary: Minimalistic Foundation for AI Applications
Upload time: 2024-04-27 13:19:32
Requires Python: >=3.10
Keywords: llm, large language models, ai, similarity search, ai search, gpt, openai
<p align="right">
    <a href="https://github.com/Nayjest/ai-microcore/releases" target="_blank"><img src="https://img.shields.io/github/release/ai-microcore/microcore" alt="Release Notes"></a>
    <a href="https://app.codacy.com/gh/Nayjest/ai-microcore/dashboard?utm_source=gh&utm_medium=referral&utm_content=&utm_campaign=Badge_grade" target="_blank"><img src="https://app.codacy.com/project/badge/Grade/441d03416bc048828c649129530dcbc3" alt="Code Quality"></a>
    <a href="https://github.com/Nayjest/ai-microcore/actions/workflows/pylint.yml" target="_blank"><img src="https://github.com/Nayjest/ai-microcore/actions/workflows/pylint.yml/badge.svg" alt="Pylint"></a>
    <a href="https://github.com/Nayjest/ai-microcore/actions/workflows/tests.yml" target="_blank"><img src="https://github.com/Nayjest/ai-microcore/actions/workflows/tests.yml/badge.svg" alt="Tests"></a>
    <a href="https://github.com/Nayjest/ai-microcore/blob/main/LICENSE" target="_blank"><img src="https://img.shields.io/static/v1?label=license&message=MIT&color=d08aff" alt="License"></a>
</p>


# AI MicroCore: A Minimalistic Foundation for AI Applications

**microcore** is a collection of Python adapters for Large Language Model
and semantic search APIs that lets you
communicate with these services in a convenient way, switch between them easily,
and keep your business logic separate from implementation details.

It defines interfaces for the features typically used in AI applications,
which lets you keep your application as simple as possible and try various models and services
without changing your application code.

You can even switch between text completion and chat completion models through configuration alone.

A basic usage example:

```python
from microcore import llm

while user_msg := input('Enter message: '):
    print('AI: ' + llm(user_msg))
```
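
For instance, the configuration-driven model switching mentioned above might look like the following. This is a minimal sketch, assuming a `MODEL` configuration option (it appears in the bundled `.env` examples) and assuming `configure()` may be called again to reconfigure; option names for your provider may differ:

```python
from microcore import configure, llm

# Assumption: MODEL is a supported configuration option, as in the
# bundled .env examples; swap its value to change the underlying model.
configure(MODEL='gpt-3.5-turbo-instruct')  # a text completion model
print(llm('The capital of France is'))

configure(MODEL='gpt-4')  # a chat completion model
print(llm('What is the capital of France?'))
```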

## πŸ”— Links

 -   [API Reference](https://ai-microcore.github.io/api-reference/)
 -   [PyPI Package](https://pypi.org/project/ai-microcore/)
 -   [GitHub Repository](https://github.com/Nayjest/ai-microcore)


## πŸ’» Installation

Install as a PyPI package:
```bash
pip install ai-microcore
```

Alternatively, you may simply copy the `microcore` folder into your project's source root.
```bash
git clone git@github.com:Nayjest/ai-microcore.git && mv ai-microcore/microcore ./ && rm -rf ai-microcore
```


## πŸ“‹ Requirements

Python 3.10+

Both v0.28.X and v1.X.X versions of the OpenAI package are supported.


## βš™οΈ Configuring

### Minimal Configuration

Having `OPENAI_API_KEY` in OS environment variables is enough for basic usage.

Similarity search features will work out of the box if you have the `chromadb` pip package installed.
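
As a sketch, the minimal setup amounts to exporting the key and calling `llm` (the key is set from Python here purely for illustration; in practice you would set it in your shell or a `.env` file):

```python
import os

# For illustration only: normally OPENAI_API_KEY lives in your shell
# environment or a .env file, not in source code.
os.environ['OPENAI_API_KEY'] = 'sk-...'

from microcore import llm

print(llm('Hello!'))
```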

### Configuration Methods

There are a few options available for configuring microcore:

-   Use `microcore.configure()`
    <br>πŸ’‘ <small>All configuration options should be available in IDE autocompletion tooltips</small>
-   Create a `.env` file in your project root; examples: [basic.env](https://github.com/Nayjest/ai-microcore/blob/main/.env.example), [Mistral Large.env](https://github.com/Nayjest/ai-microcore/blob/main/.env.mistral.example), [Anthropic Claude 3 Opus.env](https://github.com/Nayjest/ai-microcore/blob/main/.env.anthropic.example), [Gemini on Vertex AI.env](https://github.com/Nayjest/ai-microcore/blob/main/.env.google-vertex-gemini.example), [Gemini on AI Studio.env](https://github.com/Nayjest/ai-microcore/blob/main/.env.gemini.example)
-   Use a custom configuration file: `mc.configure(DOT_ENV_FILE='dev-config.ini')`
-   Define OS environment variables

For the full list of available configuration options, you may also check [`microcore/config.py`](https://github.com/Nayjest/ai-microcore/blob/main/microcore/config.py).
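
Combining the options above, a programmatic setup might look like this sketch (both option names are taken from this README):

```python
import microcore as mc

mc.configure(
    DOT_ENV_FILE='dev-config.ini',               # custom configuration file
    PROMPT_TEMPLATES_PATH='my_templates_folder'  # where tpl() looks for templates
)
```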

### Installing vendor-specific packages
For models that are not accessed via the OpenAI API, you may need to install additional packages:
#### Anthropic Claude 3
```bash
pip install anthropic
```
#### Google Gemini via AI Studio
```bash
pip install google-generativeai
```
#### Google Gemini via Vertex AI
```bash
pip install vertexai
```
πŸ“Œ Additionally, to work through [Vertex AI](https://cloud.google.com/vertex-ai) you need to
[install the Google Cloud CLI](https://cloud.google.com/sdk/docs/install)
and [configure the authorization](https://cloud.google.com/sdk/docs/authorizing).

### Priority of Configuration Sources

1.  Configuration options passed as arguments to `microcore.configure()` have the highest priority.
2.  The priority of configuration file options (`.env` by default or the value of `DOT_ENV_FILE`) is higher than OS environment variables.
    <br>πŸ’‘ <small>Setting `USE_DOT_ENV` to `false` disables reading configuration files.</small>
3.  OS environment variables have the lowest priority.
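
A minimal sketch of this precedence (assuming a `MODEL` option, which appears in the bundled `.env` examples; the values are illustrative):

```python
import os
import microcore as mc

# Lowest priority: OS environment variable
os.environ['MODEL'] = 'model-from-env'

# Middle priority: a `MODEL=model-from-dotenv` line in .env
# would override the OS variable above.

# Highest priority: arguments to configure() win over both.
mc.configure(MODEL='model-from-args')
```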


## 🌟 Core Functions

### llm(prompt: str, \*\*kwargs) β†’ str

Performs a request to a large language model (LLM).

Asynchronous variant: `allm(prompt: str, **kwargs)`

```python
from microcore import *

# Will print all requests and responses to console
use_logging()

# Basic usage
ai_response = llm('What is your model name?')

# You may also pass a list of strings as the prompt
# - For chat completion models, elements are treated as separate messages
# - For text completion models, elements are treated as text lines
llm(['1+2', '='])
llm('1+2=', model='gpt-4')

# To specify a message role, you can use a dictionary or dedicated classes
llm(dict(role='system', content='1+2='))
# equivalent
llm(SysMsg('1+2='))

# The returned value is a string...
assert '7' == llm([
    SysMsg('You are a calculator'),
    UserMsg('1+2='),
    AssistantMsg('3'),
    UserMsg('3+4='),
]).strip()

# ...but it also carries all fields of the LLM response in additional attributes
for i in llm('1+2=?', n=3, temperature=2).choices:
    print('RESPONSE:', i.message.content)

# To use response streaming, you may specify a callback function:
llm('Hi there', callback=lambda x: print(x, end=''))

# Or multiple callbacks:
output = []
llm('Hi there', callbacks=[
    lambda x: print(x, end=''),
    lambda x: output.append(x),
])
```
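
The asynchronous variant `allm` can be awaited from async code; a minimal sketch:

```python
import asyncio

from microcore import allm


async def main():
    # Same call signature as llm(), but awaitable
    answer = await allm('1+2=')
    print(answer)

asyncio.run(main())
```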

### tpl(file_path, \*\*params) β†’ str
Renders a prompt template with the given params.

Full-featured Jinja2 templates are used by default.

Related configuration options:

```python
from microcore import configure
configure(
    # 'tpl' folder in current working directory by default
    PROMPT_TEMPLATES_PATH = 'my_templates_folder'
)
```
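
For example, given a hypothetical template `tpl/greet.j2` containing `Hello, {{ name }}! Answer in {{ lang }}.`, rendering it and sending the result to the LLM would look like this sketch:

```python
from microcore import tpl, llm

# Renders tpl/greet.j2 (the default PROMPT_TEMPLATES_PATH is 'tpl')
# with the given parameters, then sends the result to the model.
prompt = tpl('greet.j2', name='World', lang='French')
print(llm(prompt))
```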

### texts.search(collection: str, query: str | list, n_results: int = 5, where: dict = None, \*\*kwargs) β†’ list[str]
Performs a similarity search.

### texts.find_one(collection: str, query: str | list) β†’ str | None
Finds the most similar text.

### texts.get_all(collection: str) β†’ list[str]
Returns all texts in a collection.

### texts.save(collection: str, text: str, metadata: dict = None)
Stores a text and related metadata in the embeddings database.

### texts.save_many(collection: str, items: list[tuple[str, dict] | str])
Stores multiple texts and related metadata in the embeddings database.

### texts.clear(collection: str)
Clears a collection.
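
A short sketch tying these together (requires the `chromadb` package; it assumes `texts` is importable from the package root, as the signatures above suggest, and the collection name and texts are illustrative):

```python
from microcore import texts

# Store single and multiple texts with optional metadata
texts.save('docs', 'MicroCore requires Python 3.10+.', metadata={'source': 'readme'})
texts.save_many('docs', [
    ('Similarity search needs the chromadb package.', {'source': 'readme'}),
    'Configuration can come from .env files or environment variables.',
])

# Retrieve the two most similar texts...
print(texts.search('docs', 'Which Python version is required?', n_results=2))

# ...or just the single closest match
print(texts.find_one('docs', 'python version'))

texts.clear('docs')
```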

## πŸ”Œ API Providers and Models Support

MicroCore supports all models and API providers that expose an OpenAI-compatible API.

### API providers and models tested with MicroCore:

| API Provider                                                                             |                                                                                                                                      Models |
|------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------:|
| [OpenAI](https://openai.com)                                                             |                                    All GPT-4 and GPT-3.5-Turbo models<br/>all text completion models (davinci, gpt-3.5-turbo-instruct, etc.) |
| [Microsoft Azure](https://azure.microsoft.com/en-us/products/ai-services/openai-service) |                                                                                                            All OpenAI models, Mistral Large |
| [Anthropic](https://anthropic.com)                                                       |                                                                                                                             Claude 3 models |
| [MistralAI](https://mistral.ai)                                                          |                                                                                                                          All Mistral models |
| [Google AI Studio](https://aistudio.google.com/)                             |                                                                                                                        Google Gemini models |
| [Google Vertex AI](https://cloud.google.com/vertex-ai?hl=en)                             |                                                   Gemini Pro & [other models](https://cloud.google.com/vertex-ai/docs/start/explore-models) |
| [Deep Infra](https://deepinfra.com)                                                      | deepinfra/airoboros-70b<br/>jondurbin/airoboros-l2-70b-gpt4-1.4.1<br/>meta-llama/Llama-2-70b-chat-hf<br/>and other models having OpenAI API |
| [Anyscale](https://anyscale.com)                                                         |                                           meta-llama/Llama-2-70b-chat-hf<br/>meta-llama/Llama-2-13b-chat-hf<br/>meta-llama/Llama-7b-chat-hf |
| [Groq](https://groq.com/)                                                         |                                           LLaMA2 70b<br>Mixtral 8x7b<br>Gemma 7b |
| [Fireworks](https://fireworks.ai)                                                         |                                           [Over 50 open-source language models](https://fireworks.ai/models?show=All) |


## πŸ–ΌοΈ Examples

#### [code-review-tool example](https://github.com/llm-microcore/microcore/blob/main/examples/code-review-tool)
Performs an LLM-based code review of changes in git `.patch` files, in any programming language.

#### [Other examples](https://github.com/llm-microcore/microcore/tree/main/examples)

## Python functions as AI tools

@TODO

## πŸ€– AI Modules
**This is an experimental feature.**

Tweaks the Python import system to automatically set up the MicroCore environment
based on metadata in module docstrings.
### Usage:
```python
import microcore.ai_modules
```
### Features:

*   Automatically registers template folders of AI modules in the Jinja2 environment


## πŸ› οΈ Contributing

Please see [CONTRIBUTING](https://github.com/Nayjest/ai-microcore/blob/main/CONTRIBUTING.md) for details.


## πŸ“ License

Licensed under the [MIT License](https://github.com/Nayjest/ai-microcore/blob/main/LICENSE)
Β© 2023 [Vitalii Stepanenko](mailto:mail@vitaliy.in)


            
