<div style="text-align: left; margin-bottom: 20px;">
<img src="https://kiss-ai-stack.github.io/kissaistack.svg" alt="KISS AI Stack Banner" style="max-width: 100%; height: 250px">
</div>
# KISS AI Stack - Core
**Effortless AI Stack Building**
Welcome to the core of the **KISS AI Stack**! This module helps you build a stack effortlessly using a simple YAML configuration file. Say goodbye to boilerplate code and embrace minimalism with the **KISS principle** (Keep It Simple, Stupid).
---
## Features
- **Centralized Stack Management**: Manage multiple session-based AI stacks with lifecycle support.
- **Minimal Dependencies**: Built using simple, vanilla vendor libraries.
- **Tool Classification**: Configure tools for your stack to handle specific tasks easily.
- **Supports RAG and Prompt-Based Models**: Choose the model type that suits your needs.
- **Thread-Safe**: Reliable operation in multi-threaded environments.
---
## Installation
Install the core module using pip:
```bash
pip install kiss-ai-stack-core
```
---
## Example Configuration
Here’s an example YAML configuration to set up an AI stack with different tools:
```yaml
stack:
  decision_maker:  # Required for tool classification
    name: decision_maker
    role: classify tools for given queries
    kind: prompt  # Choose from 'rag' or 'prompt'
    ai_client:
      provider: openai
      model: gpt-4
      api_key: <your-api-key>

  tools:
    - name: general_queries
      role: process other queries if no suitable tool is found.
      kind: prompt
      ai_client:
        provider: openai
        model: gpt-4
        api_key: <your-api-key>

    - name: document_tool
      role: process documents and provide answers based on them.
      kind: rag  # Retrieval-Augmented Generation
      embeddings: text-embedding-ada-002
      ai_client:
        provider: openai
        model: gpt-4
        api_key: <your-api-key>

  vector_db:
    provider: chroma
    kind: remote  # Choose in-memory, storage, or remote options.
    host: 0.0.0.0
    port: 8000
    secure: false
```
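Since the configuration is plain YAML, you can inspect it with PyYAML (already a dependency of this package). The snippet below only illustrates the structure of the example above with a trimmed inline copy; the library's own loader and validation may differ.

```python
# Illustrative only: parse a trimmed copy of the example configuration
# with PyYAML and inspect its structure. Not the library's own loader.
import yaml

CONFIG = """
stack:
  decision_maker:
    name: decision_maker
    kind: prompt
  tools:
    - name: general_queries
      kind: prompt
    - name: document_tool
      kind: rag
      vector_db:
        provider: chroma
        kind: remote
"""

config = yaml.safe_load(CONFIG)
stack = config["stack"]

# The decision maker classifies queries; each tool handles one kind of task.
tool_names = [tool["name"] for tool in stack["tools"]]
rag_tools = [t for t in stack["tools"] if t["kind"] == "rag"]

print(tool_names)                             # ['general_queries', 'document_tool']
print(rag_tools[0]["vector_db"]["provider"])  # chroma
```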
---
## Example Python Usage
Use the core module to build and interact with your AI stack:
```python
import asyncio

from kiss_ai_stack import Stacks


async def main():
    try:
        # Initialize a temporary stack session
        await Stacks.bootstrap_stack(stack_id="my_stack", temporary=True)

        # Process a query
        response = await Stacks.generate_answer(
            stack_id="my_stack",
            query="What is Retrieval-Augmented Generation?"
        )
        print(response.answer)
    except Exception as ex:
        print(f"An error occurred: {ex}")


# Run the example
asyncio.run(main())
```
---
## How It Works
1. **Stack Initialization**: Use `Stacks.bootstrap_stack` to initialize a stack with its configuration and resources.
2. **Query Processing**: Process queries with `Stacks.generate_answer`, leveraging the tools and AI clients defined in the YAML configuration.
3. **Tool Management**: Define tools to handle specific tasks like document processing or query classification.
4. **Vector Database**: Use the `vector_db` section to define how document embeddings are stored and retrieved for RAG-based tasks. Currently, only `Chroma` is supported.
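To make step 3 concrete, here is a toy illustration of routing a query to a tool. In the real stack this decision is delegated to the prompt-based `decision_maker`, so the keyword heuristic and tool specs below are purely hypothetical:

```python
# Toy sketch of the decision-maker idea: pick the most suitable tool
# for a query. The real stack uses an LLM prompt for this; the keyword
# matching here is only an illustration of the routing concept.
TOOLS = {
    "document_tool": {"keywords": {"document", "pdf", "file", "report"}},
    "general_queries": {"keywords": set()},  # fallback tool
}

def classify(query: str) -> str:
    words = set(query.lower().split())
    for name, spec in TOOLS.items():
        if spec["keywords"] & words:
            return name
    # "process other queries if no suitable tool is found"
    return "general_queries"

print(classify("Summarize this PDF report"))  # document_tool
print(classify("What is RAG?"))               # general_queries
```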
---
## Documentation
### Key Methods
- `bootstrap_stack(stack_id: str, temporary: bool)`: Initialize a new stack session.
- `generate_answer(stack_id: str, query: Union[str, Dict, List])`: Process a query and return a response.
### Configuration Highlights
- **AI Client**: Configure the provider, model, and API key for supported services like OpenAI.
- **Tools**: Define tools such as general-purpose query handlers or document processors.
- **Vector Database**: Set up in-memory or persistent storage for RAG-based tasks.
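As a conceptual sketch of what the vector database does for a RAG tool: store one embedding per chunk, then return the chunk most similar to the query embedding. In practice the embeddings would come from a model such as `text-embedding-ada-002` and storage from Chroma; the bag-of-words cosine similarity below is a self-contained stand-in, not the library's implementation:

```python
# Conceptual RAG retrieval: embed documents, embed the query, return the
# nearest document by cosine similarity. Bag-of-words vectors stand in
# for real embeddings so the example runs without external services.
import math
from collections import Counter

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "retrieval augmented generation combines search with llms",
    "chroma is a vector database for embeddings",
]
index = [(doc, embed(doc)) for doc in docs]

def retrieve(query: str) -> str:
    q = embed(query)
    return max(index, key=lambda pair: cosine(q, pair[1]))[0]

print(retrieve("what is retrieval augmented generation"))
```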
---
## Contributing
We welcome contributions! Submit pull requests or open issues to improve this stack.
---
## License
This project is licensed under the MIT License. See the [LICENSE](./LICENSE) file for details.
## Raw data
{
"_id": null,
"home_page": null,
"name": "kiss-ai-stack-core",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.12",
"maintainer_email": null,
"keywords": "ai, stack, rag, prompt, yaml, machine-learning, llm, document-processing",
"author": "KISS AI Stack, Lahiru Pathirage",
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/2b/80/f166d02aa9e8f5815594a6e273f6a7c102d6539a360b7caf4882d1268106/kiss_ai_stack_core-0.1.0a28.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": "MIT",
"summary": "KISS AI Stack's RAG builder core",
"version": "0.1.0a28",
"project_urls": {
"Documentation": "https://github.com/kiss-ai-stack/kiss-ai-stack-core/main/README.md",
"Homepage": "https://github.com/kiss-ai-stack",
"Repository": "https://github.com/kiss-ai-stack"
},
"split_keywords": [
"ai",
" stack",
" rag",
" prompt",
" yaml",
" machine-learning",
" llm",
" document-processing"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "edf05094ada3618ee302812cfc9f3e7b50c3af1187128eea35e70f334453efef",
"md5": "580fa3887b235cdb22e44b0faac825be",
"sha256": "4316b8a573a94badd3fad8a651f1ef06a8205b5017078b6a482d194b7841b231"
},
"downloads": -1,
"filename": "kiss_ai_stack_core-0.1.0a28-py3-none-any.whl",
"has_sig": false,
"md5_digest": "580fa3887b235cdb22e44b0faac825be",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.12",
"size": 33393,
"upload_time": "2024-12-21T17:08:15",
"upload_time_iso_8601": "2024-12-21T17:08:15.935165Z",
"url": "https://files.pythonhosted.org/packages/ed/f0/5094ada3618ee302812cfc9f3e7b50c3af1187128eea35e70f334453efef/kiss_ai_stack_core-0.1.0a28-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "2b80f166d02aa9e8f5815594a6e273f6a7c102d6539a360b7caf4882d1268106",
"md5": "3fd6fe04fa1ee5d14992b272531b9e5d",
"sha256": "85f3438f597cee772aaf643ea5f702b1c56d34099530670c139fb6fa895bd752"
},
"downloads": -1,
"filename": "kiss_ai_stack_core-0.1.0a28.tar.gz",
"has_sig": false,
"md5_digest": "3fd6fe04fa1ee5d14992b272531b9e5d",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.12",
"size": 24745,
"upload_time": "2024-12-21T17:08:18",
"upload_time_iso_8601": "2024-12-21T17:08:18.500252Z",
"url": "https://files.pythonhosted.org/packages/2b/80/f166d02aa9e8f5815594a6e273f6a7c102d6539a360b7caf4882d1268106/kiss_ai_stack_core-0.1.0a28.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-12-21 17:08:18",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "kiss-ai-stack",
"github_project": "kiss-ai-stack-core",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"requirements": [
{
"name": "aiofiles",
"specs": [
[
"~=",
"24.1.0"
]
]
},
{
"name": "asyncio",
"specs": [
[
"~=",
"3.4.3"
]
]
},
{
"name": "PyYAML",
"specs": [
[
"~=",
"6.0.2"
]
]
},
{
"name": "pydantic",
"specs": [
[
"~=",
"2.10.3"
]
]
},
{
"name": "setuptools",
"specs": [
[
"~=",
"75.6.0"
]
]
},
{
"name": "unstructured",
"specs": [
[
"~=",
"0.16.8"
]
]
},
{
"name": "tiktoken",
"specs": [
[
"~=",
"0.8.0"
]
]
},
{
"name": "tokenizers",
"specs": [
[
"~=",
"0.20.3"
]
]
},
{
"name": "pandas",
"specs": [
[
"~=",
"2.2.3"
]
]
},
{
"name": "numpy",
"specs": [
[
"~=",
"1.26.4"
]
]
}
],
"lcname": "kiss-ai-stack-core"
}