# Assistants Framework
Welcome to the AI Assistants Framework! This repository contains the foundational code for creating versatile AI assistants capable of interacting through various front-end interfaces and utilizing interchangeable data layers. The goal is to create a powerful yet flexible assistants framework that can adapt to different user needs and environments.
## Table of Contents
- [Features](#features)
- [Installation](#installation)
- [Usage](#usage)
- [Environment Variables](#environment-variables)
- [Contributing](#contributing)
- [License](#license)
## Features
- **Multi-Front-End Support**: The AI assistant (configured via environment variables) can interact through different user interfaces, including CLI and Telegram.
- **User Data Management**: Efficient handling of user data with a robust backend.
- **Interchangeable Data Layers**: Easily swap out the underlying data storage; SQLite is supported now, with support for other databases coming soon.
- **Extensible Architecture**: Built with modularity in mind, allowing for easy addition of new features and integrations.
## Installation
To get started with the Assistants Framework, follow these steps:
- \[Optional\] Create a Python virtual environment (recommended, but not required). Note that the package requires Python 3.10+.

(A simple way is to use the built-in `venv` module, e.g. `python -m venv my-venv && source my-venv/bin/activate`.)
- Install the package using pip:
```bash
pip install assistants-framework
```
You can then run the following command to start the CLI:
```bash
$ ai-cli
```
NOTE: if your virtual environment is not activated, you may need to run `/path/to/venv/bin/ai-cli` instead of just `ai-cli`. Consider adding the virtual environment's `bin` directory to your `PATH`, symlinking the executable into a directory already on your `PATH`, or creating an alias.
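For example, assuming the environment was created at `~/my-venv` (a placeholder path; adjust it to match your setup), you could expose the executable like this:

```bash
# Add the virtual environment's bin directory to PATH (path is illustrative):
echo 'export PATH="$HOME/my-venv/bin:$PATH"' >> ~/.bashrc

# ...or create an alias for the executable instead:
echo 'alias ai-cli="$HOME/my-venv/bin/ai-cli"' >> ~/.bashrc

# Reload the shell configuration:
source ~/.bashrc
```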
If you wish to use the Telegram bot interface, you can install the additional dependencies:
```bash
pip install "assistants-framework[telegram]"
```
## Usage
### Command Line Interface
For help running the assistant through the CLI, simply run:
```
$ ai-cli --help
usage: ai-cli [-h] [-e] [-f INPUT_FILE] [-i INSTRUCTIONS_FILE] [-t] [-C] [prompt ...]
CLI for AI Assistant
positional arguments:
prompt Positional arguments concatenate into a single prompt. E.g. `ai-cli
Is this a single prompt\?` (question mark escaped) ...will be passed
to the program as a single string (without the backslash). You can
also use quotes to pass a single argument with spaces and special
characters. See the -e and -f options for more advanced prompt
options.
options:
-h, --help show this help message and exit
-e, --editor Open the default editor to compose a prompt.
-f INPUT_FILE, --input-file INPUT_FILE
Read the initial prompt from a file (e.g., 'input.txt').
-i INSTRUCTIONS_FILE, --instructions INSTRUCTIONS_FILE
Read the initial instructions (system message) from a specified
file; if not provided, environment variable `ASSISTANT_INSTRUCTIONS`
or defaults will be used.
-t, --continue-thread
Continue previous thread. (not currently possible with `-C` option)
-C, --code Use specialised reasoning/code model. WARNING: This model will be
slower and more expensive to use.
```
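A few example invocations, using only the options documented above (file names here are placeholders):

```bash
# Pass the prompt directly as positional arguments (quotes avoid shell expansion):
ai-cli "Summarise the main differences between TCP and UDP"

# Compose the prompt in your default editor:
ai-cli -e

# Read the initial prompt from a file, with custom instructions from another file:
ai-cli -f input.txt -i instructions.txt

# Continue the previous thread with a follow-up question:
ai-cli -t "Can you expand on that last point?"
```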
### Telegram Bot
To run the Telegram bot polling loop, use the following command:
```bash
$ ai-tg-bot
```
You can customize the behavior of the assistant by modifying the `ASSISTANT_INSTRUCTIONS` environment variable, which defaults to `"You are a helpful assistant."`
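For example, a minimal way to start the bot might look like this (the token value is a placeholder; the real one is issued by Telegram's @BotFather):

```bash
# Placeholder token; use the one issued for your bot:
export TG_BOT_TOKEN="123456789:replace-with-your-bot-token"

# Optional: override the default system message:
export ASSISTANT_INSTRUCTIONS="You are a helpful assistant that answers concisely."

# Start the polling loop:
ai-tg-bot
```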
## Environment Variables
In addition to `ASSISTANT_INSTRUCTIONS`, other environment variables that can be configured include:
- `ASSISTANTS_API_KEY_NAME` - The name of the environment variable to read the API key from (defaults to `OPENAI_API_KEY`). Remember to also set the actual API key as the value of whichever variable you choose (or the default); see the example after this list.
- `DEFAULT_MODEL` - The default model to use for OpenAI API requests (defaults to `gpt-4o-mini`)
- `CODE_MODEL` - The more advanced reasoning model to use for OpenAI API requests (defaults to `o1-mini`)
- `ASSISTANTS_DATA_DIR` - The directory to store user data (defaults to `~/.local/share/assistants`)
- `ASSISTANTS_CONFIG_DIR` - The directory to store configuration files (defaults to `~/.config/assistants`)
- `TG_BOT_TOKEN` - The Telegram bot token if using the Telegram UI
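For example, a shell profile might configure these variables as follows (all values shown are illustrative):

```bash
# Point the framework at a differently named key variable...
export ASSISTANTS_API_KEY_NAME="MY_OPENAI_KEY"
# ...and remember to set that variable to the actual API key:
export MY_OPENAI_KEY="sk-..."

# Override the default models:
export DEFAULT_MODEL="gpt-4o"
export CODE_MODEL="o1-preview"

# Store user data and configuration somewhere other than the defaults:
export ASSISTANTS_DATA_DIR="$HOME/assistants-data"
export ASSISTANTS_CONFIG_DIR="$HOME/assistants-config"
```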
## Contributing
Contributions are welcome! If you have suggestions for improvements, please feel free to submit a pull request or open an issue.
1. Fork the repository.
2. Commit your changes.
3. Open a pull request.
See the dev dependencies in `dev_requirements.txt` for formatting and linting tools.
### TODOs
- Optional local threads API built on top of LangChain
- Add PostgreSQL support for the data layer
- Add support for more models/APIs
## License
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
---
Thank you for checking out the Assistants Framework! I hope you find it useful and inspiring. Have a look at the examples directory to see the assistant in action!