[English](README.md) | [简体中文](README_zh_CN.md)
# Myla: MY Local Assistants
## Self-hosted AI Assistants compatible with OpenAI
Myla stands for MY Local Assistants. It is designed and optimized for deploying AI assistants based on large language models (LLMs) in a private environment. Myla provides an API compatible with the **OpenAI Assistants API** and supports multiple LLM backends. Whether on a laptop or a production server, you can quickly develop and run AI assistants.
## Key Features
* Works with the OpenAI API and compatible LLM services
* Assistants API compatible with OpenAI's
* Vector retrieval (FAISS/LanceDB)
* Embeddings via `sentence_transformers`
* WebUI
* Tool extensions
* Document Q&A (in progress)
## Quick Start
### Installation
Python version requirement: >= 3.9
Myla can be installed from PyPI using `pip`. Creating a fresh virtual environment before installation is recommended to avoid dependency conflicts.
```bash
pip install myla
```
If you need retrieval, install all optional dependencies:
```bash
pip install "myla[all]"
```
### Configuration
Myla supports using an OpenAI API-compatible LLM service as the backend. You can use the OpenAI API directly or deploy your own local LLM. If you want to deploy a local LLM, it is recommended to use [Xorbits Inference](https://github.com/xorbitsai/inference).
Create a `.env` file in the current directory with the following content:
```
# LLM configuration
LLM_ENDPOINT=https://api.openai.com/v1/
LLM_API_KEY=sk-xx
DEFAULT_LLM_MODEL_NAME=gpt-3.5-turbo
```
More configurations can be found in: [env-example.txt](env-example.txt)
#### ChatGLM as a backend on your MacBook
Myla supports running ChatGLM locally using `chatglm.cpp` as the backend. To install the Python binding, refer to: https://github.com/li-plus/chatglm.cpp#python-binding
`.env` configuration example:
```
DEFAULT_LLM_MODEL_NAME=chatglm@/Users/shellc/Workspaces/chatglm.cpp/chatglm-ggml.bin
```
### Start
```bash
myla
```
or
```bash
python -m myla
```
For more startup options:
```bash
myla --help
```
### WebUI
Myla provides a simple web interface that makes it easy to develop and debug assistants.
Open it in your browser: http://localhost:2000/
![Screenshot](myla/webui/static/images/screenshot.png)
### API
You can use the OpenAI Python SDK directly to access the Assistants API.
* API Docs: http://localhost:2000/api/docs
* Swagger: http://localhost:2000/api/swagger
## Community
Myla is still under rapid development, and community contributions are welcome.