# myla (PyPI metadata)

* Name: myla
* Version: 0.2.24
* Summary: A local implementation of the OpenAI Assistants API: Myla stands for MY Local Assistant
* Homepage: https://github.com/muyuworks/myla
* Author email: shenggong.wang@gmail.com
* Upload time: 2024-04-11 09:21:27
* Requires Python: >=3.9
* Keywords: AI assistant, LLM, Myla, chatbot
* Requirements: uvicorn, starlette, fastapi, python-dotenv, jinja2, aiohttp, sqlmodel, openai, python-multipart, pandas, aiofiles
[English](README.md) | [简体中文](README_zh_CN.md)

# Myla: MY Local Assistants

## Self-hosting AI Assistants compatible with OpenAI

Myla stands for MY Local Assistants and is designed and optimized for deploying AI assistants based on large language models (LLMs) in a private environment. Myla provides an API compatible with the **OpenAI assistants API**, with support for multiple LLM backends. Whether on a laptop or a production server, you can quickly develop and run AI assistants.

## Key Features

* Works with the OpenAI API and any OpenAI-compatible LLM service
* Assistants API compatible with OpenAI's
* Vector retrieval (FAISS/LanceDB)
* Local embeddings via `sentence_transformers`
* Web UI
* Tool extensions
* Document Q&A (in progress)

## Quick Start
### Installation

Python version requirement: >= 3.9

Myla can be installed from PyPI using `pip`. It is recommended to create a new virtual environment before installation to avoid conflicts.

```bash
pip install myla
```

If you need retrieval support, install Myla with all optional dependencies:
```bash
pip install "myla[all]"
```

### Configuration

Myla supports using an OpenAI API-compatible LLM service as the backend. You can use the OpenAI API directly or deploy your own local LLM. If you want to deploy a local LLM, it is recommended to use [Xorbits Inference](https://github.com/xorbitsai/inference).

Create a `.env` file in the current directory with the following content:

```
# LLM configuration
LLM_ENDPOINT=https://api.openai.com/v1/
LLM_API_KEY=sk-xx
DEFAULT_LLM_MODEL_NAME=gpt-3.5-turbo
```

More configuration options can be found in [env-example.txt](env-example.txt).
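
For reference, the `.env` format is plain `KEY=VALUE` lines. Myla lists python-dotenv among its dependencies, which suggests it loads the file that way; the toy parser below is only an illustration of the key/value mapping (not Myla's actual loader) and ignores quoting edge cases:

```python
def parse_env(text):
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # comments and blank lines carry no settings
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


settings = parse_env("""\
# LLM configuration
LLM_ENDPOINT=https://api.openai.com/v1/
LLM_API_KEY=sk-xx
DEFAULT_LLM_MODEL_NAME=gpt-3.5-turbo
""")
print(settings["DEFAULT_LLM_MODEL_NAME"])  # gpt-3.5-turbo
```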

#### ChatGLM as a backend on your MacBook

Myla can run ChatGLM locally using `chatglm.cpp` as the backend. To install its Python binding, see: https://github.com/li-plus/chatglm.cpp#python-binding

`.env` configuration example:

```
DEFAULT_LLM_MODEL_NAME=chatglm@/Users/shellc/Workspaces/chatglm.cpp/chatglm-ggml.bin
```

### Start

```bash
myla
```

or

```bash
python -m myla
```

For more startup options:
```bash
myla --help
```

### WebUI

Myla provides a simple web interface that makes it easy to develop and debug assistants.

Access from your browser: http://localhost:2000/

![Screenshot](myla/webui/static/images/screenshot.png)

### API

You can use the OpenAI Python SDK directly to access the Assistants API.

* API Docs: http://localhost:2000/api/docs
* Swagger: http://localhost:2000/api/swagger
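
A minimal sketch of driving a Myla assistant through the OpenAI Python SDK (`openai>=1.x`). This is untested against a live server: the `base_url` path (`/api/v1`) is an assumption inferred from the API docs URL above, and the API key and model name are placeholders to adapt to your deployment; the `client.beta.*` calls are the SDK's standard Assistants API surface.

```python
import time


def poll_until_done(fetch, interval=0.5, timeout=60.0):
    """Call fetch() until the returned run leaves the queued/in_progress states."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        run = fetch()
        if run.status not in ("queued", "in_progress"):
            return run
        time.sleep(interval)
    raise TimeoutError("run did not finish in time")


def main():
    # Imported lazily so the polling helper above stays stdlib-only.
    from openai import OpenAI

    # base_url is an assumption based on the /api/docs URL; adjust as needed.
    client = OpenAI(base_url="http://localhost:2000/api/v1", api_key="sk-local")

    assistant = client.beta.assistants.create(
        name="demo",
        instructions="You are a helpful assistant.",
        model="gpt-3.5-turbo",
    )
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content="Hello!"
    )
    run = client.beta.threads.runs.create(
        thread_id=thread.id, assistant_id=assistant.id
    )
    run = poll_until_done(
        lambda: client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
    )
    for message in client.beta.threads.messages.list(thread_id=thread.id):
        print(message.role, message.content)


# main()  # uncomment to run against a live Myla server
```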


## Community

Myla is still under rapid development, and community contributions are welcome.

            
