unillm

Name: unillm
Version: 0.0.8 (PyPI)
Home page: https://github.com/fuzihaofzh/unillm
Summary: Unified Large Language Model Interface for ChatGPT, LLaMA, Mistral, Claude, and RAG
Upload time: 2024-04-19 08:07:23
Maintainer: None
Docs URL: None
Author: Your Name
Requires Python: None
License: MIT
Keywords: language models, AI, NLP, ChatGPT, Llama, Mistral, Claude, MistralAI, RAG
Requirements: openai, torch, transformers, yaml, peft, anthropic, mistralai, llama_index, fire
# UniLLM: Unified Large Language Model Interface

[![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
[![PyPI](https://img.shields.io/pypi/v/unillm.svg)](https://pypi.org/project/unillm/)
[![GitHub stars](https://img.shields.io/github/stars/fuzihaofzh/unillm?style=social)](https://github.com/fuzihaofzh/unillm)
[![Documentation Status](https://readthedocs.org/projects/unillm/badge/?version=latest)](https://unillm.readthedocs.io/en/latest/?badge=latest)

UniLLM is a versatile Python library and command-line tool that provides unified access to a range of large language models, including [ChatGPT](https://openai.com/chatgpt), [Llama2](https://llama.meta.com/), [Mistral](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2), [Claude](https://www.anthropic.com/), [MistralAI](https://mistral.ai/), [Llama3](https://llama.meta.com/), and [CommandRPlus](https://cohere.ai/), as well as retrieval-augmented generation ([RAG](https://www.llamaindex.ai/)) via LlamaIndex. It simplifies integrating these models into your projects and also supports direct interaction from the command line.

## Features

- Unified API for interacting with multiple language models.
- Support for both API and local models.
- Extensible framework allowing the addition of more models in the future.
- Command-line tool for easy interaction with models.
- Configuration via YAML file for API keys.

## Installation

Install UniLLM using pip:

```bash
pip install unillm
```

## Configuration

Configure your API keys for the models by creating a `.unillm.yaml` file in your home directory:

```yaml
chatgpt: YOUR_CHATGPT_API_KEY
claude: YOUR_CLAUDE_API_KEY
mistralai: YOUR_MISTRALAI_API_KEY
# Add other model API keys as needed
```
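UniLLM reads these keys for you at runtime. If you want to sanity-check your configuration from Python first, a minimal sketch along these lines works; this is illustrative only, not the library's internal loading code, and it assumes PyYAML (the `yaml` requirement) is installed:

```python
from pathlib import Path
import yaml  # PyYAML, listed in the package requirements

# Load the per-user config from the home directory (illustrative check only).
config_path = Path.home() / ".unillm.yaml"
keys = (yaml.safe_load(config_path.read_text()) or {}) if config_path.exists() else {}

for model in ("chatgpt", "claude", "mistralai"):
    status = "set" if keys.get(model) else "missing"
    print(f"{model}: API key {status}")
```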

## Supported Models

| Model         | API Support | Local Support |
|---------------|:-----------:|:-------------:|
| ChatGPT       | ✅          |               |
| Llama2        |             | ✅            |
| Mistral       | ✅          | ✅            |
| Claude        | ✅          |               |
| MistralAI     | ✅          |               |
| RAG           | ✅          | ✅            |
| Llama3        |             | ✅            |
| CommandRPlus  |             | ✅            |

## Usage

### As a Python Library

Interact with language models seamlessly in your Python projects:

```python
from unillm import UniLLM

# Initialize Llama2 with a PEFT adapter path and generation settings
model = UniLLM('Llama2', peft_path="path_to_peft_model", max_new_tokens=1024)

# Generate a response
response = model.generate_response("How can AI help humans?")
print(response)
```
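API-backed models follow the same pattern. The sketch below assumes the model name matches the table above and that the corresponding key is already present in `~/.unillm.yaml`; the local-model keyword arguments shown earlier are not needed here:

```python
from unillm import UniLLM

# API-backed model: no local weights or PEFT path required; the API key is
# picked up from ~/.unillm.yaml (see Configuration above).
chatgpt = UniLLM('ChatGPT')
print(chatgpt.generate_response("Summarize the benefits of a unified LLM interface."))

# Switching providers only changes the model name.
claude = UniLLM('Claude')
print(claude.generate_response("How can AI help humans?"))
```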

### As a Command-Line Tool

Start the CLI by running:

```bash
unillm
```

Follow the prompts to select a model and enter your queries. For example:

```bash
Please choose a model by number (default is 1):
1: ChatGPT
2: Llama2
...

👨Please Ask a Question: What are the latest AI trends?
🤖 (ChatGPT): AI trends include...
```

To exit, type `exit`.

## Contributing

We welcome contributions! If you have suggestions or enhancements, fork the repository, create a feature branch, and submit a pull request.

## License

This project is licensed under the MIT License - see the LICENSE file for details.

            
