chatproxy

Name: chatproxy
Version: 0.0.1
Home page: https://github.com/saaiie/chatproxy
Summary: Language Model Proxy
Upload time: 2023-07-24 14:25:43
Maintainer: (none)
Docs URL: None
Author: jaypark
Requires Python: >=3.10
License: (not specified)
Keywords: chatbot, ai, ml, llm
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
            # ChatProxy

The goal of ChatProxy is to provide a Python library that lets developers interface effortlessly with various LLMs such as OpenAI's GPT-4, Google's BERT, and others.

This package encapsulates the complex interactions with these LLMs behind intuitive, user-friendly proxy classes, so developers can focus on leveraging the power of these models rather than on intricate API details and interfacing code.

## Features (WIP)

1. **Multiple LLM Support:** The package supports multiple LLMs, including GPT-4 and BERT, and is designed with enough flexibility to accommodate future models.

2. **Uniform API:** It provides a uniform API for different LLMs, allowing developers to switch between models without the need to extensively change the codebase.

3. **Error Handling:** It includes built-in error handling and retry logic, ensuring that your application remains resilient against minor network hiccups and transient errors.

4. **Optimized Performance:** With caching and other optimizations, this package helps you get the best possible performance out of your chosen LLM.

5. **Asynchronous Support:** For developers who require high performance and non-blocking code execution, this package offers asynchronous methods.
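The retry behavior described in the feature list can be illustrated with a generic exponential-backoff wrapper. This is a sketch of the technique only, not chatproxy's actual implementation; the `with_retries` decorator and its parameters are hypothetical:

```python
import functools
import time

def with_retries(max_attempts=3, base_delay=0.5):
    """Retry a flaky call with exponential backoff (illustrative only)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except ConnectionError:
                    if attempt == max_attempts:
                        raise  # out of retries: surface the error
                    # Wait 0.5s, 1s, 2s, ... before the next attempt.
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator

@with_retries(max_attempts=3, base_delay=0.01)
def flaky_request(state={"calls": 0}):
    # Simulate a transient failure on the first two calls.
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient network hiccup")
    return "ok"

print(flaky_request())  # succeeds on the third attempt
```

A real proxy would apply the same pattern around its HTTP calls, typically also catching timeout errors and honoring rate-limit headers.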

## Installation

You can install the `chatproxy` package via pip:

```bash
pip install chatproxy
```

## Quickstart

Here's an example of how to create a ChatGPT proxy and use it to generate text:

```python
import chatproxy

# Create a chat session.
session = chatproxy.Session()

# Read prompts until the user types "exit"; each reply is a dict
# with "role" and "content" keys.
while (prompt := input("you : ")) != "exit":
    message = session.send(prompt)
    print(f"\n{message['role']} : {message['content']}\n")
```
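Since the message returned by `session.send` is accessed as a dict with `role` and `content` keys, the loop above can be exercised offline with a stand-in session. The `StubSession` class below is a hypothetical test double, not part of chatproxy:

```python
class StubSession:
    """Hypothetical stand-in mirroring the dict shape the Quickstart expects."""

    def send(self, prompt):
        # A real session would call the underlying LLM here.
        return {"role": "assistant", "content": f"echo: {prompt}"}

session = StubSession()
message = session.send("hello")
print(f"{message['role']} : {message['content']}")  # assistant : echo: hello
```

Swapping the stub for `chatproxy.Session()` leaves the rest of the loop unchanged, which is the point of the uniform proxy interface.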

## Documentation

For detailed information on using this package, please refer to our [documentation](link_to_documentation).

## Contributing

We welcome contributions! Please see our [contributing guidelines](link_to_contributing_guidelines) for details.

## License

This project is licensed under the terms of the MIT license. See [LICENSE](link_to_license) for more details.

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/saaiie/chatproxy",
    "name": "chatproxy",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": "",
    "keywords": "chatbot,ai,ml,llm",
    "author": "jaypark",
    "author_email": "jaypark@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/b0/d9/62dbe8677728b70e518cc00ba710ffb8ec58dcf0f30a3229398cb75019d7/chatproxy-0.0.1.tar.gz",
    "platform": null,
    "description": "# ChatProxy\n\nThe goal of ChatProxy is providing python library that enables developers to effortlessly interface with various LLMs such as OpenAI's GPT-4, Google's BERT, and others. \n\nThis package encapsulates the complex interactions with these LLMs behind intuitive and user-friendly proxy classes, allowing developers to focus on leveraging the power of these advanced models, without being bogged down by intricate API details and interfacing code.\n\n## Features (WIP)\n\n1. **Multiple LLM Support:** The package supports multiple LLMs, including but not limited to GPT-4, BERT, and others. It is also designed with flexibility in mind to accommodate future models.\n\n2. **Uniform API:** It provides a uniform API for different LLMs, allowing developers to switch between models without the need to extensively change the codebase.\n\n3. **Error Handling:** It includes built-in error handling and retry logic, ensuring that your application remains resilient against minor network hiccups and transient errors.\n\n4. **Optimized Performance:** With caching and other optimizations, this package makes sure you get the best possible performance out of your chosen LLM.\n\n5. **Asynchronous Support:** For developers who require high performance and non-blocking code execution, this package offers asynchronous methods.\n\n## Installation\n\nYou can install the `chatproxy` package via pip:\n\n```bash\npip install chatproxy\n```\n\n## Quickstart\n\nHere's an example of how to create a ChatGPT proxy and use it to generate text:\n\n```python\nimport chatproxy\n\nsession = chatproxy.Session()\nwhile (prompt := input(\"you : \")) != \"exit\":\n    message = session.send(prompt)\n    print(f\"\\n{message['role']} : {message['content']}\\n\")\n```\n\n## Documentation\n\nFor detailed information on using this package, please refer to our [documentation](link_to_documentation).\n\n## Contributing\n\nWe welcome contributions! Please see our [contributing guidelines](link_to_contributing_guidelines) for details.\n\n## License\n\nThis project is licensed under the terms of the MIT license. See [LICENSE](link_to_license) for more details.\n",
    "bugtrack_url": null,
    "license": "",
    "summary": "Language Model Proxy",
    "version": "0.0.1",
    "project_urls": {
        "Homepage": "https://github.com/saaiie/chatproxy"
    },
    "split_keywords": [
        "chatbot",
        "ai",
        "ml",
        "llm"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "07008c3439baa9d9fe60fe5dbfbe3b3529fc0c6a5b33fcc6d0cc75c42372a7aa",
                "md5": "e545358f289ab551cf31d2afdb4af9ea",
                "sha256": "86188779b01a9e41f5577ab00ebee16d549493187bc31585606b666f9b730d26"
            },
            "downloads": -1,
            "filename": "chatproxy-0.0.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "e545358f289ab551cf31d2afdb4af9ea",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 3653,
            "upload_time": "2023-07-24T14:25:42",
            "upload_time_iso_8601": "2023-07-24T14:25:42.175525Z",
            "url": "https://files.pythonhosted.org/packages/07/00/8c3439baa9d9fe60fe5dbfbe3b3529fc0c6a5b33fcc6d0cc75c42372a7aa/chatproxy-0.0.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "b0d962dbe8677728b70e518cc00ba710ffb8ec58dcf0f30a3229398cb75019d7",
                "md5": "16d2cf84dd92192f104884f870b4e393",
                "sha256": "3dcc7932f0451b4ee5a3cedf8b8d5345e39a40a8c504590186f3b71b6b9a8647"
            },
            "downloads": -1,
            "filename": "chatproxy-0.0.1.tar.gz",
            "has_sig": false,
            "md5_digest": "16d2cf84dd92192f104884f870b4e393",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 2930,
            "upload_time": "2023-07-24T14:25:43",
            "upload_time_iso_8601": "2023-07-24T14:25:43.936264Z",
            "url": "https://files.pythonhosted.org/packages/b0/d9/62dbe8677728b70e518cc00ba710ffb8ec58dcf0f30a3229398cb75019d7/chatproxy-0.0.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-07-24 14:25:43",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "saaiie",
    "github_project": "chatproxy",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "chatproxy"
}
        