ConnectLM


Name: ConnectLM
Version: 0.0.1
Summary: Language Model Proxy
Home page: https://github.com/saaiie/ConnectLM
Author: jaypark
Requires Python: >=3.10
Keywords: chatbot, ai, ml, llm
Upload time: 2023-08-09 05:01:32
Requirements: No requirements were recorded.
# ConnectLM

The goal of ConnectLM is to provide a Python library that lets developers effortlessly interface with various LLMs, such as OpenAI's GPT-4, Google's BERT, and others.

This package encapsulates the complex interactions with these LLMs behind intuitive, user-friendly proxy classes, allowing developers to focus on leveraging these advanced models without being bogged down by intricate API details and interfacing code.

## Features (WIP)

1. **Multiple LLM Support:** The package supports multiple LLMs, including GPT-4, BERT, and others, and is designed with flexibility in mind to accommodate future models.

2. **Uniform API:** It provides a uniform API for different LLMs, allowing developers to switch between models without the need to extensively change the codebase.

3. **Error Handling:** It includes built-in error handling and retry logic, keeping your application resilient against minor network hiccups and transient errors (see the retry sketch after this list).

4. **Optimized Performance:** Caching and other optimizations help you get the best performance out of your chosen LLM.

5. **Asynchronous Support:** For developers who require high performance and non-blocking code execution, this package offers asynchronous methods (a generic async wrapper sketch follows the Quickstart example below).
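
To illustrate the kind of retry behavior described in item 3, here is a minimal sketch of exponential backoff around a `send` call. The `send_with_retry` helper, its parameters, and the broad `except` clause are illustrative assumptions, not part of the package's public API:

```python
import time

def send_with_retry(query, prompt, max_retries=3, backoff=1.0):
    """Send a prompt, retrying on transient failures with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return query.send(prompt)
        except Exception as exc:  # in practice, catch only network/API errors
            if attempt == max_retries - 1:
                raise
            wait = backoff * (2 ** attempt)
            print(f"transient error ({exc}); retrying in {wait:.1f}s")
            time.sleep(wait)
```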

## Installation

You can install the `connectlm` package via pip:

```bash
pip install connectlm
```


## Quickstart

Here's an example of how to connect to ChatGPT and use it to generate text:

```python
import connectlm as cm

query = cm.QueryChat()
while (prompt := input("you : ")) != "exit":
    message = query.send(prompt)
    print(f"\n{message['role']} : {message['content']}\n")
```
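
Feature 5 mentions asynchronous methods, but they are not documented here. As a stopgap, the following generic sketch (an assumption, not the package's own async API) wraps the synchronous `send` call with `asyncio.to_thread` so an event loop stays responsive:

```python
import asyncio

import connectlm as cm

async def ask(query, prompt):
    # Run the blocking send() call in a worker thread so the event loop is not blocked.
    return await asyncio.to_thread(query.send, prompt)

async def main():
    query = cm.QueryChat()
    message = await ask(query, "Hello!")
    print(f"{message['role']} : {message['content']}")

asyncio.run(main())
```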


## Documentation

For detailed information on using this package, please refer to our [documentation](link_to_documentation).


## Contributing

We welcome contributions! Please see our [contributing guidelines](link_to_contributing_guidelines) for details.


## License

This project is licensed under the terms of the MIT license. See [LICENSE](link_to_license) for more details.

            
