# This is a personal library developed by Qichang Zheng.
## Installation
```bash
pip install qichang
```
## LLM Conversation
### Single-turn conversation
```python
import qichang
llm = qichang.LLM_API()
llm.chat('GPT3.5', 'Hello, how are you?')
llm.chat('GPT4', 'Hello, how are you?')
```
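
The block above only shows the call pattern. As a minimal sketch (assuming `llm.chat` returns the model's reply as a string, which the README does not state explicitly), you might capture and print the response like this:

```python
import qichang

llm = qichang.LLM_API()

# Assumption: llm.chat returns the reply text as a string.
reply = llm.chat('GPT3.5', 'Hello, how are you?')
print(reply)
```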
### Multi-turn conversation
#### We currently host two servers, in Virginia and Singapore. By default, this function automatically chooses the server with the lowest latency, but you can also set the server manually.
```python
import qichang
llm = qichang.LLM_API()
# Manually set the server
# llm.server = 'Virginia'
# llm.server = 'Singapore'
# Specify a chat ID (string) to distinguish different conversations
llm.chat('GPT3.5', 'Hello, how are you?', 'ChatID')
llm.chat('GPT4', 'What did I just ask?', 'ChatID')
```
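
As a usage sketch, here is a simple interactive loop that reuses one chat ID so the conversation history is preserved across turns. It assumes `llm.chat` returns the reply as a string, which is not documented above:

```python
import qichang

llm = qichang.LLM_API()
chat_id = 'demo-session'  # any string; reusing it keeps the context

while True:
    prompt = input('You: ')
    if not prompt:
        break
    # Assumption: llm.chat returns the reply text as a string.
    reply = llm.chat('GPT4', prompt, chat_id)
    print('GPT4:', reply)
```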
## Model Downloader
#### This is a tool for downloading Hugging Face models from within China. Note that it currently works only for some models; the author is working on further improvements.
```python
import qichang
downloader = qichang.Model_Downloader()
downloader.download('Qwen/Qwen-7B-Chat', 'test') # Download the model to the folder 'test'
```
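
Once the files are on disk, they can be loaded with the standard Hugging Face `transformers` API. This is a generic sketch, not part of the `qichang` library, and it assumes the download produced a complete model directory named `test`:

```python
# Requires: pip install transformers torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Qwen models ship custom modeling code, so trust_remote_code=True is required.
tokenizer = AutoTokenizer.from_pretrained('test', trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained('test', trust_remote_code=True)
```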
## Davinci Embedding
### This is a tool for getting text embeddings from the Davinci model.
```python
import os
import qichang
os.environ["OPENAI_API_KEY"] = "your_api_key"
Embedder = qichang.Embedder()
Embedder.embedding('Hello, how are you?')
```
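
A common follow-up is to compare two embeddings by cosine similarity. The sketch below assumes `Embedder.embedding` returns the embedding as a flat list (or array) of floats, which is not stated above:

```python
import os
import numpy as np
import qichang

os.environ["OPENAI_API_KEY"] = "your_api_key"
embedder = qichang.Embedder()

# Assumption: embedding() returns a flat list/array of floats.
a = np.asarray(embedder.embedding('Hello, how are you?'))
b = np.asarray(embedder.embedding('Hi, how is it going?'))

cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f'Cosine similarity: {cosine:.4f}')
```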