juchats

- Name: juchats
- Version: 0.0.3
- Summary: None
- Author: ultrasev
- Maintainer: None
- Home page: None
- License: None
- Keywords: None
- Requires Python: <4.0,>=3.8
- Upload time: 2024-08-21 12:29:06
- Requirements: No requirements were recorded.
            <div align="center">
<figure style="text-align: center;">
    <img src="https://s3.bmp.ovh/imgs/2024/07/29/b6995f3a712d6586.png" width="239">
</figure>

<a href='https://follow-your-click.github.io/'><img src='https://img.shields.io/badge/Project-Page-Green'></a> ![visitors](https://visitor-badge.laobi.icu/badge?page_id=ultrasev.juchats&left_color=green&right_color=red) [![GitHub](https://img.shields.io/github/stars/ultrasev/juchats?style=social)](https://github.com/ultrasev/juchats)

</div>

# [Juchats](https://dlj.one/RNFYxz9) API wrapper

`juchats` is a Python library for interacting with the Juchats API, enabling seamless integration of chat functionality into your applications. With it, developers can leverage advanced models such as GPT-4, Claude Mezzo, and Deepseek to perform various chat-related tasks.

# Installation

```bash
pip3 install juchats
```

# Usage

First, obtain your token from the [Juchats](https://dlj.one/RNFYxz9) official website and place it in a `.env` file:

```bash
JTOKEN=your_token
```

## Basic Chat Interaction

This example demonstrates a simple chat interaction where we ask the model to compare two floating-point numbers.

```python
import os
from juchats.chat import Juchats
from dotenv import load_dotenv
import asyncio
load_dotenv()

async def api():
    token = os.getenv('JTOKEN')
    juchats = Juchats(token, model='deepseek-chat')

    async with juchats:
        await juchats.chat("Which of the two floating-point numbers 3.11 and 3.9 is larger? Analyze in detail and give your reasoning.", show_stream=True)

if __name__ == '__main__':
    asyncio.run(api())


''' Output
3.11 is larger, because 3.11 > 3.9
'''
```

## Structured JSON output

This example demonstrates how to obtain structured JSON output from the chat API.

````python
import os
from juchats.chat import Juchats
from dotenv import load_dotenv
import asyncio
load_dotenv()

async def api():
    token = os.getenv('JTOKEN')
    juchats = Juchats(token, model='deepseek-chat')
    prompt = "How many days are there in each month? Answer in JSON format, e.g. {\"January\": 31, \"February\": 28, ...}"
    async with juchats:
        text = await juchats.chat(prompt)
        print(text)

if __name__ == '__main__':
    asyncio.run(api())

''' Output
```json
{
    "January": 31,
    "February": 28,
    "March": 31,
    "April": 30,
    "May": 31,
    "June": 30,
    "July": 31,
    "August": 31,
    "September": 30,
    "October": 31,
    "November": 30,
    "December": 31
}
```
'''
````
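
As the output above shows, the model may wrap the JSON in a Markdown code fence. A small helper along these lines (hypothetical, not part of the library) can strip the fence and parse the payload:

````python
import json
import re

def parse_json_reply(text: str) -> dict:
    """Extract and parse JSON from a reply that may be wrapped in a ```json fence."""
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    payload = match.group(1) if match else text
    return json.loads(payload)

# Example: days = parse_json_reply(text); days["February"] == 28
````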

# Available Models

As of 2024-07-25, the available models are (this list may be outdated):

| Model ID | Backend Model Name                 | Front Model Name         |
| -------- | ---------------------------------- | ------------------------ |
| 5        | claude-3-haiku-20240307            | Claude Mezzo             |
| 6        | claude-3-opus-20240229             | Claude3 Opus             |
| 7        | mistralai/mixtral-8x22b-instruct   | Mixtral Forte            |
| 9        | gpt-4-turbo-2024-04-09             | GPT Mezzo                |
| 10       | gpt-4-turbo-2024-04-09             | GPT Forte                |
| 11       | dall-e-3                           | DALL · E3                |
| 12       | meta-llama/llama-3-70b-instruct    | Llama3 70B               |
| 13       | google/gemini-pro-1.5              | Gemini 1.5 Pro           |
| 14       | deepseek-chat                      | Deepseek                 |
| 15       | google/gemini-flash-1.5            | Gemini-flash             |
| 16       | gpt-4o-2024-05-13                  | GPT4o                    |
| 17       | claude-3-opus-20240229             | Claude3 Opus(100K)       |
| 18       | Stable Image Ultra                 | Stable Diffusion 3 Ultra |
| 19       | claude-3-5-sonnet-20240620         | Claude 3.5 Sonnet        |
| 20       | gpt-4o-mini-2024-07-18             | GPT4o-mini               |
| 21       | meta-llama/llama-3.1-405b-instruct | Llama3.1 405B            |
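
To switch models, pass the backend model name (middle column) as the `model` argument. A minimal sketch, reusing the token setup from the examples above:

```python
import os
from juchats.chat import Juchats
from dotenv import load_dotenv
import asyncio
load_dotenv()

async def demo():
    token = os.getenv('JTOKEN')
    # Backend model name taken from the table above (front name: Claude 3.5 Sonnet).
    juchats = Juchats(token, model='claude-3-5-sonnet-20240620')
    async with juchats:
        await juchats.chat("Hello!", show_stream=True)

if __name__ == '__main__':
    asyncio.run(demo())
```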

## Get real-time available models

Dynamically retrieve the latest available models from the Juchats API.

```python
import os
from juchats.chat import Juchats
from dotenv import load_dotenv
import asyncio
load_dotenv()
token = os.getenv('JTOKEN')
juchats = Juchats(token, model='gpt-4o-2024-05-13')
print(
    asyncio.run(juchats.get_models())
)
```

# Note

- **Streaming**: Set `show_stream=True` to display the chat response in real-time. Use `show_stream=False` to disable it.
- **Model Selection**: Specify the backend model name with the `model` parameter. Refer to the available models table above for options.
- **API Token**: Obtain your token from [Juchats](https://dlj.one/RNFYxz9) and use it to authenticate requests.
- **Rate Limiting**: The API supports up to 3 queries per second (QPS); see the pacing sketch below. For higher limits, consider using the Deepseek API or OpenAI API directly.
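
For reference, here is a minimal client-side pacing sketch to stay under the 3 QPS limit; the helper name and interval are illustrative, not part of the library:

```python
import os
from juchats.chat import Juchats
from dotenv import load_dotenv
import asyncio
load_dotenv()

REQUEST_INTERVAL = 1 / 3  # seconds between requests, i.e. at most ~3 QPS

async def throttled_chats(prompts):
    token = os.getenv('JTOKEN')
    juchats = Juchats(token, model='deepseek-chat')
    replies = []
    async with juchats:
        for prompt in prompts:
            replies.append(await juchats.chat(prompt))
            await asyncio.sleep(REQUEST_INTERVAL)  # simple pacing between sequential calls
    return replies

if __name__ == '__main__':
    print(asyncio.run(throttled_chats(["Hello!", "What is 2 + 2?"])))
```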

            
