# ai-care

- Name: ai-care
- Version: 0.1.3
- Summary: AI-Care endows AI with the capability to speak proactively. With simple settings, AI-Care allows your AI to proactively care for you.
- Upload time: 2023-12-27 05:06:11
- Requires Python: >=3.10
- License: MIT License (Copyright (c) 2023 Xueao Chao)
- Keywords: ai, proactive ai, llm
- Requirements: none recorded
            # AI-Care
![GitHub License](https://img.shields.io/github/license/happyapplehorse/ai-care)
![PyPI - Version](https://img.shields.io/pypi/v/ai-care)
![GitHub Workflow Status (with event)](https://img.shields.io/github/actions/workflow/status/happyapplehorse/ai-care/python-publish.yml?logo=pypi)
![GitHub Workflow Status (with event)](https://img.shields.io/github/actions/workflow/status/happyapplehorse/ai-care/codecov.yml?logo=pytest&label=test)
[![codecov](https://codecov.io/gh/happyapplehorse/ai-care/graph/badge.svg?token=G091OOWBDF)](https://codecov.io/gh/happyapplehorse/ai-care)
![Zero Dependencies](https://img.shields.io/badge/dependencies-zero-brightgreen)

Current AI models are only capable of passively responding to user inquiries
and lack the ability to initiate conversations.
AI-Care endows AI with the capability to speak proactively.
With simple settings, AI-Care allows your AI to proactively care for you.

## Highlights✨

1. 💵Low Cost: AI-Care is designed to minimize both token usage and API call frequency.
It operates with O(1) cost complexity, meaning its costs do not grow with how long it stays active.
2. 🕊️Low Intrusiveness: AI-Care runs alongside your existing system
with virtually zero intrusion into the original code,
making it easy to integrate AI-Care services into existing systems.
3. 🌍Model Universality: AI-Care is compatible with all LLMs (Large Language Models);
it does not rely on function-call features or on any particular way of using the model.

## Usage

1. Define the "to_llm" and "to_user" interfaces. AICare uses the "to_llm" interface to send
messages to the LLM, and the "to_user" interface to send messages to the user.
```python
from collections.abc import Generator

from ai_care import AICareContext


def to_llm_method(chat_context, to_llm_messages: list[AICareContext]) -> str | Generator[str, None, None]:
    # Convert `to_llm_messages` into the message format of the LLM you are using
    # and send the messages to the LLM.
    # Return the content of the LLM's reply:
    #   - non-stream mode: return the reply as a string;
    #   - stream mode: return a string generator instead.
    # If stream mode is used here, AICare will automatically use stream mode
    # when sending messages to the user as well.
    pass


def to_user_method(to_user_message: str | Generator[str, None, None]) -> None:
    # Handle messages from AICare exactly as you would a normal LLM reply.
    # If `to_llm_method` uses stream mode, this method must also be able to
    # receive and handle a string generator.
    pass
```

2. Instantiate AICare:
```python
from ai_care import AICare

ai_care = AICare()
```

3. Register "to_llm" and "to_user" methods:
```python
ai_care.register_to_llm_method(to_llm_method)
ai_care.register_to_user_method(to_user_method)
```

4. Use AICare:
```python
# After each round of conversation or when AICare service is needed
ai_care.chat_update(chat_context)
```

## AI-Care settings
```python
# Set guidance information
ai_care.set_guide(guide="your guide")

# Set how long before AI-Care is activated
ai_care.set_config(key="delay", value=60)

# Set the maximum number of times AI-Care may select "ASK_LATER"; setting it to 0 disables this option.
ai_care.set_config(key="ask_later_count_limit", value=1)

# Set the system's default recursion depth for ask.
ai_care.set_config(key="ask_depth", value=1)

# Set the maximum number of chat intervals the system automatically records.
ai_care.set_config(key="n_chat_intervals", value=20)
```
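The steps above can be wired together as follows. This sketch uses a minimal in-file stand-in for the `AICare` class so it runs without the package installed; in real use you would `from ai_care import AICare`, and the configured `delay` would govern when the proactive message fires:

```python
collected: list[str] = []


def to_llm_method(chat_context, to_llm_messages) -> str:
    # Stand-in for a real LLM call: return a canned proactive message.
    return "Just checking in - how are you doing?"


def to_user_method(to_user_message) -> None:
    # Deliver the message to the "user" (here: a list we can inspect).
    if isinstance(to_user_message, str):
        collected.append(to_user_message)
    else:
        collected.append("".join(to_user_message))


class StubAICare:
    """Minimal stand-in mirroring the AICare calls used in this README."""

    def __init__(self) -> None:
        self.config: dict = {}
        self._to_llm = None
        self._to_user = None

    def register_to_llm_method(self, fn) -> None:
        self._to_llm = fn

    def register_to_user_method(self, fn) -> None:
        self._to_user = fn

    def set_config(self, key: str, value) -> None:
        self.config[key] = value

    def chat_update(self, chat_context) -> None:
        # The real AICare waits for the configured delay of silence before
        # speaking; the stub just exercises the two interfaces once.
        self._to_user(self._to_llm(chat_context, []))


ai_care = StubAICare()
ai_care.register_to_llm_method(to_llm_method)
ai_care.register_to_user_method(to_user_method)
ai_care.set_config(key="delay", value=60)
ai_care.chat_update(chat_context={"history": []})
```

The stub's `chat_update` fires immediately for demonstration purposes; the design point it shows is that AI-Care only ever touches your system through the two registered interfaces.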

## License

This project is licensed under the [MIT License](./LICENSE).

            
